Splunk Search

Getting concurrent connections from transactions

bekirk
Explorer
index=myindex "event=login" OR "event=logout"  
| transaction username startswith="event=login" endswith="event=logout" keepevicted=true
| eval start_event=mvindex(event,0)
| eval end_event=mvindex(event,1)
| eval start_time=mvindex(when,0)
| eval end_time=mvindex(when,1)
| eval end_time=if(isnull(end_time), now(), end_time)
| eval range = mvrange(start_time, end_time, 60)
| mvexpand range
| eval _time = range
| timechart span=15m dc(username) AS concurrent_users by username

So the problem is that users who logged in prior to my search window, but then logged out, continue to show as constantly connected. I tried to make start_time=earliest when the only event was a logout, but Splunk doesn't like that. I have end_time set to now() if there is no end_time.
"event" is login or logout, "when" is the epoch time of the event, and username is what it sounds like.
So I would like to be able to handle the case where someone logged in before my search window (no login event), and the case where someone is currently logged in (no logout event), as well as the obvious case where someone has both a login and a logout event. The only case I won't be able to handle is someone who logged in before my time window and is still logged in, but the system has a 24-hour timeout, so I think we should be good if I use a window bigger than 24 hours. Attached is a screenshot of the issue: both users logged out on Friday, but continue to show as logged in all week.
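The three cases can be illustrated outside SPL. Here is a minimal Python sketch (hypothetical sample data; WINDOW_START and WINDOW_END are assumed window bounds): a missing login falls back to the window start, a missing logout falls back to "now", and the resulting session is expanded into one sample per minute, the way mvrange/mvexpand does.

```python
# Assumed search-window boundaries (epoch seconds) -- made-up values.
WINDOW_START = 1_700_000_000
WINDOW_END = WINDOW_START + 86_400  # stands in for now()

# Hypothetical sessions as (login_time, logout_time); None marks an event
# that fell outside the search window (a missing login or logout event).
sessions = [
    (WINDOW_START + 600, WINDOW_START + 3600),   # login and logout both seen
    (WINDOW_START + 7200, None),                 # still logged in: no logout
    (None, WINDOW_START + 1800),                 # pre-window login: no login event
]

def clamp(login, logout):
    """Fill the missing boundary, like the case() calls in the SPL above."""
    start = login if login is not None else WINDOW_START  # no login -> window start
    end = logout if logout is not None else WINDOW_END    # no logout -> now()
    return start, end

def expand(start, end, step=60):
    """Emulate mvrange()/mvexpand: one sample per `step` seconds of session."""
    return list(range(int(start), int(end), step))
```

With each session clamped and expanded, a per-bucket distinct count of usernames over the samples gives the concurrency curve, which is what the timechart at the end computes.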

Thank you,

Brian


PickleRick
SplunkTrust

OK. I'm not sure what your boundary conditions are (for example, can the same user have multiple concurrent connections? That breaks transaction).

Also, the last command seems strange to me - it counts distinct usernames split by... username? That makes no sense.

Anyway, if the users cannot have multiple overlapping connections, I'd go a completely different way.

1. There is a built-in command, concurrency. (Even if there weren't, there's a way to count concurrent connections with streamstats.)

2. For non-overlapping connections per user, it's probably easiest to do something like this (I don't have my Splunk available at the moment, so this is "dry run" code).

index=myindex event IN (login,logout) ``` That's obvious ```
| eval login_time=if(event=="login",_time,null())
| eval logout_time=if(event=="logout",_time,null()) ```we create two artificial fields we'll soon use```
| streamstats current=f last(logout_time) as logout_time by username ```we copy the logout time to the
login event; remember that events are returned in reverse chronological order so we have logouts first```
| search event="login" ```we don't need the logouts anymore```
| eval duration=logout_time-login_time ```well, we could have skipped creating login_time since it's equal
to _time anyway, but this way it's more verbose```
| concurrency duration=duration start=login_time

That's a rough idea of how to approach this problem - carry over the logout time to the login event and then do your calculations.
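The carry-over step can be sketched outside SPL as well. This is a rough Python equivalent (made-up sample events, not Splunk code): walk the events newest-first, remember the most recent logout per user, and attach it to the next login seen for that user.

```python
# Hypothetical auth events as (epoch_time, user, event).
auth_events = [
    (100, "alice", "login"),
    (200, "alice", "logout"),
    (250, "bob", "login"),
    (400, "bob", "logout"),
]

def pair_sessions(events):
    """Pair each login with the logout that follows it, per user.

    Emulates streamstats current=f last(logout_time) by username over
    Splunk's default reverse-chronological event order.
    """
    last_logout = {}  # user -> carried logout time, like streamstats last()
    sessions = []
    for ts, user, ev in sorted(events, reverse=True):  # newest first
        if ev == "logout":
            last_logout[user] = ts
        else:  # login: attach the carried logout, if any
            out = last_logout.pop(user, None)
            sessions.append((user, ts, out, None if out is None else out - ts))
    return sessions
```

A login with no later logout comes out with a None duration, which is where the "fall back to now()" logic from earlier in the thread would apply.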

You could also emulate the concurrency command differently - using an additional field carrying -1 for a login and 1 for a logout (if going through the events in default, reverse-chronological order) and doing a streamstats sum on those values.
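That running-sum alternative can be sketched in Python (hypothetical events; note that with events sorted oldest-first, as below, the signs flip to +1 for a login and -1 for a logout):

```python
# Hypothetical events as (epoch_time, event), in no particular order.
events = [
    (100, "login"),
    (160, "login"),
    (200, "logout"),
    (220, "login"),
    (300, "logout"),
    (360, "logout"),
]

def concurrency_curve(events):
    """Return [(time, concurrent_count)] after each event, oldest first.

    The running sum is the streamstats-sum equivalent: each login adds a
    concurrent user, each logout removes one.
    """
    total = 0
    curve = []
    for ts, ev in sorted(events):            # chronological order
        total += 1 if ev == "login" else -1  # +1 login / -1 logout
        curve.append((ts, total))
    return curve
```

The count after each event is the number of concurrently connected users at that instant, which is exactly what concurrency (or the timechart approach) is after.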


bekirk
Explorer

Oh yeah, and the system is supposed to keep users from multiple logons, but I should test that, because I am sure users will try.


bekirk
Explorer

 

Thank you for your response. I was just showing the username in the timechart for troubleshooting. My numbers look close to what the data is telling me; I was just using mvexpand to show a session from the login to the logout. I also lowered the mvrange step to 300 and the span to 5 minutes just to see more granularity until I figure it out.

Using the dashboard and time picker seemed promising; it works when I pick a specific date/time, but if I do Last 7 days this ends up in my SPL:

index=myindex event IN (login,logout)
| transaction username startswith="event=login" endswith="event=logout" keepevicted=true
| eval start_event=mvindex(event,0)
| eval end_event=mvindex(event,1)
| eval start_time=mvindex(when,0)
| eval end_time=mvindex(when,1)
| eval end_time=case(start_event=="logout", start_time, isnull(end_time), "now", true(), end_time)
| eval start_time=case(start_event=="logout", "-7d@d", true(), start_time)
| eval range = mvrange(start_time, end_time, 300)
| mvexpand range
| eval _time = range
| timechart span=5m dc(username) AS concurrent_users by username

I will continue to look at your suggestion. I initially tried transaction and concurrency; I will let you know if I get it working that way. Thank you.

Brian


PickleRick
SplunkTrust
SplunkTrust

Well, if your solution works, that's OK.

Just remember that transaction is a "heavy" command and is generally not recommended, especially on bigger data sets.


bekirk
Explorer
I got some additional logic built in; however, getting the earliest value of my search window is where I am struggling. I hard-coded it just to make sure my logic worked:

index=myindex "event=login" OR "event=logout"
| transaction username startswith="event=login" endswith="event=logout" keepevicted=true
| eval start_event=mvindex(event,0)
| eval end_event=mvindex(event,1)
| eval start_time=mvindex(when,0)
| eval end_time=mvindex(when,1)
| eval end_time=case(start_event=="logout", start_time, isnull(end_time), now(), true(), end_time)
| eval start_time=case(start_event=="logout", now()-615900, true(), start_time)
| eval range = mvrange(start_time, end_time, 60)
| mvexpand range
| eval _time = range
| timechart span=1m dc(username) AS concurrent_users by username

Maybe I will try a dashboard to see if I can get it to work with time pickers, if all else fails.

Thank you,
Brian
