Splunk Search

Creating a timeline showing when someone logs out and logs in?

toaijala
Explorer

Hi, I'm quite new to Splunk, but I'm able to create the needed fields and make basic reports. I don't know how to create timelines, though.

I have a log file that, simplified, basically looks like this (it's all in one file):
timestamp, user, loginstatus
timestamp, user, action

e.g.
2016/10/12 09:13:18, john, login
2016/10/12 09:14:10, john, read_data
2016/10/12 09:15:20, john, write_data
2016/10/12 09:16:08, marc, login
2016/10/12 09:16:11, john, logout
2016/10/12 09:17:13, john, read_data
2016/10/12 09:20:13, marc, read_data
2016/10/12 09:21:18, marc, logout

Now let's assume I have a situation where john's account is doing things even though his status is logout, as you can see in my example (the last entry by john). How can I visualize this in a timeline in Splunk?

I was thinking of maybe a timeline where each user's login and logout period is drawn as a line, and the other actions (read/write) are shown as dots/markers in a color matching the user, so it's easy to see when, for example, blue markers (actions) fall outside the login period of user john (blue).

Any help on how to create this timeline would be appreciated.

1 Solution

toaijala
Explorer

Final answer that was satisfactory for me:

For visualization I used two timelines:

 index=_* OR index=* sourcetype=data1 login_status="login" OR login_status="logout" | table _time username login_status
 index=_* OR index=* sourcetype=data1 action="read" OR action="write" | table _time username action

And for finding anomalies between logins and reporting them nicely:

index=* source=data1 | transaction user mvlist=t startswith="login_status=logoff" endswith="login_status=login" | eval user=mvdedup(user) | table Event_time, user, actiontype, login_status | where actiontype="view" OR actiontype="write"

A big thank you to DEAD_BEEF and the rest for helping me find a satisfactory solution.

sideview
SplunkTrust

Streamstats is the command you want to use in a lot of tweaky cases, and the case that's relevant here is "I need to track a sort of last known value of a field".

streamstats (and its cousin eventstats) is perfectly capable of doing this bookkeeping with a by username clause, so as to keep track of the last known value of loginstatus for each user.

Extra fun is had, in that streamstats marches through the events from the beginning of the result set (most recent) to the end, which would make it keep track of the "next" value, not the previous value. The solution feels embarrassingly clunky: it's the reverse command.

This search combines all that and filters down to just the users and the actions they took at times when they were ostensibly logged out:

index=* | reverse | streamstats last(login_status) as last_login_status by username | search last_login_status="logout"  | table _time timestamp user_name action last_login_status
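
(Side note: if reverse feels too clunky, an explicit ascending sort by _time before streamstats should achieve the same ordering. A sketch only, with field names assumed from the example data and not verified here:)

index=* | sort 0 _time | streamstats last(login_status) as last_login_status by username | search last_login_status="logout" | table _time username action last_login_status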

toaijala
Explorer

Thank you, this is also a very nice way of listing anomalies. It does list all the (non-anomaly) off-duty lines as well, but that can be fixed by adding this to the end:

|where action="view" OR action="write"

This works very nicely for finding anomalies, also if you want to count all anomalies instead of transactions, and it is even a way to find an anomaly line that happens after a user has logged out and has not yet logged in again.
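
For reference, the combined search would look something along these lines (just a sketch; the field names username, action and last_login_status follow the search above and may need adjusting to the real data):

index=* | reverse | streamstats last(login_status) as last_login_status by username | where last_login_status="logout" AND (action="view" OR action="write") | table _time username action last_login_status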

sundareshr
Legend

See if this works

base search | eval marker=case(login_status="login", 1, login_status="logout", 2, 1=1, 0) | streamstats max(marker) as marker by user | where isnull(login_status) AND marker=2

toaijala
Explorer

I don't get any results when I try to use this with my search.

DEAD_BEEF
Builder

Would you be open to the idea of having a query that shows you when someone has logged out but is still generating actions? I think having that query auto-update a report (say, every 30 mins) would be much easier than what you're proposing.

toaijala
Explorer

Hi,
That's a great idea. I'll gladly take any suggestions on how to do it. I don't really grasp how to make a query that checks something after a specific action (value) has happened, so it would be a terrific help. Even better if there is a way to check things that have happened between login-logout or logout-login.

Any help is appreciated.

DEAD_BEEF
Builder

For the login-logout, what are you trying to display? The user's actions from when they logged in to when they logged out?

If so, my idea is to use a transaction that groups all logs together based on the username. It takes this big group and then looks for the login and logout loginstatus. This should create a table that lists the user, their login and logout timestamps, and then the list of actions they did. Once you can get this to work, it's trivial to adjust the query to look for actions starting with a logout and ending with a login (actions happening when a user shouldn't be performing actions).

index=[your index] | transaction user mvlist=t startswith=eval(loginstatus="login") endswith=eval(loginstatus="logout") | eval user=mvdedup(user) | eval login_time=mvindex(timestamp, 0) | eval logout_time=mvindex(timestamp, -1) | stats login_time, logout_time, values(action) by user
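
(One caveat, and this is an assumption rather than something tested against your data: the final stats clause may need explicit aggregation functions. If Splunk complains about login_time, swapping the last pipe for something like the line below is worth a try.)

| stats values(login_time) as login_time, values(logout_time) as logout_time, values(action) as actions by user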

If we can get this to work as intended, it is a good start before tackling the next few steps. Please update as appropriate.

toaijala
Explorer

Hi,
I tested your command and it groups nicely (when used partially), but it only lists the first login and last logout when users have multiple logins, and it also complained about login_time. Anyway, thank you for the transaction command example; it will surely help me in the future. However, I found a visualization solution that is satisfactory for now:

index=_* OR index=* sourcetype=data1 login_status="login" OR login_status="logout" | table _time username login_status
index=_* OR index=* sourcetype=data1 action="read" OR action="write" | table _time username action

They are two separate searches, but they work well enough (though it would be nice to combine them).
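
For what it's worth, a combined variant might look something like this, using coalesce to merge the two columns into one; this is just a sketch and not tested against the real data:

index=_* OR index=* sourcetype=data1 (login_status="login" OR login_status="logout" OR action="read" OR action="write") | eval event_type=coalesce(login_status, action) | table _time username event_type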

However, what I really need now is a way to show actions related to a user that happen after logout but before their possible next login, and to alert on them as anomalies, e.g.

2016/10/12 09:13:18, john, login
2016/10/12 09:14:10, john, read_data
2016/10/12 09:15:20, john, write_data
2016/10/12 09:16:11, john, logout
2016/10/12 09:17:13, john, read_data <- ANOMALY
2016/10/12 09:17:25, john, write_data <- ANOMALY
2016/10/12 09:19:18, john, login
2016/10/12 09:19:20, john, write_data
2016/10/12 09:20:11, john, logout

Any ideas on how to report/alert on those anomalies that happen after logout but before the (possible) next login would be great. No visualization needed, only a report/alert.

DEAD_BEEF
Builder

As you mentioned, it was grabbing the earliest login and latest logout within your search timeframe because I didn't specify a timeframe within the transaction. Usually you will see a maxspan=Xm to group together logs that occurred within X minutes of each other, but since we have no idea how often users may log in and out, you can't use it.

Well, I did a bit more research and testing, this time with my own data; hopefully this fixes the issue. This should group sets of events where the user logs in and then logs out, clump each group together, and generate a table listing the user, login time, logout time, and the actions they did.

index=[your index] | transaction user mvlist=t startswith="login_status=login" endswith="login_status=logout" | eval user=mvdedup(user) | eval login_time=mvindex(timestamp, 0) | eval logout_time=mvindex(timestamp, -1) | table user, login_time, logout_time, action

If the above query works, then we can edit it to list the actions of a user where we start with a logout, end with a login, and display the "in-between". Once that is working, we can send it to a report.

toaijala
Explorer

Hi, this works now. It seems to list everything except the ANOMALIES (the actions that are not between a user's correct login and logout). Any idea how I can pick out/report/alert on the anomalies?

DEAD_BEEF
Builder

Okay, just wanted to make sure it's working first. Now, instead of using the query to grab everything between login-logout, we will look for the anomalies, which are between logout-login, correct? This query combs through your logs and puts them in groups that start with a logout and end with a login. Then it goes through each group and lists every action that's there EXCEPT the first and last, which are your logout and login. So, if the anomaly column is empty, there are no anomalies.

index=[your index] | transaction user mvlist=t startswith="login_status=logout" endswith="login_status=login" | eval user=mvdedup(user) | eval logout_time=mvindex(timestamp, 0) | eval login_time=mvindex(timestamp, -1) | eval anomaly=mvindex(action,1,mvcount(action)-2) | table user, logout_time, login_time, anomaly
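
If you would rather hide the logout-login groups that contain no in-between actions at all, appending a filter along these lines should do it (untested against your data, so treat it as a sketch):

| where isnotnull(anomaly)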

toaijala
Explorer

It works! A huge thank you for your help and patience! As I said, my data is a little more complex: actiontype and login_status are different fields.

I'm using this query now (I probably should have shown the more complex version from the start):

index=* | transaction user mvlist=t startswith="login_status=logoff" endswith="login_status=login" | eval user=mvdedup(user) | eval logout_time=mvindex(timestamp, 0) | eval login_time=mvindex(timestamp, -1) | eval anomaly=mvindex(actiontype,0,mvcount(actiontype)) | table Event_time, user, anomaly, login_status

Now the results look like this, e.g.:

time user actiontype login_status

2016/10/12 09:39:18 marc NULL login
2016/10/12 09:39:35 marc NULL logoff

2016/10/12 08:57:14 john NULL logoff
2016/10/12 11:50:01 john view NULL
2016/10/12 12:42:17 john write NULL
2016/10/12 12:50:54 john NULL login

2016/10/12 08:24:53 eric NULL login
2016/10/12 08:58:28 eric NULL logoff

I guess now all the lines with anomaly=NULL need to be cleaned off, so that only the lines with 'view' or 'write' in the actiontype column remain. (I know your settings hid the NULL text, but the records ended up on the wrong lines due to the multiple fields in my real data.) Any suggestion on how to clean these results so that only:
2016/10/12 11:50:01 john view NULL
2016/10/12 12:42:17 john write NULL
remain?

DEAD_BEEF
Builder

Try this and see if it works:

index=* | transaction user mvlist=t startswith="login_status=logoff" endswith="login_status=login" | eval user=mvdedup(user) | eval logout_time=mvindex(timestamp, 0) | eval login_time=mvindex(timestamp, -1) | eval anomaly=mvindex(actiontype,1,mvcount(actiontype)-2) | table Event_time, user, anomaly, login_status

toaijala
Explorer

Hi,
Now the NULLs are gone, but the anomaly lines are one line too high up (the actiontype is at the wrong Event_time, which is bad), and all in-duty/off-duty lines are still printed (with Event_time) even if they are empty, like this:

Fri Jun 7 09:39:18 2013

Fri Jun 7 09:39:35 2013

Fri Jun 7 08:57:14 2013 view
Fri Jun 7 11:50:01 2013 write
Fri Jun 7 12:42:17 2013
Fri Jun 7 12:50:54 2013

Fri Jun 7 08:24:53 2013

Fri Jun 7 08:58:28 2013

lakromani
Builder

Try changing startswith="login_status=logoff" endswith="login_status=login" to startswith="logoff" endswith="login". I was not able to use field parameters, but matching on the raw event data seems to work.
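
In other words, something along these lines (just a sketch that combines this suggestion with the earlier query; untested here):

index=* | transaction user mvlist=t startswith="logoff" endswith="login" | eval user=mvdedup(user) | eval anomaly=mvindex(actiontype,1,mvcount(actiontype)-2) | table Event_time, user, anomaly, login_status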

toaijala
Explorer

Hi,
thank you for the suggestion, but unfortunately it really messes up the table for me and doesn't show anomalies anymore (everything is NULL in that column).
