Splunk Search

How do I get the number of daily logins over all days since the first event?

sharonmok
Path Finder

Sorry for the confusing title. I'm trying to find out:

1) How many days a user has logged on (it doesn't matter how many times in a day; one or more logins counts as one active day)
2) The average of this over the total number of days they've been in the system (today minus the day the user first ever logged in)

For example, a user who first logged in 10 days ago and has logged on during 4 distinct days should come out to 4/10 = 0.4.

I tried something like this, which I know is totally wrong because I can't even get all of the parts to run properly and go into the table.

| stats earliest(_time) as first_login by client_user
| eval days_since = (now() - first_login)/86400
| eval duration_since = tostring(now() - first_login, "duration")
| stats dc(_time) as Active_Days by client_user
| bucket _time span=1day 
| eval Average_Activity = Active_Days / duration_since
| table client_user, duration_since, Active_Days, Average_Activity

Thanks for any and all help! I have absolutely no coding background, I've been just trying to piece together answers from other Splunk questions into my query.

1 Solution

niketn
Legend

@sharonmok, please try the following search:

<yourBaseSearch>
| eventstats min(_time) as First_Login by client_user
| bin _time span=1d
| stats min(First_Login) as First_Login by _time client_user
| streamstats count as Active_Days by client_user
| reverse
| dedup client_user
| eval First_Login=strptime(strftime(First_Login,"%Y/%m/%d"),"%Y/%m/%d")
| eval Duration_Since=round((_time-First_Login)/86400)
| eval Average_Activity = Active_Days/Duration_Since
| fillnull value=0 Average_Activity
| fieldformat First_Login=strftime(First_Login,"%Y/%m/%d")

Following is a run-anywhere search along the same lines, based on Splunk's _internal index, where the component field is used instead of client_user:

index=_internal sourcetype=splunkd log_level!="INFO"
| eventstats min(_time) as First_Error by component
| bin _time span=1d
| stats min(First_Error) as First_Error by _time component
| streamstats count as Active_Days by component
| reverse
| dedup component
| eval First_Error=strptime(strftime(First_Error,"%Y/%m/%d"),"%Y/%m/%d")
| eval Duration_Since=round((_time-First_Error)/86400)
| eval Average_Activity = Active_Days/Duration_Since
| fillnull value=0 Average_Activity
| fieldformat First_Error=strftime(First_Error,"%Y/%m/%d")
____________________________________________
| makeresults | eval message= "Happy Splunking!!!"


sharonmok
Path Finder

Thank you! This did exactly what I wanted. If you don't mind me asking, how did you get a table to run without a table command?


niketn
Legend

@sharonmok table is a transforming command that should be used to present data in the expected order. If you don't really need it (for example, because you have already performed some other transforming command or retained only specific fields), you don't have to run a separate table command. This documentation is a good read: http://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Commandsbytype

The moment I ran a stats command in my query, only the fields referenced in that command were retained, and I then built up the remaining fields as needed.

The reason fieldformat is used in the final pipe instead of eval for converting epoch time to string time is that fieldformat retains the underlying value as epoch and only displays the result as a string. Hence the field can still be used as epoch in subsequent pipes if required.
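To illustrate the difference (a minimal run-anywhere sketch using makeresults, with made-up field names, not tied to your data), compare how eval and fieldformat behave on the same epoch value:

| makeresults
| eval epoch_eval=_time, epoch_ff=_time
| eval epoch_eval=strftime(epoch_eval, "%Y/%m/%d")
| fieldformat epoch_ff=strftime(epoch_ff, "%Y/%m/%d")
| eval plus_one_day_eval=epoch_eval+86400, plus_one_day_ff=epoch_ff+86400

Both epoch_eval and epoch_ff display as date strings, but plus_one_day_eval comes out blank because eval replaced the value with a string, while plus_one_day_ff still works because the underlying value of epoch_ff is still the epoch number.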

Hope these clarify things 🙂

____________________________________________
| makeresults | eval message= "Happy Splunking!!!"

sharonmok
Path Finder

Would you be able to help me out with one more thing? I'm trying to look at only business days, rather than all days. I've found this part of a query but I don't know where to put it in the original query to make it work.

| eval date_wday = tonumber(strftime(_time, "%w"))
| where date_wday>=1 AND date_wday<=5
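One likely placement (a sketch only, not verified against your data): the filter needs _time, so it can go right after the bin, so that weekend buckets are dropped before Active_Days is counted:

<yourBaseSearch>
| eventstats min(_time) as First_Login by client_user
| bin _time span=1d
| eval date_wday = tonumber(strftime(_time, "%w"))
| where date_wday>=1 AND date_wday<=5
| stats min(First_Login) as First_Login by _time client_user
| streamstats count as Active_Days by client_user
| reverse
| dedup client_user
| eval First_Login=strptime(strftime(First_Login,"%Y/%m/%d"),"%Y/%m/%d")
| eval Duration_Since=round((_time-First_Login)/86400)
| eval Average_Activity = Active_Days/Duration_Since
| fillnull value=0 Average_Activity
| fieldformat First_Login=strftime(First_Login,"%Y/%m/%d")

With %w, 0 is Sunday and 6 is Saturday, so 1 through 5 keeps Monday to Friday. Note that Duration_Since would still count calendar days; if you want the denominator in business days too, it would need a separate adjustment.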


sharonmok
Path Finder

Thanks!! That was very useful
