Dashboards & Visualizations

Count an event multiple times (once per day) spanning a time range within the event

mccurity
Explorer

I'm trying to use timechart (which may be the wrong approach) to count events for each day that were "active" over a period of time.  For example the data would be:

user    session   first_seen                  last_seen
user1   137271    2020-09-13T00:39:40.079Z    2020-09-24T00:56:30.941Z
user1   137264    2020-09-13T13:17:10.052Z    2020-09-25T13:19:37.342Z
user1   137272    2020-09-13T13:48:24.513Z    2020-09-25T13:27:27.663Z
user2   137272    2020-09-16T02:45:28.436Z    2020-09-24T13:21:27.215Z
user2   137267    2020-09-18T13:03:01.847Z    2020-09-25T13:18:05.927Z
user3   137272    2020-09-13T13:04:52.235Z    2020-09-25T13:07:02.422Z

 

Resulting in (for use in some sort of timechart-like graph, or maybe even a bar chart):

Date (x axis)    Count (y axis)
2020-09-13       4
2020-09-14       4
2020-09-15       4
2020-09-16       5
2020-09-17       5
...              ...
2020-09-24       6
2020-09-25       4

 

One thought that came to mind was creating and then expanding a multivalue field with a value for each day between the first and last dates. Though I'm not sure how to accomplish that, if it's even doable.
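That expand-then-count idea is workable in principle. As a language-neutral sketch of the logic (plain Python rather than SPL, with the sample rows above hard-coded; this only illustrates the shape of the computation, not how to do it in Splunk):

```python
from datetime import date, timedelta

# Sample sessions from the table above: (user, session, first_seen, last_seen),
# with timestamps truncated to calendar dates
sessions = [
    ("user1", 137271, date(2020, 9, 13), date(2020, 9, 24)),
    ("user1", 137264, date(2020, 9, 13), date(2020, 9, 25)),
    ("user1", 137272, date(2020, 9, 13), date(2020, 9, 25)),
    ("user2", 137272, date(2020, 9, 16), date(2020, 9, 24)),
    ("user2", 137267, date(2020, 9, 18), date(2020, 9, 25)),
    ("user3", 137272, date(2020, 9, 13), date(2020, 9, 25)),
]

# Expand each session into one row per calendar day it was active,
# then count rows per day -- the multivalue/mvexpand idea in miniature
counts = {}
for user, session, first, last in sessions:
    day = first
    while day <= last:
        counts[day] = counts.get(day, 0) + 1
        day += timedelta(days=1)

for day in sorted(counts):
    print(day.isoformat(), counts[day])
# 2020-09-13 -> 4, 2020-09-16 -> 5, 2020-09-24 -> 6, 2020-09-25 -> 4
```

Running this against the sample data reproduces the desired table above.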

I've also thought a couple of times about basing something on a calculated duration, but that may be challenging given the varying first and last times.

Or maybe there's a precanned app out there I'm not finding?

1 Solution

ITWhisperer
SplunkTrust

OK this is a bit complicated, and possibly not very efficient, but it seems to give the results you want

-- your search
| eval firstday=strptime(first_seen,"%Y-%m-%d")
| eval lastday=strptime(last_seen,"%Y-%m-%d")
| eval interveningdays=round((lastday-firstday)/(60*60*24))+1
| streamstats count as row 
| streamstats sum(interveningdays) as activedays
| sort - activedays
| append [|makeresults count = 1|eval activedays=1| fields - _time]
| sort activedays
| makecontinuous activedays
| sort - activedays
| streamstats min(row) as row
| streamstats values(user) as user values(session) as session values(firstday) as firstday by row
| sort activedays
| streamstats count as userday by row
| eval _time=firstday+((userday - 1)*(60*60*24))
| stats count by _time
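To unpack the key arithmetic in this search: strptime with "%Y-%m-%d" keeps only the date portion of each timestamp, so (lastday - firstday) is a whole number of days, and the +1 makes the span inclusive of both endpoints. The later eval, _time = firstday + (userday - 1) * 86400, then walks those rows forward one day at a time. A small sketch of the span calculation (plain Python; the string truncation is an assumption mirroring the SPL's partial-format parse):

```python
from datetime import datetime

SECONDS_PER_DAY = 60 * 60 * 24

def intervening_days(first_seen: str, last_seen: str) -> int:
    # Mirror the SPL: strptime(x, "%Y-%m-%d") effectively keeps only the
    # leading date, so truncate the ISO timestamp to its first 10 characters
    first = datetime.strptime(first_seen[:10], "%Y-%m-%d")
    last = datetime.strptime(last_seen[:10], "%Y-%m-%d")
    # round((lastday - firstday) / 86400) + 1, inclusive of both endpoints
    return round((last - first).total_seconds() / SECONDS_PER_DAY) + 1

# 2020-09-13 through 2020-09-24 is 12 calendar days, inclusive
print(intervening_days("2020-09-13T00:39:40.079Z", "2020-09-24T00:56:30.941Z"))
```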

 


mccurity
Explorer

Thanks. I did run into issues with having too much data: a lot of users, with a lot of sessions each, running for months, resulting in too many results.


ITWhisperer
SplunkTrust

You could try creating a summary index, say once a week, and then do your main query using this index. If a session is active, will there be at least one event every day? It might make life easier if there is.


