Splunk Search

How can I group counts by time ranges in a row?

OliverG91
Explorer

For a certain time range, I want to group together the counts in a single row, divided into equal time slices.
For example, for "-15m" I want to see 5-minute counts, something like this:

index   Last15MinCount   Last10MinCount   Last5MinCount
APP1    100              123              345
APP2    32               55               60

The idea is to compare Last5MinCount against the average of Last15MinCount and Last10MinCount. I could not find a suitable way of simplifying my query, so I ended up with this (note: the time modifiers should end in "@m" to snap to the minute, but this forum turns "@" into member links):

index=* earliest=-15m
| rex field=message "(?i)(?<ORG>[C]{0,1}+MS\w*)+(?i)\.(?<ENV>[dev|test|prod]+(-pci){0,1})\.+(?i)(?<APP>[\w-]+)"
| rex field=_raw ".*(?<level>LEVEL)[\s\S]{0,5}(?<code>FATAL|ERROR|WARN|DEBUG|INFO).*"
| eval time15=relative_time(now(), "-15m")
| eval time10=relative_time(now(), "-10m")
| eval time05=relative_time(now(), "-05m")
| eval time00=relative_time(now(), "-00m")
| eval etime=_time
| eval Time=case(tonumber(etime)>tonumber(time15) AND tonumber(etime) <= tonumber(time10), "Last15",
tonumber(etime)>tonumber(time10) AND tonumber(etime) <= tonumber(time05), "Last10",
tonumber(etime)>tonumber(time05) AND tonumber(etime) <= tonumber(time00), "Last05")
| stats count(eval(Time=="Last15")) AS Last15
count(eval(Time=="Last10")) AS Last10
count(eval(Time=="Last05")) AS Last05
by APP

This gives me the desired rows. My question is about these lines:

| eval time15=relative_time(now(), "-15m")
| eval Time=case(tonumber(etime)>tonumber(time15) AND tonumber(etime) <= tonumber(time10), "Last15"
and
| stats count(eval(Time=="Last15")) AS Last15

These can probably be combined into one step, but I could not find the most appropriate function.

Any help in simplifying this would be appreciated. Thanks!
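For illustration, the closest I got to a single step is something like this (an untested sketch using the same fields as above; it assumes relative_time() and now() can be evaluated inside the stats eval, and that APP has already been extracted by the rex):

index=* earliest=-15m
| stats count(eval(_time>relative_time(now(), "-15m") AND _time<=relative_time(now(), "-10m"))) AS Last15
count(eval(_time>relative_time(now(), "-10m") AND _time<=relative_time(now(), "-5m"))) AS Last10
count(eval(_time>relative_time(now(), "-5m"))) AS Last05
by APP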

1 Solution

tread_splunk
Splunk Employee
index=_internal 
| timechart count span=5m by sourcetype partial=f limit=0 
| streamstats count 
| eval timespan=case(count==1,"1.Last15MinCount",count==2,"2.Last10MinCount",count==3,"3.Last5MinCount") 
| table timespan,* 
| untable timespan,sourcetype,count 
| xyseries sourcetype,timespan,count

Is this along the right lines? Run this query over the last 20 minutes. "partial=f" is useful on the timechart to suppress the first and last partial time bins.
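Adapted to your search, it might look something like this (a sketch, not tested against your data; it assumes your APP field has already been extracted by your rex, and that the search runs over the last 20 minutes so the first partial bin is dropped):

index=* earliest=-20m
| timechart count span=5m by APP partial=f limit=0
| streamstats count
| eval timespan=case(count==1,"1.Last15MinCount",count==2,"2.Last10MinCount",count==3,"3.Last5MinCount")
| table timespan,*
| untable timespan,APP,count
| xyseries APP,timespan,count

The untable/xyseries pair pivots the timechart output so each APP becomes a row and each labelled time bucket becomes a column, which matches the table in your example.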


tread_splunk
Splunk Employee

I'm assuming you want Last15MinCount to be the number of events that arrived between 15 and 10mins ago, Last10MinCount to be the number of events that arrived between 10 and 5 mins ago, and Last5MinCount is self-explanatory.  Looks that way from your example.



OliverG91
Explorer

Yes, Last15MinCount is the count between 10 and 15 minutes ago.

This definitely makes it a lot simpler, thank you very much!


OliverG91
Explorer

P.S. I had some typos in the variable names, and the "| rex" lines are not relevant to the question. Please ignore those.
