Splunk Search

Weekly Average Append Subsearch Optimization

jasonmadesomething
Explorer

Hello,
I have a search where I'd like to compare the count of one search result against its running weekly average. This appears to work. However, I'd like to replicate this search across 14+ values of Field. I was building a dashboard with each field value as a separate report, and I couldn't help but wonder whether there was a way to append all the search results together. As written, I know I'll need to add an eval to name each row, but the search also runs incredibly slowly. Both searches share the portion "foo" and differ only by their Field value. Is there any way to reuse the search results in parallel like this, or is my general approach wrong?

foo Field="Bar" earliest=-7d latest=@h
| timechart span=1h count
| eval StartTime=relative_time(now(), "-24h@h")
| eval Series=if(_time>=StartTime, "Todays ", "Average ")
| eval Hour=strftime(_time, "%H")
| chart avg(count) by Hour Series
| where Hour=strftime(now(), "%H")
| append 
    [search foo Field="Baz" earliest=-7d latest=@h
| timechart span=1h count
| eval StartTime=relative_time(now(), "-24h@h")
| eval Series=if(_time>=StartTime, "Todays ", "Average ")
| eval Hour=strftime(_time, "%H")
| chart avg(count) by Hour Series
| where Hour=strftime(now(), "%H")]

DalJeanis
SplunkTrust

tl;dr - try the last method first, as it's the most efficient if your date extractions are properly set up.


Here's one way:

earliest=-7d@h latest=@h
index=foo (Field="Bar" OR Field="Baz")
 | timechart span=1h count by Field
 | eval StartTime=relative_time(now(), "-24h@h")
 | eval Series=if(_time>=StartTime, "Todays ", "Average ").Field
 | eval Hour=strftime(_time, "%H")
 | chart avg(count) by Hour Series
 | where Hour=strftime(now(), "%H")

By the way, the above where clause will tend to pick up data for the hour that has not yet completed. You might want to use

| where Hour=strftime(relative_time(now(),"@h"),"%H")

or

| addinfo
| where Hour=strftime(info_max_time-.0001,"%H")
| fields - info_*
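With latest set to @h as in the searches above, you can sanity-check what each expression returns with a quick standalone makeresults sketch (this assumes the time range picker supplies the @h latest bound):

| makeresults
| addinfo
| eval now_hour=strftime(now(),"%H")
| eval last_complete_hour=strftime(info_max_time-.0001,"%H")
| table now_hour last_complete_hour

The first column is the current (incomplete) hour; the second steps a fraction of a second back from the search's upper bound, landing in the last complete hour.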

However, since you only need one hour's worth, it might be better to do it this way:

earliest=-7d@h latest=@h
index=foo (Field="Bar" OR Field="Baz")
 | bin _time span=1h
 | eval Hour=strftime(_time, "%H")
 | where Hour=strftime(relative_time(now(),"@h"), "%H")
 | stats count first(Hour) as Hour by _time Field
 | eval Series=if(_time>now()-86400, "Todays ", "Average ").Field
 | chart avg(count) as count by Hour Series

You may need a table command after that to put the columns in the desired order, but those are your required fields.
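For example, with the Bar and Baz values above, the Series concatenation produces one column per combination, so the ordering step might look like this (column names here are illustrative and follow from the Series eval):

 | table Hour "Average Bar" "Todays Bar" "Average Baz" "Todays Baz"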


And, one more update to make it as efficient as possible. Check one record in verbose mode to see whether it has the field date_hour. If so, your code simplifies immensely. This version only retrieves records for the particular hour you are interested in, and avoids recalculating anything.

earliest=-7d@h latest=@h
index=foo (Field="Bar" OR Field="Baz")
    [| makeresults | addinfo | eval date_hour=strftime(info_max_time-.001,"%H") | table date_hour]
 | fields Field date_hour
 | bin _time span=1h
 | stats count first(date_hour) as Hour by _time Field
 | eval Series=if(_time>now()-86400, "Todays ", "Average ").Field
 | chart avg(count) as count by Hour Series

As with the last version, you may need a table command after that to put the columns in the desired order, but those are your required fields.


niketn
Legend

@jasonmadesomething, what is the purpose of your SPL, and why do you need two searches? Are you querying different data sets (like index/sourcetype) or different timelines? It seems like you don't need a subsearch at all.


to4kawa
Ultra Champion

Try untable after timechart count by Field.
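One way to read that suggestion, reusing the fields from the original search (a sketch, not tested against your data): untable turns the wide timechart output back into one row per _time/Field pair, after which the rest of the original pipeline applies unchanged.

foo (Field="Bar" OR Field="Baz") earliest=-7d latest=@h
| timechart span=1h count by Field
| untable _time Field count
| eval StartTime=relative_time(now(), "-24h@h")
| eval Series=if(_time>=StartTime, "Todays ", "Average ").Field
| eval Hour=strftime(_time, "%H")
| chart avg(count) by Hour Series
| where Hour=strftime(now(), "%H")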
