Splunk Search

stats count for different days in separate fields

maxmukimov
Explorer

Hi, 

I'm trying to get the product count for yesterday and for the day 7 days before yesterday in two separate fields. The results come back correct for yesterday, but for the second field all the results are zero. I wanted to check whether my logic is correct.

Here is what I have:

index = something host = something 
| where ResponseCode = "Success"
| stats count as "Product Count Yesterday", 
        count(eval(relative_time(now(), "-8d@d"))) as "Product Count 7 days ago" by product
| sort product desc 

 

Thank you. 

1 Solution

bowesmana
SplunkTrust

Just for fun, to show you how many ways there are to achieve the same goal with Splunk, here are two more ways you can do it. There is a performance consideration - see the comments at the end.

index = something host = something ResponseCode="Success" earliest=-8d@d latest=@d
| bin _time span=1d
| stats count as ProductCount by product, _time
| where _time=relative_time(now(),"-d@d") OR _time=relative_time(now(),"-8d@d")
| eval when=if(_time=relative_time(now(),"-d@d"), "Yesterday", "Last Week")
| eval "Product Count {when}"=ProductCount
| fields - _time ProductCount when
| stats values(*) as * by product
| sort - product

I have included the date ranges in the search itself. This will bin the counts by day for the last 8 days, filter to only yesterday and 8 days ago, and then do the field naming for the counts at the end.

Here is a similar approach, which does the count evaluation in the stats command itself:

index = something host = something ResponseCode="Success" earliest=-8d@d latest=@d
| bin _time span=1d
| stats count(eval(_time=relative_time(now(),"-d@d"))) as "Product Count Yesterday" count(eval(_time=relative_time(now(),"-8d@d"))) as "Product Count Last Week" by product, _time
| where _time=relative_time(now(),"-d@d") OR _time=relative_time(now(),"-8d@d")
| fields - _time 
| stats values(*) as * by product
| eval "Product Count Yesterday"=mvfilter('Product Count Yesterday'>0), "Product Count Last Week"=mvfilter('Product Count Last Week'>0)
| sort - product

 

From a performance point of view, the second is less efficient: it evaluates relative_time for every event in the stats, which is not necessary, since that comparison can be deferred until after the stats.
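To illustrate deferring that work, the per-event comparison in the second search could instead be done once per aggregated row after the stats - which is essentially what the first search does. A sketch along those lines (untested; the yesterday and last_week helper fields are my own names):

index = something host = something ResponseCode="Success" earliest=-8d@d latest=@d
| bin _time span=1d
| stats count as ProductCount by product, _time
| eval yesterday=relative_time(now(),"-d@d"), last_week=relative_time(now(),"-8d@d")
| where _time=yesterday OR _time=last_week

Here relative_time runs once per product/day row rather than once per raw event.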

Also, the second will drop any product counts where the count is 0, as the final eval removes 0 from the multivalue counts.
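If you need to keep genuine zero counts, one option (a sketch, untested) is to replace the values()/mvfilter combination at the end of the second search with max(), since the placeholder rows contribute 0 and max() only discards a 0 when a larger real count exists for that product:

| stats max("Product Count Yesterday") as "Product Count Yesterday" max("Product Count Last Week") as "Product Count Last Week" by product

A product with a real count of 0 on one of the days then keeps that 0 instead of being dropped.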

In my tests, the first example consistently takes ~9 seconds for 1.3m events, whereas the second takes ~13 seconds for the same.


maxmukimov
Explorer

Thank you @bowesmana 

The first option seems to work; however, I'm not getting separate columns for Yesterday and Last Week - the results for both days are displayed in one column. Screenshot attached.

[screenshot: 1.PNG]


bowesmana
SplunkTrust
SplunkTrust

@maxmukimov 

Can you post your exact search? I can see that the 'Product Count' field name is there, which means the field assignment is happening, but without the value of {when}. In my example, when can never be empty, so that shouldn't happen...

Thanks

 


maxmukimov
Explorer

@bowesmana 
There was a typo. Instead of:

| stats count as ProductCount by product, _time

I typed:

| stats count as "Product Count" by product, _time

 

Thank you! 


richgalloway
SplunkTrust
SplunkTrust

If stats count is returning the right value for yesterday, then your time picker must be set to yesterday. That means no events will be read for last week, so the second number will be zero.

Changing the time window to 8 days ago will break the first count.

Try this query

index = something host = something earliest=-8d@d
| where ResponseCode = "Success"
| eval period=case(_time>=relative_time(now(), "-1d@d"), "yesterday", 1==1, "last week")
| stats count(eval(period=="yesterday")) as "Product Count Yesterday", 
        count(eval(period=="last week")) as "Product Count 7 days ago" by product
| sort product desc
---
If this reply helps you, Karma would be appreciated.

maxmukimov
Explorer

Thank you. 
