Splunk Search

How do you use fillnull to return a count of 0?

michael_ermino_
New Member

I have events with a field called "Date First Found" in the format "%m/%d/%Y". I calculate the number of days since this date and group the events by age. In the example below, the grouping is between 180 and 365 days. For this particular group there are no search results at the moment, so I want to return a count of 0. I tried using fillnull, but this does not work and only returns "No results found".

Am I missing something here?

search query 
| eval dateFirstFound=strptime('Date First Found', "%m/%d/%Y") 
| eval days_open = round((now()-dateFirstFound)/86400) 
| search (days_open>=181 AND days_open <=365) 
| bin span=1d _time 
| stats count by _time 
| fillnull value=0 count

woodcock
Esteemed Legend

Use the default behavior of timechart to create empty time buckets like this:

search query 
| eval dateFirstFound=strptime('Date First Found', "%m/%d/%Y") 
| eval days_open = round((now()-dateFirstFound)/86400) 
| search (days_open>=181 AND days_open <=365) 
| timechart span=1d count

richgalloway
SplunkTrust

While your approach is logical, it doesn't work: fillnull only replaces null values in results that already exist, and here the stats command produces no results at all, so there is nothing to fill. Use appendpipe to append a zero row when the result set is empty.

search query | eval dateFirstFound=strptime('Date First Found', "%m/%d/%Y") 
| eval days_open = round((now()-dateFirstFound)/86400) 
| search (days_open>=181 AND days_open <=365) 
| bin span=1d _time 
| stats count by _time
| appendpipe [ stats count | eval daysCount=0 | where count==0 | fields - count | rename daysCount as count ]
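In case the subsearch isn't obvious, here is the same appendpipe with each step commented. This is just a sketch for readability; the inline ``` ... ``` comment syntax assumes Splunk 8.1 or later, so strip the comments if you are on an older version.

| appendpipe 
    [ stats count ``` counts the rows the main search produced; a single row with count=0 when there were no results ```
    | eval daysCount=0 ``` stage the 0 we want to report ```
    | where count==0 ``` keep the row only when the main search came back empty ```
    | fields - count ``` drop the row-count field ```
    | rename daysCount as count ``` expose the 0 as "count" ``` ]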
---
If this reply helps you, Karma would be appreciated.