Getting Data In

Show all buckets of _time in visualization including zero counts

jcrochon
Explorer

Hello Splunker,

I'm trying to display a 'timestamp, event count' visualization that includes buckets with zero counts:

host="[redacted]" src_port=* | bucket _time span=1h | stats count by _time | eval _time=strftime(_time, "%Y-%m-%d %H:%M")

Results
2017-09-05 12:00 2
2017-09-05 13:00 1
2017-09-05 16:00 1

Expected Results
2017-09-05 12:00 2
2017-09-05 13:00 1
2017-09-05 14:00 0
2017-09-05 15:00 0
2017-09-05 16:00 1


DalJeanis
Legend

Try this - the appendpipe subsearch generates a zero-count row for every hour between your first and last event, and the final stats merges those rows with the real counts:

host="[redacted]" src_port=* 
| bucket _time span=1h 
| stats count as mycount by _time 
| appendpipe 
    [| stats min(_time) as mintime max(_time) as maxtime 
     | eval maxtime=maxtime+1 
     | eval mytime=mvrange(mintime, maxtime, 3600) 
     | mvexpand mytime 
     | eval _time=mytime 
     | eval mycount=0 
     | table _time mycount
     ]
| stats sum(mycount) as mycount by _time 
| eval _time=strftime(_time, "%Y-%m-%d %H:%M")
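As a simpler alternative (my own suggestion, not part of the accepted answer - verify it fits your use case), timechart fills empty buckets with zeros by default, so a sketch like this should also work; note it fills the entire search time range rather than just the span between your first and last event:

host="[redacted]" src_port=* 
| timechart span=1h count 
| eval _time=strftime(_time, "%Y-%m-%d %H:%M")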

