Splunk Dev

How to generate every minute in the query output, even if there are no events

didzej
New Member

Hi All,

we need the query below to emit a row for every minute, even when a given minute has no events; in that case the count should be 0. This is needed to fix the source data for reporting purposes in another tool.
It is important to preserve the grouping hierarchy: hpam_region, stage, host, dataowner_id, datDl.

stage=* sourcetype=nvp_access_logging
| bin _time span=1m
| eval datDl = strftime(_time, "%Y-%m-%d %H:%M:%S")
| dedup _raw
| stats count as httpc by hpam_region, stage, host, dataowner_id, datDl
| fillnull
| table httpc, datDl, dataowner_id, host, stage, hpam_region

Thank You very much!

Cheers,
Damian


didzej
New Member

Hi,

Thank you very much for the response. However, it does not return anything.
Moreover, we cannot use datDI=strftime(now(), "%Y-%m-%d %H:%M:%S"), because we want to fill in the empty records within the time range of the logs, not just at the current time.

Any idea?

Many thanks in advance!


richgalloway
SplunkTrust

Try this. The appendpipe command adds dummy data if there are no events found by the query. Replace 'foo', 'bar', etc. with values that make sense for your data.

stage=* sourcetype=nvp_access_logging
| bin _time span=1m
| eval datDl = strftime(_time, "%Y-%m-%d %H:%M:%S")
| dedup _raw
| fillnull
| appendpipe [ stats count | eval hpam_region="foo", stage="bar", host="baz", dataowner_id="bat", datDl=strftime(now(), "%Y-%m-%d %H:%M:%S") | where count==0 ]
| stats count as httpc by hpam_region, stage, host, dataowner_id, datDl
| table httpc, datDl, dataowner_id, host, stage, hpam_region
---
If this reply helps you, Karma would be appreciated.
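Note that appendpipe as used above only adds a dummy row when the search returns no events at all, so minutes that are empty *within* the searched time range still produce no rows. One way to fill those gaps, sketched here with mvrange and mvexpand (field names taken from the original query; untested, and it only fills minutes between each group's first and last event):

stage=* sourcetype=nvp_access_logging
| bin _time span=1m
| dedup _raw
| stats count as httpc by hpam_region, stage, host, dataowner_id, _time
| appendpipe
    [ stats min(_time) as start, max(_time) as end by hpam_region, stage, host, dataowner_id
      | eval _time=mvrange(start, end+60, 60)
      | mvexpand _time
      | eval httpc=0
      | fields - start, end ]
| stats max(httpc) as httpc by hpam_region, stage, host, dataowner_id, _time
| eval datDl = strftime(_time, "%Y-%m-%d %H:%M:%S")
| table httpc, datDl, dataowner_id, host, stage, hpam_region

The subsearch generates one zero-count row per minute per group; the final stats keeps the real count where one exists (max of the real count and 0) and the 0 otherwise.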