Splunk Search

How to get a count of events for each 30 minutes

srinivasup
Explorer

Hi,

I'm using this query to look at the last 4 hours of data:

index=_internal | timechart span=30m count

It returns:

_time                 count
2017-05-02 15:30:00   1430929
2017-05-02 16:00:00   3590625
2017-05-02 16:30:00   3594519
2017-05-02 17:00:00   3579337
2017-05-02 17:30:00   3552906
2017-05-02 18:00:00   1748658
2017-05-02 18:30:00   0
2017-05-02 19:00:00   0
2017-05-02 19:30:00   0

The present time is 19:30, but when we click on the last bucket, 2017-05-02 19:30:00, the drilldown covers 2017-05-02 19:30:00 to 2017-05-02 20:00:00. It should not look for events after 19:30, so this gives the wrong count.

Can anyone help us?
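As a sanity check (a sketch only, assuming the same index and the same "Last 4 hours" time picker), the actual span of events being counted can be confirmed by looking at the first and last event timestamps in the window:

index=_internal 
| stats min(_time) AS first_event max(_time) AS last_event 
| eval first_event=strftime(first_event, "%Y-%m-%d %H:%M:%S"), last_event=strftime(last_event, "%Y-%m-%d %H:%M:%S")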


DalJeanis
Legend

These two get the same results, but the first is MUCH faster on my system...

earliest=-4h@h latest=@h index=_internal 
| bin _time span=30m
| stats count by _time

earliest=-4h@h latest=@h index=_internal 
| bin _time span=30m
| timechart count 

I did notice that timechart takes a long time to render, delivering a few hundred thousand events at a time, whereas stats gave all of its results at once. Your mileage may vary.
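If raw event counts are all that is needed, a tstats search (a further suggestion, not something either query above uses) is usually faster still, since it reads index metadata rather than the events themselves:

| tstats count where index=_internal earliest=-4h@h latest=@h by _time span=30m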


somesoni2
Revered Legend

Your last 4 hr time range includes time beyond 19:30 (even if only a single minute has passed, there will be a bucket for 19:30). To avoid that, you can include partial=f in your timechart command.

index=_internal |timechart span=30m partial=f count 
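For example, with the original 4-hour window spelled out explicitly (an illustration only; adjust the index and range to match your search):

index=_internal earliest=-4h latest=now | timechart span=30m partial=f count

With partial=f, the incomplete bucket at the end of the range (19:30 onward in your example) is simply dropped rather than reported.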

See documentation for more details on different options.
http://docs.splunk.com/Documentation/Splunk/latest/SearchReference/timechart
