Splunk Search

How to edit my search to chart the count of how many sources were indexed in the last hour?

shahzadarif
Path Finder

I want to create a scheduled report that counts how many log files we've received in the last hour. This is what I've written:

| tstats dc(source) as source where index="myindex"

It's working well, but I can't figure out how to create a chart from it. I've tried the chart and timechart commands, but I must be doing something wrong. I want it to run against the data for the last 48 hours, with a chart span of 1 hour, so I can see/show how many log files Splunk is getting in each hour.

1 Solution

pradeepkumarg
Influencer
| tstats dc(source) as source where index=my_index by _time span=1h
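A minimal sketch of running this over the 48-hour window from the question; the inline earliest/latest qualifiers are illustrative additions, and my_index stands in for the real index name:

| tstats dc(source) as source where index=my_index earliest=-48h latest=now by _time span=1h

When saved as a scheduled report, the report's time range picker can supply the 48-hour window instead of the inline earliest/latest.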


shahzadarif
Path Finder

Thank you, that's exactly what I was looking for!
My next question which I hope you can help. Would it be possible to re-write the date timestamps its giving in the output? Timestamp is like 2016-09-03T13:00:00.000. I want to re-write them so it displays just the hour for example 13:00 or 13h. In the chart you can't read the time unless you hover your mouse over a part of the chart. Thanks.


pradeepkumarg
Influencer

Now that you have the _time field in your result set, you can use the timechart command. Append the following to your search:

| timechart span=1h avg(source) as source
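If only the hour should appear on the x-axis, one approach is to build a label field with strftime and chart over it instead of _time. A rough sketch, assuming the same my_index index; the hour field name is illustrative:

| tstats dc(source) as source where index=my_index by _time span=1h
| eval hour=strftime(_time, "%H:%M")
| chart values(source) as source by hour

Note that charting by a string field loses Splunk's automatic time-axis handling, so this works best when the results are already in _time order, as tstats output is.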
