Splunk Search

Timespan-based output

poddraj
Explorer

Hi
Can someone help me get output over a 1-hour interval that includes a Total request count, a Success count, and a Failure count?

I have written the query below, but I am not getting the Total count as a separate column for each 1-hour interval.

index=dte_fios sourcetype=dte2_Fios FT=*FT
| eval Interval=strftime('_time',"%d-%m-%Y %H:%M:%S")
| eval Status=case(Error_Code=="0000","Success",1=1,"Failure")
| timechart span=1h count by Status

It gives this output:

_time               Success   Failure
2020-04-20 05:00        120        90

I need the output to be:

_time               Total   Failure   Success
2020-04-20 05:00      210        90       120

1 Solution

richgalloway
SplunkTrust

You're not getting a Total count because your query never computes one.

index=dte_fios sourcetype=dte2_Fios FT=*FT 
| eval Status=case(Error_Code=="0000","Success",1=1,"Failure")
| timechart span=1h count by Status
| eval Total = Success + Failure
| table _time Total Failure Success
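
If you'd rather not add the columns by hand, addtotals can produce the same Total by summing the numeric fields in each row, and fillnull helps if one Status value never occurs at all in the search range (a missing Failure column would otherwise leave Total blank in the first approach). A rough variant along those lines, untested:

index=dte_fios sourcetype=dte2_Fios FT=*FT 
| eval Status=case(Error_Code=="0000","Success",1=1,"Failure")
| timechart span=1h count by Status
| fillnull value=0 Success Failure
| addtotals fieldname=Total Success Failure
| table _time Total Failure Success

fieldname=Total just names the row-total column; addtotals uses Total by default anyway.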
---
If this reply helps you, Karma would be appreciated.


poddraj
Explorer

Thanks....
