Error and Warning Count by Hour or Date or Timestamp

satyajit2007
Explorer

I have my Spark logs in Splunk.

I have 2 Spark streaming jobs running. They produce logs at different levels (INFO, WARN, ERROR, etc.).

I want to create a dashboard showing the error count by hour (or any better way, please suggest).

index=myindex AND (sourcetype=sparkjob1 OR sourcetype=sparkjob2) | stats count as total_logs count(eval(level="INFO")) as total_errors

Please also advise if you have any better suggestions for a useful dashboard.

 

thambisetty
SplunkTrust
index=myindex (sourcetype=sparkjob1 OR sourcetype=sparkjob2) | timechart span=1h count as total_logs count(eval(level="ERROR")) as total_errors
————————————
If this helps, give a like below.

satyajit2007
Explorer

Thank you a lot.

1) As I have 2 application sourcetypes, do I need to make separate graphs? Is there a way to identify which job each ERROR/INFO/WARN event belongs to?

2) Also, how can I add text matches like "error", "failed", and "exceptions" to the ERROR bucket in the above example? (A sketch addressing both questions follows below.)
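
A possible approach for both points, assuming the same index, sourcetypes, and level field used in the accepted answer: splitting the timechart by sourcetype keeps a single chart but gives each job its own series, so separate graphs are not needed:

index=myindex (sourcetype=sparkjob1 OR sourcetype=sparkjob2) | timechart span=1h count(eval(level="ERROR")) by sourcetype

To widen the ERROR bucket to events that merely mention error, failed, or exception, the eval condition can also match keywords in the raw event (the regex below is only an illustration and would need to be adjusted to the actual log format):

index=myindex (sourcetype=sparkjob1 OR sourcetype=sparkjob2) | timechart span=1h count(eval(level="ERROR" OR match(_raw, "(?i)error|failed|exception"))) by sourcetype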
