Splunk Search

Error and Warning Count by Hour or date or timestamp

satyajit2007
Explorer

I have my Spark logs in Splunk.

I have 2 Spark streaming jobs running. Each produces logs at different levels (INFO, WARN, ERROR, etc.).

I want to create a dashboard showing the error count by hour, or any better breakdown (please suggest).

index=myindex AND (sourcetype=sparkjob1 OR sourcetype=sparkjob2) | stats count as total_logs count(eval(level="INFO")) as total_errors

Please also advise if you have any better suggestions for a useful dashboard.

 

1 Solution

thambisetty
SplunkTrust
index=myindex (sourcetype=sparkjob1 OR sourcetype=sparkjob2 )  | timechart count as total_logs count(eval(level="ERROR")) as total_errors span=1h
————————————
If this helps, give a like below.
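
The title also asks about warnings; the same count(eval()) pattern extends to any level. A sketch only, with the same assumptions about the extracted level field:

index=myindex (sourcetype=sparkjob1 OR sourcetype=sparkjob2) | timechart span=1h count as total_logs count(eval(level="ERROR")) as total_errors count(eval(level="WARN")) as total_warnings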


satyajit2007
Explorer

Thank you a lot.

1) As I have 2 application sourcetypes, do I need to make separate graphs, or can I do something so I can identify which job each ERROR/INFO/WARN entry belongs to?

2) Also, how can I add text matches such as "error", "failed", and "exception" to the ERROR bucket in the above example?
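
Both follow-ups can be handled in one search. A sketch only (not from the thread), assuming level is an extracted field and that the events of interest contain words such as "error", "failed", or "exception" in their raw text:

index=myindex (sourcetype=sparkjob1 OR sourcetype=sparkjob2) | timechart span=1h count(eval(level="ERROR" OR match(_raw, "(?i)error|failed|exception"))) as errors by sourcetype

Splitting by sourcetype keeps a single chart but draws one series per job, and match() widens the ERROR bucket to events whose raw text mentions those words. With a split-by field, timechart takes a single aggregation, so the total_logs count would need its own panel.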
