Splunk Search

Error and Warning Count by Hour, Date, or Timestamp

satyajit2007
Explorer

I have my Spark logs in Splunk.

I have 2 Spark streaming jobs running. They produce logs at different levels (INFO, WARN, ERROR, etc.).

I want to create a dashboard for the error count by hour (or any better approach, please suggest).

index=myindex AND (sourcetype=sparkjob1 OR sourcetype=sparkjob2) | stats count as total_logs count(eval(level="INFO")) as total_errors

Please also advise if you have any better suggestions for a useful dashboard.



thambisetty
SplunkTrust
index=myindex (sourcetype=sparkjob1 OR sourcetype=sparkjob2 )  | timechart count as total_logs count(eval(level="ERROR")) as total_errors span=1h
————————————
If this helps, give a like below.

satyajit2007
Explorer

Thank you a lot.

1) As I have 2 application source types, do I need to make separate graphs, or can I do something to identify which job the ERROR/INFO/WARN events belong to?

2) Also, what if I need to add some text like error, failed, or exceptions to the ERROR bucket in the above example?
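
Not part of the accepted answer, but one possible sketch covering both follow-ups, assuming the level field is already extracted and the raw event text is available in _raw:

index=myindex (sourcetype=sparkjob1 OR sourcetype=sparkjob2) | eval is_error=if(level="ERROR" OR match(_raw, "(?i)error|failed|exception"), 1, 0) | timechart span=1h sum(is_error) as total_errors by sourcetype

The by sourcetype split draws one series per streaming job on a single chart instead of separate graphs, and the match() against _raw folds events containing error, failed, or exception into the error count. Because timechart allows only a single aggregation together with a BY clause, total_logs would need its own panel (for example, the accepted query above without the split).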
