Splunk Alert Creation for threshold monitoring?

Path Finder

Hi Friends ,

I want to create an alert for my Hadoop job monitoring that triggers an email to the team, highlighting only jobs that have been running for more than 90 minutes, so that action can be taken on them. I am attaching a screenshot of my query. Please help me modify and fine-tune the query if needed before I proceed to set up the alert.


Ultra Champion
index=hadoopmon_db sourcetype=cm_yarn_metrics_live finalStatus=UNDEFINED elapsedTime > 5400 earliest=-90m@m latest=now

Then configure the alert to trigger if there is more than one result.
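As a rough sketch, that search could be laid out more readably as below. The field names (`elapsedTime`, `finalStatus`, `id`) are assumed from the screenshot, and you should verify whether `elapsedTime` is reported in seconds or milliseconds by your `cm_yarn_metrics_live` sourcetype: 90 minutes is 5400 seconds but 5400000 milliseconds.

```spl
index=hadoopmon_db sourcetype=cm_yarn_metrics_live finalStatus=UNDEFINED earliest=-90m@m latest=now
| where elapsedTime > 5400
| table id, elapsedTime
```

The `table` command here just makes the alert email easier to read; adjust the field list to whatever your events actually contain.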




Below are my suggestions for your scenario. Since you want jobs that have been running for more than 90 minutes:

1) You can filter the events in the search itself.
2) You should set throttling on the id field for a specific period. This helps prevent alert flooding for a single id.
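To support per-id throttling, it helps if the search returns at most one row per job. A minimal sketch, assuming the same field names as above (`id`, `elapsedTime`):

```spl
index=hadoopmon_db sourcetype=cm_yarn_metrics_live finalStatus=UNDEFINED earliest=-90m@m latest=now
| where elapsedTime > 5400
| stats latest(elapsedTime) AS elapsedTime BY id
```

Then, in the alert's trigger settings, enable throttling and set "Suppress results containing field value" to `id` with a suppression period of your choice (e.g. 90 minutes), so each long-running job alerts only once per period rather than on every scheduled run.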

Check the links below for alert configuration:

Alert scheduling tips:
