Alerting

Splunk stopped sending email alerts for some alerts

sheaross
Explorer

Splunk is sending email alerts for some of my alerts, but not all of them.  I have scheduled alerts that run each day at specific times, and their searches complete in 1-10 seconds.  Nothing has changed in the Splunk environment.  I ran this search:  index=_* (ERR* OR FAIL* OR WARN* OR CANNOT) (email OR sendemail).  It returns 9 results, and I found this error:  ERROR:root:(452, '4.3.1 Insufficient system resources (UsedDiskSpace[E:\\Program Files\\Microsoft\\Exchange Server\\V15\\TransportRoles\\data\\Queue])').

I've checked with IT and they stated there are no issues with the Exchange server, but as I said above, some alerts work and others do not.  Any guidance you can provide would be great.
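In case it helps, here is a narrower version of that search I can run to pull just the send failures with their timestamps (a rough sketch; I'm assuming those errors are coming from python.log in the _internal index, which is where sendemail errors normally land):

index=_internal source=*python.log* (ERROR OR WARN*) (email OR sendemail)
| sort - _time
| table _time host source _raw

That at least gives the exact times each send attempt failed so I can compare them against the report schedule.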


vikramyadav
Contributor

Hi @sheaross,

Before digging into the errors, can you check whether your alert has actually been triggered or not?
You can check from here: http://yourserver:8000/en-US/alerts/search

If you can see that your alert has been triggered and you are still not receiving the email, kindly let me know.
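You can also confirm from the scheduler logs whether the email action was actually dispatched for each run. Something along these lines should work (just a sketch; savedsearch_name, status, and alert_actions are the standard fields in the scheduler sourcetype, adjust to your setup):

index=_internal sourcetype=scheduler alert_actions=*email*
| stats count AS total_runs count(eval(status="success")) AS successful_runs latest(_time) AS last_run by savedsearch_name
| convert ctime(last_run)

If a report shows successful runs here but the email never arrives, the problem is most likely on the mail-server side rather than in Splunk.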

--------------------------------------------------------

If this helps, a like will be appreciated 😊

sheaross
Explorer

Sorry about that, these are more reports than alerts.  These reports send an email at a scheduled time of day.  Some work and some do not.  I've looked at the Job Manager and these reports have been executed, but some of the emails were not sent while others were.
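Since the 452 4.3.1 error in my first post mentions UsedDiskSpace on the Exchange transport queue drive, and from what I've read that is what Exchange returns when it throttles mail because free disk space on the queue drive is low, I'm going to try lining up the times of those errors against the report run times to see whether the missing emails coincide with that condition. A rough search for that (again assuming the same scheduler and python.log data in _internal):

index=_internal ((sourcetype=scheduler alert_actions=*email*) OR (source=*python.log* "Insufficient system resources"))
| eval series=if(sourcetype=="scheduler", savedsearch_name, "smtp_452_error")
| timechart span=15m count by series

If the reports that never arrive always overlap with the 452 errors, that would point back at the Exchange side even though IT says the server itself is fine.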
