We have a question about a Splunk alert that gets triggered at night and sends us false alarms. Splunk instance - https://<<InstanceNameHere>>:8000
The alert search is scheduled to run every 5 minutes and send us an alarm when it sees no events. What is happening is that between 2-3 AM CT, Splunk starts sending us false alarms because the search did not see any events. But when we look back at that time frame afterwards, the events are there.
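For context, the alert was built in the UI, but it is roughly equivalent to the savedsearches.conf sketch below (the stanza name, index, sourcetype, and email address are placeholders, not our real values):

[no_events_alert]
search = index=our_index sourcetype=our_sourcetype
cron_schedule = */5 * * * *
dispatch.earliest_time = -5m
dispatch.latest_time = now
enableSched = 1
counttype = number of events
relation = equal to
quantity = 0
action.email = 1
action.email.to = ops-team@example.com

The key point is that the trigger condition fires whenever the 5-minute search window contains zero events.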
This leads us to conclude that either the events are delayed in reaching Splunk, or the alert search itself cannot see those events at the time it runs (which would be very strange, and we don't know how that could happen).
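One check we are considering to confirm the delayed-arrival theory is to re-run the problem window constrained by index time rather than event time, so late-arriving events show up together with their measured lag (a sketch only; the base search is a placeholder):

index=our_index sourcetype=our_sourcetype _index_earliest=-15m _index_latest=now
| eval delay_sec = _indextime - _time
| where delay_sec > 300
| stats count, min(delay_sec), avg(delay_sec), max(delay_sec) by host

If this returns rows during the 2-3 AM window, the events existed but were indexed more than five minutes after their event time, which would explain why the alert's 5-minute window looked empty.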
While investigating, we came across a troubleshooting doc, https://docs.splunk.com/Documentation/Splunk/7.0.2/Troubleshooting/Troubleshootingeventsindexingdelay . As it suggests, we measured the indexing delay with eval delay_sec=_indextime-_time | timechart min(delay_sec) avg(delay_sec) max(delay_sec) by host (full search below) and found that the delays are noticeably higher between 2-3 AM CT than at other times of the day.
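For reference, the full search we ran looked roughly like this (the base search is a placeholder; we scoped it to the hosts that feed this alert):

index=our_index sourcetype=our_sourcetype
| eval delay_sec = _indextime - _time
| timechart min(delay_sec) avg(delay_sec) max(delay_sec) by host

Here delay_sec is the number of seconds between the event's timestamp (_time) and the time it was actually written to the index (_indextime), so the spike in max(delay_sec) around 2-3 AM lines up with the false alarms.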
We checked the thruput limit configured on this Splunk instance and found it is set to 512 KBps. We are not sure whether this is the cause of the problem. We are still researching, but if anyone has seen this before or knows what is going on, please let us know how to fix the issue.
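For context, that 512 KBps value is the maxKBps setting in the [thruput] stanza of limits.conf. This is roughly how we checked it, and how it could be raised if it turns out to be the bottleneck (a sketch only; we have not changed anything yet):

$SPLUNK_HOME/bin/splunk btool limits list thruput --debug

limits.conf (e.g. in $SPLUNK_HOME/etc/system/local/):

[thruput]
maxKBps = 512

Setting maxKBps = 0 removes the cap entirely, but we would rather understand why the data only backs up between 2-3 AM before touching it.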