Splunk Search

Real-time alerting with search head pooling

mark
Path Finder

Hi,

We have a distributed environment with 2 search heads in a pool (for LB and HA) running v4.3.0 (upgrading shortly).
When we schedule real-time searches, both search heads start processing the events simultaneously (there are splunkd search processes running on each search head).
Then, when an alert is fired, both search heads trigger the alert (for example, both search heads send an email, even with throttling enabled).

1. Is it correct that both search heads run the scheduled real-time search? What is the benefit of this, as it just seems to put undue load on the environment?

2. Is it possible to restrict this real-time searching to only one of the two search heads?

Thanks,
Mark

kallu
Communicator

Sounds a bit strange. Real-time searches aren't that different from normal searches, and Splunk takes care that only one search head in a pool runs each scheduled search.

1) Search heads don't distribute jobs to other search heads, only to their search peers (a.k.a. indexers). If your search heads are also indexers, then I suppose it's normal that you see some activity on both systems.

2) You can disable all scheduled searches on one search head. I assume this would also disable scheduled real-time searches. See "How does search head pooling work with scheduled searches?"
This might act as a workaround for your problem.
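A minimal sketch of that workaround, assuming the standard default-mode.conf mechanism for disabling the scheduler on a pool member (apply it only on the search head that should NOT run scheduled searches, and restart splunkd afterwards):

```
# $SPLUNK_HOME/etc/system/local/default-mode.conf
# Disables the search scheduler on this search head only.
# Leave the other pool member untouched so it keeps running
# the scheduled (including scheduled real-time) searches.
[pipeline:scheduler]
disabled = true
```

Note this disables all scheduled searches on that host, not just the real-time ones, so make sure the remaining search head can carry the full scheduling load.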

BTW: Are you sure you aren't sending the same data to both of your indexers? I.e., how did you verify that both alerts were triggered by the same SINGLE event?
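One way to check for that kind of duplicate indexing is a search like the sketch below (`yourindex` and `yoursourcetype` are placeholders; `splunk_server` is the standard metadata field naming the indexer that holds each event). It counts how many distinct indexers return an identical raw event:

```
index=yourindex sourcetype=yoursourcetype
| stats dc(splunk_server) AS indexer_count count BY _raw
| where indexer_count > 1
```

If this returns rows, the same events are being indexed on more than one indexer, which would explain two alerts firing for what looks like one event.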
