
Splunk Alerting rate

dmcintosh1972
Explorer

Hi

I am looking at setting up alerting in Splunk. At the moment I don't know the expected frequency or volume of alerts. Are there any performance issues I should consider? We have a 3-node search head cluster and 4 indexers.

Are the searches spread across the search heads? Is it possible to fix them to a single search head?

Appreciate any advice.


burwell
SplunkTrust

Hello.

Normally the captain helps distribute scheduled searches across the search head members.

To answer your question, you could restrict some of your search heads to run only ad hoc searches: https://docs.splunk.com/Documentation/Splunk/7.2.3/DistSearch/Adhocclustermember
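
As a rough sketch (check that doc page for your version; I'm assuming the 7.2.x setting name here), on a member you want to reserve for ad hoc searches you would set this in server.conf and then restart that member:

    # server.conf on the member that should run only ad hoc searches,
    # so the captain stops scheduling searches on it
    [shclustering]
    adhoc_searchhead = true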

I think it's a good idea to have at least four search heads in your cluster. That way you can take one down without disturbing the cluster.

To keep an eye on your cluster, you can use the search head clustering dashboard: https://docs.splunk.com/Documentation/Splunk/7.2.3/DistSearch/SHCsettings

You can also get a lot of information from the monitoring console. https://docs.splunk.com/Documentation/Splunk/7.2.3/DMC/DMCoverview

I use the monitoring console to alert me when scheduled searches are getting skipped, for example. We also use it to alert us when the captain changes (frequent changes might indicate some kind of problem).
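
The kind of search behind a skipped-search alert looks roughly like this (a sketch only, assuming the default _internal index and the scheduler sourcetype in your environment):

    index=_internal sourcetype=scheduler status=skipped
    | stats count BY savedsearch_name, app, reason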

You can also view the run time of your long-running scheduled tasks, and so on.
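
For example, something like this (again a sketch using the scheduler events in _internal; run_time there is in seconds) shows which scheduled searches run longest:

    index=_internal sourcetype=scheduler status=success
    | stats count avg(run_time) AS avg_runtime_sec max(run_time) AS max_runtime_sec BY savedsearch_name
    | sort - max_runtime_sec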

dmcintosh1972
Explorer

Thanks. I had considered adding a 4th but was looking to fix this particular set of searches to it. If I am reading the doc correctly, I could consider setting a couple of members to adhoc_searchhead = true, then monitor for skipped searches occurring on the remaining 1 (or 2, if I add another) search heads.
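
This is roughly the search I'm thinking of using for that, broken out by search head (a sketch only, assuming the scheduler events in _internal carry the scheduling search head as host):

    index=_internal sourcetype=scheduler status=skipped
    | stats count BY host, savedsearch_name, reason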
