Alerting

How to configure an alert when the Indexer cluster restart takes more than 1 hour to complete.

Arpit_S
Path Finder

We want to create an alert in our instance that triggers only when the indexers take more than 60 minutes to complete a rolling restart.

We currently have an alert that checks indexer status over the last 15 minutes and sends the list of indexers that were down in that period. This causes ambiguity, because the alert fires every time a rolling restart is performed.

Query:
index=_internal component=CMPeer to=StreamingError host=xxx | stats dc(peer_name) as Count, values(peer_name) as Indexer

We want this to trigger only when an indexer has been in that state for at least an hour.

Thanks in advance,
Arpit

1 Solution

nickhills
Ultra Champion

Sorry, I initially misread the question. Try this:

... | eventstats earliest(_time) as et | eval moreThanHour=if(et < (now() - 3600), "yes", "no") | search moreThanHour="yes" | ...

So using your query:

index=_internal component=CMPeer to=StreamingError host=xxx | eventstats earliest(_time) as et | eval moreThanHour=if(et < (now() - 3600), "yes", "no") | search moreThanHour="yes" | stats dc(peer_name) as Count, values(peer_name) as Indexer
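Note that eventstats takes the earliest event time across all peers in the search window, so a single long-standing error can flag every peer the search returns. If you would rather apply the one-hour threshold per indexer, a minimal variant of the same idea (same fields as above; the firstSeen name is just illustrative, and this is untested against your data) would be:

index=_internal component=CMPeer to=StreamingError host=xxx | stats earliest(_time) as firstSeen by peer_name | where firstSeen < now() - 3600 | stats dc(peer_name) as Count, values(peer_name) as Indexer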

You will need to make sure your search window is more than 60 minutes; otherwise the condition will never be met.
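For example, a rough savedsearches.conf sketch for scheduling this alert (the stanza name, 15-minute cron schedule, and 90-minute lookback are illustrative assumptions, not from this thread):

[indexer_rolling_restart_over_1h]
# Illustrative values; adjust the schedule and window for your environment
search = index=_internal component=CMPeer to=StreamingError host=xxx | eventstats earliest(_time) as et | eval moreThanHour=if(et < (now() - 3600), "yes", "no") | search moreThanHour="yes" | stats dc(peer_name) as Count, values(peer_name) as Indexer
cron_schedule = */15 * * * *
dispatch.earliest_time = -90m
dispatch.latest_time = now
enableSched = 1
counttype = number of events
relation = greater than
quantity = 0

The -90m lookback keeps the search span comfortably above the 60-minute threshold, as described above.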

If my comment helps, please give it a thumbs up!
