Alerting

Is there a way to continue generating alert until a specific search condition is met?

Contributor

Hi,

We have an alert set to send an email each time a firewall failover occurs. The alert condition is pretty straightforward:

sourcetype=<Source Type> m=144 OR m=145

m=144 is "Primary firewall has transitioned to Active", while m=145 is "Secondary firewall has transitioned to Active". The "host" field tells us which firewall is involved. The alert is working as expected: we get one alert when the secondary becomes active and then another when the primary takes over again.

Is there a way to create a condition such that once Splunk detects the secondary is active (i.e. m=145), it sends one alert and then continues to do so at a set frequency (say, once every 2 hours) until it sees an m=144 event, i.e. the primary is active again, coming from the same firewall (host field)? It looks like I'll have to use some correlation command like transaction to achieve this.

Any ideas?

Thanks,

~ Abhi


Re: Is there a way to continue generating alert until a specific search condition is met?

Engager

If you add a

| search m=145

at the end of your search query, the alert will only fire when the secondary is active. Then, in your alert settings, turn on throttling and set the suppression period to 2 hours. That should do what you need.
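Putting that together with the original search (the <Source Type> placeholder is from the question), the full alert search would look something like:

```
sourcetype=<Source Type> (m=144 OR m=145)
| search m=145
```

One note: if several firewalls can fail over independently, you may also want to throttle per firewall by suppressing on the host field in the throttle settings, so an alert for one firewall does not suppress alerts for another.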


Re: Is there a way to continue generating alert until a specific search condition is met?

Esteemed Legend

Like this:

index=YouShouldAlwaysSpecifyAnIndex AND sourcetype=<Source Type> AND (m=144 OR m=145)
| appendpipe [ |inputlookup activeSecondary.csv | eval sourcetype="From lookup", m="145"  ]
| eventstats dc(m) AS mCount BY host
| appendpipe [ where m=145 AND mCount=1 | dedup host | table _time host | outputlookup activeSecondary.csv ]
| where (mCount>1 OR  m=145)

Then set your alert to trigger when Number of events is greater than 0.
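The lookup here acts as persistent state: activeSecondary.csv remembers which hosts currently have an active secondary, so the scheduled search keeps matching (and alerting) for those hosts even on runs with no new events, until an m=144 event from the same host clears it. A minimal Python sketch of that state-machine idea (hypothetical event dicts; not Splunk code):

```python
# Sketch of the lookup-driven alert logic described above.
# "state" plays the role of activeSecondary.csv: the set of hosts whose
# secondary firewall is currently active. Each call represents one
# scheduled run of the alert search over the events seen since last run.

def run_alert_cycle(events, state):
    """events: list of {"host": ..., "m": 144 or 145} seen this run.
    state: set of hosts remembered as secondary-active (the lookup).
    Returns (alert_hosts, new_state)."""
    new_state = set(state)
    for ev in events:
        if ev["m"] == 145:           # secondary took over: remember host
            new_state.add(ev["host"])
        elif ev["m"] == 144:         # primary is back: forget host
            new_state.discard(ev["host"])
    # Alert for every host still marked secondary-active, plus any host
    # that failed over during this run (so the first alert is immediate).
    alert_hosts = new_state | {ev["host"] for ev in events if ev["m"] == 145}
    return sorted(alert_hosts), new_state
```

Run on a schedule of every 2 hours, this fires on the failover run, keeps firing on every run while the host stays in the remembered set, and goes quiet once m=144 arrives for that host.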
