Alerting

Can you help me figure out why my alert isn't being triggered?

New Member

Hi everyone,

I'm trying to set up an alert for daily license usage that notifies me when usage reaches a certain threshold.

| rest splunk_server=shaklee-splunk-enterprise /services/licenser/pools
| rename title AS Pool
| search [rest splunk_server=shaklee-splunk-enterprise /services/licenser/groups | search is_active=1 | eval stack_id=stack_ids | fields stack_id]
| eval quota=if(isnull(effective_quota), quota, effective_quota)
| eval percentage=round(used_bytes/quota*100, 2)
| where percentage >= 8
| fields percentage

This is my query for alerting when the pool reaches 8%. The search works and returns the percentage for me. The problem is that the alert will not trigger when I set the cron schedule to run every second and the trigger condition to fire when the number of results is greater than 0.

Any ideas?

UPDATE: The reason this is not triggering is that the search above does not return any events, only statistics. I will need to update my alert to trigger on that statistic. Does anyone know how to define that?

Thanks,
Ryan


Motivator

Hi,

While configuring "Trigger Conditions", choose the "Custom" option and specify a condition like `search percentage > 8`. This is evaluated against the results of the base search, so you don't need the `where` clause in your base search.
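If you prefer to manage this outside the UI, the custom trigger condition corresponds to the `alert_condition` attribute in savedsearches.conf. A minimal sketch, assuming a hypothetical stanza name and an example schedule (adjust both to your environment):

```conf
# savedsearches.conf -- stanza name is an assumption, use your own
[License pool usage above 8 percent]
enableSched = 1
# example schedule: every 15 minutes (see the note on cron frequency below)
cron_schedule = */15 * * * *
search = | rest splunk_server=shaklee-splunk-enterprise /services/licenser/pools | rename title AS Pool | search [rest splunk_server=shaklee-splunk-enterprise /services/licenser/groups | search is_active=1 | eval stack_id=stack_ids | fields stack_id] | eval quota=if(isnull(effective_quota),quota,effective_quota) | eval percentage=round(used_bytes/quota*100,2) | fields percentage
# custom trigger condition, run against the rows the base search returns;
# when alert_condition is set, the counttype/relation/quantity attributes are ignored
alert_condition = search percentage > 8
```

Note that the `where percentage >= 8` filter is dropped from the base search here, since the `alert_condition` does that filtering.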

On a side note: it is not efficient to configure an alert to run every second, and cron schedules only have one-minute granularity in any case. There's a chance these saved searches will be skipped if the search head is busy. See if you can relax your cron schedule so the alert runs less frequently.
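For reference, a Splunk `cron_schedule` uses the standard five-field cron format (minute, hour, day of month, month, day of week), so the tightest interval you can express is once per minute. A few example schedules (illustrative values only):

```conf
cron_schedule = */5 * * * *    # every 5 minutes
cron_schedule = 0 * * * *      # at the top of every hour
cron_schedule = 0 6 * * 1-5    # 06:00 on weekdays
```

For a license-usage check like this, something in the range of every 5 to 15 minutes is usually more than enough.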


SplunkTrust

Hi @rung8,

If you are getting only 1 event in the output, then you need to set the trigger condition to greater than 0, not greater than 1.


New Member

My mistake, I had already tried that as well when I realized this earlier. It still didn't trigger.
