I'm trying to set up an alert for daily license usage which would notify me when it reaches a certain threshold.
| rest splunk_server=shaklee-splunk-enterprise /services/licenser/pools
| rename title AS Pool
| search
    [ rest splunk_server=shaklee-splunk-enterprise /services/licenser/groups
      | search is_active=1
      | eval stack_id=stack_ids
      | fields stack_id ]
| eval quota=if(isnull(effective_quota), quota, effective_quota)
| eval percentage=round(used_bytes/quota*100, 2)
| where percentage >= 8
| fields percentage
This is my query for detecting when the pool reaches 8%. The search works and returns the percentage for me. The problem is that the alert will not trigger when I set the cron schedule to scan every second and trigger when the number of results is greater than 0.
UPDATE: The reason this is not triggering is that the search above does not return any events, only statistics. I will need to update my alert to trigger on finding this statistic. Does anyone know how to define that?
While configuring "Trigger Conditions", choose the "Custom" option and specify something like:
search percentage > 8
This condition is evaluated against the results of the base search, so you don't need the where clause in your base search.
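Putting that together, the base search could look like the sketch below (same REST endpoints as in the question, with the where clause removed so the custom trigger condition does the threshold check; field choices here are an assumption, adjust as needed):

```spl
| rest splunk_server=shaklee-splunk-enterprise /services/licenser/pools
| rename title AS Pool
| search
    [ rest splunk_server=shaklee-splunk-enterprise /services/licenser/groups
      | search is_active=1
      | eval stack_id=stack_ids
      | fields stack_id ]
| eval quota=if(isnull(effective_quota), quota, effective_quota)
| eval percentage=round(used_bytes/quota*100, 2)
| fields Pool percentage
```

Then set the alert's trigger condition to Custom with "search percentage > 8". Because the custom condition is a search run over these result rows, it fires on the statistics table even though the base search produces no events.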
On a side note: it is not efficient to configure an alert to run every second. There is a chance these saved searches will be skipped if the search head is busy. See if you can extend your cron schedule so the alert runs less frequently than every second.
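Note also that Splunk's cron_schedule uses standard five-field cron syntax, whose finest granularity is one minute, so an every-second schedule is not expressible anyway. A sketch of a savedsearches.conf stanza running every 15 minutes (the stanza name is hypothetical, and I'm assuming the custom condition is stored via alert_condition):

```
# savedsearches.conf -- stanza name is hypothetical
[License pool usage alert]
enableSched = 1
cron_schedule = */15 * * * *
alert_condition = search percentage > 8
```

License usage figures only change as data is indexed, so a 15-minute (or even hourly) schedule should still catch the threshold crossing in good time.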