I'm trying to create an alert that looks through a given list of indexes and triggers for each index showing zero results within a set timeframe. I'm trying with the following search:

| tstats count where index IN (index1, index2, index3, index4, index5) BY index
| where count=0

But this doesn't work: running the first line on its own only returns the indexes that are not empty, and nothing at all, not even count=0, for the empty ones. I also tried

| tstats count where index IN (index1, index2, index3, index4, index5) BY index
| fillnull value=0 count
| where count=0

but that doesn't work either. The problem is that if "index5", for example, has no results, "| tstats count..." doesn't return anything for it, not even a null result, so "| fillnull" has no "index5" row to fill.

I have seen other solutions use

| rest /services/data/indexes ...

and join or append the searches together, but since I'm on Splunk Cloud that fails with the error "Restricting results of the "rest" operator to the local instance because you do not have the "dispatch_rest_to_indexers" capability".

The only working solution I have so far is to create a separate alert for each index I want to monitor, with the search

| tstats count where index=<MY_INDEX>
| where count=0

but I would rather have a single alert with a list I can change when needed than multiple searches competing for a timeslot. I have also considered providing a lookup table with the list of indexes I want to monitor and comparing it against the results (rough sketch below), but that seems too cumbersome. Is there a way to trigger an alert for empty indexes from a single given list on Splunk Cloud?
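For reference, the lookup-based idea would be roughly along these lines. This is only a sketch: monitored_indexes.csv is a hypothetical lookup file with a single column named index listing the indexes to monitor, and inputlookup append=true adds those rows to the tstats output so that empty indexes end up with a zero count instead of disappearing.

| tstats count where index IN (index1, index2, index3, index4, index5) BY index
| inputlookup append=true monitored_indexes.csv ``` hypothetical lookup with one column: index ```
| fillnull value=0 count
| stats sum(count) AS count BY index
| where count=0

The index list then lives in two places, the IN clause and the lookup, which is the kind of extra maintenance that feels cumbersome to me.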