Some very strange behaviour has occurred. We have defined a saved search whose results are stored in its own summary index. Below is the saved search stanza we have defined:
[Unique GroupId Type]
action.email.inline = 1
action.summary_index = 1
action.summary_index._name = summary_index
alert.severity = 2
alert.suppress = 1
alert.suppress.period = 1h
alert.track = 1
# Search, run this daily at 5:00 am
cron_schedule = 0 5 * * *
description = <description>
dispatch.earliest_time = -1d@d
dispatch.latest_time = now
enableSched = 1
realtime_schedule = 0
search = index="index_data" GroupId!="null" | fields GroupId | stats count by GroupId
The data we have goes back 140 days. By changing dispatch.earliest_time = -140d@d and restarting Splunk, the summary index gets populated with the data successfully.
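For clarity, the only change made for the backfill was the dispatch window in the same stanza; the rest of the configuration stayed as shown above (this is a sketch of the temporary settings, not a verbatim copy of the stanza used):
# Temporary backfill window, reverted to -1d@d afterwards
dispatch.earliest_time = -140d@d
dispatch.latest_time = now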
Now here's the strange part: after setting the configuration parameter back to dispatch.earliest_time = -1d@d and restarting Splunk, all the data disappears. The data that was stored there originally is gone. Keep in mind that we have no current data and none from yesterday, so we do not expect any new results, but at the very least the saved search should have kept the old data that was already in the summary index. All my other saved searches are set up in much the same way and are working fine. I can't find the reason why the data is being removed.
Any ideas?
Apparently the issue was with the search itself: you need to include the _time field in the results. In my case I was omitting it from the base search, which is why the data was being removed from the summary index.
All good now
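For reference, one way to carry _time through into the summarized results is to bin events by day and include _time in the stats grouping. This is a sketch based on the original search above, not the exact line that was used (the daily span is an assumption matching the -1d@d schedule):
# Sketch of a corrected search that keeps _time in the summary events
search = index="index_data" GroupId!="null" | bin _time span=1d | stats count by _time, GroupId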