I have several data sources, each with its own timestamp (different time zones, one format) due to geographic differences; however, according to the sourcetype time settings, they should all be indexed according to the enterprise's time zone.
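To be clear about what I mean by sourcetype time settings: each sourcetype has its time zone and timestamp format set in props.conf, roughly like the sketch below (the sourcetype name, time zone and format here are placeholders, not my exact values):

    # props.conf on the indexer / heavy forwarder (illustrative values only)
    [my_geo_sourcetype]
    TZ = Europe/Berlin
    TIME_FORMAT = %Y-%m-%d %H:%M:%S
    TIME_PREFIX = ^
    MAX_TIMESTAMP_LOOKAHEAD = 25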
That works like a charm most of the time, but every now and then Splunk decides to change the year of my events from 2020 to 2019, or makes some other time change. This does not happen to all indexes.
For example, I read _audit for fired alerts, which are then dashboarded correctly; however, the events themselves are nowhere to be found unless I change the search to 2019.
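The _audit search behind that dashboard is roughly like the following sketch (the saved search name is a placeholder, and the field names are the ones I see in the audit events):

    index=_audit action=alert_fired ss_name="My_Alert"
    | table _time, ss_name, sid, severity, trigger_time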
So far I cannot relate these situations to anything out of the ordinary. Any ideas?
I spent a bit more time trying to narrow down the actual problem and have some new findings. The year change actually happened only once, and it could have been caused by a Splunk restart, a restart of some of the machines running the forwarders, or some other event. However, I found out that the year change is not the real problem.
It seems that search plays some games with us when looking for events. Here is an explanation:
An event is logged and exists (known date, known hour).
The alert is triggered correctly.
The alert_fired event is logged correctly in _audit.
If the search for the actual event in the logs (main index) is performed using the "All time" option, the event is present in the results.
If the search is performed using any "Relative" option, the event is not present, despite the event being within the relative range.
If the search is performed using "Date range" or "Date and time range" with the "Before" sub-option, the event is there and it is fine.
If I use the aforementioned options with the "Since" or "Between" sub-options... nothing.
That happens for every type of event, and so far I do not see any connection to sourcetype or host.
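One check I have been running to see what the search is actually matching is to compare the parsed event time (_time) with the time Splunk indexed the event (_indextime), over "All time" so the affected events show up; a minimal sketch, assuming the default main index:

    index=main earliest=0
    | eval event_time = strftime(_time, "%Y-%m-%d %H:%M:%S")
    | eval index_time = strftime(_indextime, "%Y-%m-%d %H:%M:%S")
    | eval lag_seconds = _indextime - _time
    | table event_time, index_time, lag_seconds, host, sourcetype, source

If the relative and "Since/Between" pickers miss the events while "All time" finds them, a large positive lag_seconds (i.e. _time parsed far in the past, such as the wrong year) would explain it, since those pickers filter on _time.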