Yesterday afternoon I updated a number of files that had missing data, in a directory that Splunk's tailing processor was monitoring. Before updating the files, I ran a query for the specific source files and piped the results through delete.
After updating the files, the timestamps are not being read correctly, and every event is attributed to yesterday's re-read time:
5/3/11 12:52:25.000 PM INFO 2011-04-18 17:44
I thought I'd try to fix this by updating props.conf with:

[source::/A/B/c/dirwithfiles]
TIME_PREFIX = INFO\s
TIME_FORMAT = %Y-%m-%d %H:%M
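As an aside, TIME_FORMAT uses strptime-style conversion specifiers, so the format string can be sanity-checked outside Splunk before restarting and reindexing anything. A minimal Python sketch against the timestamp portion of the sample event above (note the easy mix-up: %M is minutes, %m is month):

```python
from datetime import datetime

# The timestamp portion that follows "INFO " in the sample event.
raw = "2011-04-18 17:44"

# Same conversion specifiers Splunk's TIME_FORMAT expects:
# %Y year, %m month, %d day, %H hour (24h), %M minute.
parsed = datetime.strptime(raw, "%Y-%m-%d %H:%M")
print(parsed)  # 2011-04-18 17:44:00

# Using %m (month) where %M (minute) belongs would raise a
# ValueError here, which is the kind of error that silently
# becomes a wrong timestamp inside Splunk.
```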
I believe that should be correct, but the data is still incorrectly timestamped when I query. I have the bad feeling (and am thus looking for confirmation here) that I have to remove and re-index everything for the props.conf change to take effect.
Yes?
Or is there another way, a better way, a Splunktastic way?
Timestamp parsing applies as Splunk indexes new data, so any changes you make to that in props.conf will not affect data that has already been indexed. So, in order to get your timestamps right, you will indeed need to reindex your data after making your changes to props.conf (and restarting Splunk in order to activate these changes).
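A sketch of that workflow from the Splunk CLI is below. The source path is the (truncated) one from the question, and the index name "badts" is an assumption; note that clean eventdata wipes the entire index, so on a shared index you would instead delete just the affected events with the delete command (as you already did) and let the monitor re-read the files.

```shell
# Confirm the new stanza is actually being picked up
# (path is the hypothetical one from the question):
splunk cmd btool props list 'source::/A/B/c/dirwithfiles' --debug

# Restart so the props.conf changes take effect:
splunk restart

# If the events live in their own dedicated index (assumed here
# to be "badts"), the bluntest option is to wipe it entirely and
# let the monitor re-read the files:
splunk stop
splunk clean eventdata -index badts
splunk start
```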
Thanks. I was afraid of that. I tried it on a few files and that seemed to be the result, but I was hoping to avoid the hassle and be told of a quicker way around it (and a way less likely to trip license issues, given the amount of data).
I should mention that the rest of each line, after the timestamp, contains the useful data I'm searching for; I just did not include it here.