
Time index incorrectly and need query to sort this out

Explorer

Hi Gurus,

I have data coming into Splunk in multiple columns, e.g.:

ID tracker created updated status hostname username
1 date&time date&time
2
3
4
5

My issue is that the way I am ingesting this data (via a cron job) pulls the most recently updated records first, so Splunk indexes the data by updated time each day instead of by created time.

So when I search this data, it searches by updated time, which does not give me the correct information. The two date fields are "updated" and "created"; how do I search on created rather than updated? I attempted to use "eval Created=strftime(_time, "%d/%m/%Y %I:%M:%S %p")", and though it ran, it gives sporadic results, which means it's not working.

Would really appreciate some assistance.


Re: Time index incorrectly and need query to sort this out

SplunkTrust

If your data is being indexed by updated time rather than created time, then your props.conf settings probably need to be changed. If you post the settings along with some sample events, we can help you correct them.
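For a CSV-style input, a props.conf sketch might look like the following (the sourcetype name is a placeholder and the format string is taken from later in this thread; adjust both to the actual data):

```
[my_csv_sourcetype]
INDEXED_EXTRACTIONS = csv
TIMESTAMP_FIELDS = created
TIME_FORMAT = %d/%m/%Y %I:%M:%S %p
```

TIMESTAMP_FIELDS tells Splunk which extracted column to use as the event timestamp, so events are indexed by created time rather than by the time they were pulled in.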

---
If this reply helps you, an upvote would be appreciated.

Re: Time index incorrectly and need query to sort this out

Contributor

I agree with richgalloway that you should be timestamping the data correctly at ingest time by leveraging props.conf. But until you correct that, it seems you would want to eval _time to the created date.


Re: Time index incorrectly and need query to sort this out

Explorer

Thanks a lot, but I don't want to change the way data comes into all of my Splunk instances just for this one .csv data stream; if I change props.conf, it sounds like it will change the way Splunk ingests data. So I've been using "eval Created=strftime(_time, "%d/%m/%Y %I:%M:%S %p")" to try to do this for me, but I am not convinced it's working. I am getting some of the data, and I am seeing updated times newer than created times, which can be possible. Thanks.


Re: Time index incorrectly and need query to sort this out

SplunkTrust

The strftime function creates a human-readable date-time string. That's fine for display in reports, but it is difficult to use in comparisons and calculations. Confusingly, you say events are indexed by update time, which would make _time the update time, yet you then assign that to created time.
As @colinmchugo suggested, try eval _time=created.
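The distinction, sketched in SPL (the field name created is assumed from this thread; the format string is the one from the original post):

```
| eval created_display=strftime(_time, "%d/%m/%Y %I:%M:%S %p")
| eval _time=strptime(created, "%d/%m/%Y %I:%M:%S %p")
```

The first line turns the epoch value in _time into a display string; the second parses the created string back into an epoch number that Splunk can sort and compare on.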

---
If this reply helps you, an upvote would be appreciated.

Re: Time index incorrectly and need query to sort this out

Explorer

Hi guys, sorry, I got swamped. Basically, the data is pulled in via a curl to the API, which makes _time the date and time the data was pulled, and I don't know how to change this right now.

So I am left with "dirty data". Thanks for the tips, but if I go and check something using the standard time range picker, it doesn't pick the correct time. Any ideas besides props.conf, or should I use a manually constructed time filter with an earliest/latest search?


Re: Time index incorrectly and need query to sort this out

Contributor

Hi colinmchugo, I think you have the time conversion backwards. Splunk uses _time for ordering time-series data. From what you've mentioned so far, _time is incorrect, and the timestamp you are looking to use is in your event data, labeled "Created". So in essence, what you want to use for your timestamp is "Created". With the statement you are currently using:

eval Created=strftime(_time, "%d/%m/%Y %I:%M:%S %p")

you are populating the Created field from _time, which is the timestamp Splunk generates when the data is captured. What I believe you need to do here is:

eval _time=strptime(Created, "%d/%m/%Y %I:%M:%S %p")

Keep in mind, however, that the time range picker filters on the _time field as indexed in the base search, not the _time recalculated afterwards. The workaround is not optimal, as you have to query either all time or a sufficiently large time range where you know the logs you want exist.
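A sketch of such a workaround search (the index, sourcetype, and 7-day window are placeholders; the format string is the one used earlier in the thread):

```
index=main sourcetype=my_csv earliest=0 latest=now
| eval _time=strptime(Created, "%d/%m/%Y %I:%M:%S %p")
| where _time >= relative_time(now(), "-7d@d")
| sort - _time
```

The earliest=0 forces the base search over all time, so no events are excluded by the picker before _time is recalculated; the where clause then does the real time filtering on the corrected timestamps.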
