The first time format is
Fri Dec 21 11:17:30 2018
the other one is
2018-12-21T11:17:31.051061
I was wondering how I would set up line breaking for this, and also how I would configure the time format so that both timestamps are accepted?
By accident I ran into the same problem.
The brute force solution looks like:
| eval my_time1=strptime(genZeit, "%Y-%m-%dT%H:%M:%S%:z")
| eval my_time2=strptime(genZeit, "%Y-%m-%dT%H:%M:%S.%3N%:z")
| eval my_time=if(isnotnull(my_time1),my_time1,my_time2)
Try to convert the time with both of the possible time formats (be careful: my example does not reflect the time formats of the original question), then take the result that is not null.
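Adapted to the two formats from the original question, the same pattern could look roughly like this (just a sketch; raw_ts is a placeholder for whatever field actually holds the timestamp string):
| eval my_time1=strptime(raw_ts, "%a %b %d %H:%M:%S %Y")
| eval my_time2=strptime(raw_ts, "%Y-%m-%dT%H:%M:%S.%6N")
| eval my_time=if(isnotnull(my_time1),my_time1,my_time2)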
Although my problem is different from the OP's, there is another way to handle multiple date formats, e.g. by using coalesce with the date formats listed in descending order of probability:
| eval my_time=coalesce(
strptime(genZeit, "%Y-%m-%dT%H:%M:%S%:z"),
strptime(genZeit, "%Y-%m-%dT%H:%M:%S.%3N%:z"))
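Since coalesce returns the first non-null value, the order matters: a string that doesn't match a format yields null and the next one is tried, and if a value happens to match more than one format, the one listed first wins.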
Cool, works like a charm 😁
Gentlemen
Thanks for the follow-up about when to fix such issues.
As a pure user, I have no influence on everything that happens during indexing.
I can just take the data which is there and work with what I get.
Sure. As a user you can't directly change anything that's happening during data ingestion, but a properly maintained environment should allow for feedback on data quality. Badly ingested data is flawed data and often simply useless data. Data with a wrongly assigned timestamp is simply not searchable in the proper "space in time".
While it might "work", it's definitely a bad idea to handle the main event time this way. The _time field is the most important time field associated with an event and, very importantly, it's the basic field for initial event filtering. Just assigning "something" to it and then handling time at search time later is very unusual, confusing, and ineffective performance-wise.
If a single file has more than one timestamp format, then the developers should get a serious paddling and either split the events or pick one format and stick to it. Until that happens, you can force Splunk to look for both with a custom datetime.xml file:
https://www.splunk.com/blog/2014/04/23/its-that-time-again.html
Each unique format should be tied to a sourcetype. You create base configs that tell Splunk how to read the timestamp and break the events properly relative to the sourcetype. In theory, you write the sourcetype rules once for each log format and you tie new events to that sourcetype.
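As a rough illustration, base configs for the two formats from the question could look something like this props.conf sketch (the sourcetype names are made up, and TIME_PREFIX assumes the timestamp sits at the start of each line; adjust both to your data):
# props.conf (hypothetical sourcetype names)
[my_app:ctime]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
TIME_PREFIX = ^
TIME_FORMAT = %a %b %d %H:%M:%S %Y
MAX_TIMESTAMP_LOOKAHEAD = 25

[my_app:iso]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%6N
MAX_TIMESTAMP_LOOKAHEAD = 30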
Yeah, I know that. What I was wondering is whether there is a way to properly handle two different time formats located in one log file.
Well, yeah... If they are in the same log file then, assuming they are of the same type, they should be in the same format. If not, then you can route them to a different sourcetype.
Follow the documentation for sourcetype overrides. Set up timestamp parsing for both sourcetypes as per your needs. When pulling data from the index, pull both sourcetypes:
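A sketch of what that could look like, with made-up sourcetype names, an assumed index, and a regex that matches the ISO-style lines (the standard props.conf/transforms.conf override pattern from the docs; adjust everything to your environment):
# props.conf -- applied to the sourcetype the events originally come in as
[my_app:ctime]
TRANSFORMS-route_iso = force_iso_sourcetype

# transforms.conf
[force_iso_sourcetype]
REGEX = ^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::my_app:iso

Then at search time:
index=my_index (sourcetype="my_app:ctime" OR sourcetype="my_app:iso")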