I'm trying to use splunk train to learn my timestamp formats so I can put them in a datetime.xml file, but it won't even try. My timestamps look like 20180529132292003-0700 and 20180529132292-0700, and all I get is:
Skipping unpromissing line 1.
Skipping unpromissing line 2.
Why try training at all instead of just specifying the format explicitly with TIME_FORMAT?
As also discussed in your other question here: https://answers.splunk.com/answers/660301/why-arent-timestamps-being-recognized-consistently.html
PS: 20180529132292003-0700 is a strange timestamp. Based on your previous explanation, this has 92 as the seconds, which cannot be correct...
That was a typo. I changed it and it still didn't work. I'm trying to train because the data contains timestamps in two different formats, and multiple posts made it sound like using datetime.xml would be "easy". I can't change the timestamp format in the data; it's an international standard for this type of data.
My guess would be that, because of the lack of any separators, it doesn't see a way to extract anything meaningful out of it. For a human the timestamp is rather easy to read, but not for software. I'd either change the timestamps to contain separators, or do this manually, using either TIME_FORMAT or a hand-built datetime.xml.
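For the manual route, here is a minimal props.conf sketch. The sourcetype name my_data is just a placeholder, and the format assumes the three digits before the offset are milliseconds:

```ini
# props.conf -- hypothetical sourcetype, adjust the stanza name to your data
[my_data]
# yyyymmddHHMMSS + 3-digit subseconds + numeric timezone offset
TIME_FORMAT = %Y%m%d%H%M%S%3N%z
# timestamp sits at the very start of the event
TIME_PREFIX = ^
# 22 characters covers "20180529132292003-0700"
MAX_TIMESTAMP_LOOKAHEAD = 22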
However, I tried building a timestamp extraction for this, and the format is pretty strange. Given 20180529132292003-0700, I'd split it like 2018-05-29 13:22, but after that, what is 92003 supposed to mean? Is it 0.92003 minutes? Or is this just a bad example?
You see, if a human has difficulty getting a proper timestamp out of it... how should Splunk manage it? 😉
Hope that helps - if it does I'd be happy if you would upvote/accept this answer, so others could profit from it. 🙂
Okay, if subseconds are always present, you could simply use
TIME_FORMAT = %Y%m%d%H%M%S%3N%z. Splunk actually recognizes your timestamp without any additional training, but for some reason ignores the timezone information...
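As a quick sanity check outside Splunk, the same digit layout parses cleanly with Python's strptime. Python has no %3N, so %f (1-6 subsecond digits) stands in for the millisecond part, and I've used 52 for the seconds since the 92 above was a typo:

```python
from datetime import datetime

# With subseconds: yyyymmddHHMMSS + 3-digit milliseconds + offset
with_subsec = datetime.strptime("20180529132252003-0700", "%Y%m%d%H%M%S%f%z")
# Without subseconds: yyyymmddHHMMSS + offset
no_subsec = datetime.strptime("20180529132252-0700", "%Y%m%d%H%M%S%z")

print(with_subsec.isoformat())  # 2018-05-29T13:22:52.003000-07:00
print(no_subsec.isoformat())    # 2018-05-29T13:22:52-07:00
```

If Python can unambiguously walk the fixed-width digit runs, Splunk's strptime-based TIME_FORMAT should be able to as well, which matches what you're seeing apart from the dropped timezone.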