Getting Data In

How would I read in a log entry with uncommon time formats?

dskillman
Splunk Employee

Log entries have timestamps with Taiwan years. The Taiwan (Minguo) year is the Gregorian year minus 1911, so this year is 99. By default Splunk reads the timestamp as the year 1999 and the events show up as old data:

990413 10:14:25  = April 13, 2010 10:14:25

Is this something I can use datetime.xml for? Maybe an offset?

1 Solution

gkanapathy
Splunk Employee

I don't know of a way to have it read that. Splunk uses strptime, plus a few additions (like %Z, %3N, and I think it might be able to pick up hexadecimal epoch time), but I am not aware of a way to offset dates or times at index time.
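As an illustration of those additions (the sourcetype name here is made up), a TIME_FORMAT can combine standard strptime codes with Splunk's %3N subsecond and %Z timezone extensions:

[my_sourcetype]
TIME_FORMAT = %Y-%m-%d %H:%M:%S.%3N %Z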

jrodman
Splunk Employee

Support for offsets (or Taiwanese years) would be an enhancement request. For this case you might be able to get away with a strptime format that ignores the year, with a TIME_PREFIX that skips past it (be sure your regex doesn't fail next year when the year rolls over to 100); Splunk should then default to the current year. Untested; see the sketch below.
Four-digit years are highly recommended. It sounds like Taiwan will go through this learning experience next year.
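A minimal props.conf sketch of that workaround, with an assumed sourcetype name and my own regex; as noted above, this is untested:

[taiwan_app_logs]
# Skip past the 2- or 3-digit Minguo year; the lookahead leaves the
# 4-digit month/day intact whether the year is 99 or 100.
TIME_PREFIX = ^\d+(?=\d{4}\s)
# Parse only month, day, and time-of-day; the missing year should
# then default to the current year.
TIME_FORMAT = %m%d %H:%M:%S
# "0413 10:14:25" is 13 characters after the prefix.
MAX_TIMESTAMP_LOOKAHEAD = 13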

dskillman
Splunk Employee

It can definitely pick up hex epoch time.
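For example (my own arithmetic, not from the thread), the question's timestamp read as UTC converts as:

4bc44401 = 1271153665 = April 13, 2010 10:14:25 UTC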
