Splunk Search

How to convert this string to a date?

nicocin
Path Finder

I'm trying to convert a string to a date.

The string looks like 2016-05-20T05:16:02.007+02:00

1 Solution

javiergn
Super Champion

Try this:

| stats count
| fields - count
| eval timestamp = "2016-05-20T05:16:02.007+02:00"
| eval timestamp_epoch = strptime(timestamp, "%Y-%m-%dT%H:%M:%S.%3N%z")
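For anyone who wants to sanity-check the format string outside Splunk, the same conversion can be sketched in Python (an illustration only; Python uses `%f` for fractional seconds where Splunk's eval `strptime` uses `%3N`):

```python
from datetime import datetime

# The same timestamp string from the question
s = "2016-05-20T05:16:02.007+02:00"

# %f parses the fractional seconds; %z parses the +02:00 offset
# (Python 3.7+ accepts the colon inside the offset)
dt = datetime.strptime(s, "%Y-%m-%dT%H:%M:%S.%f%z")

# Epoch seconds, matching what Splunk's strptime returns
epoch = dt.timestamp()
print(epoch)  # 1463714162.007
```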


nicocin
Path Finder

Hmm, I get something like this: "1463714162.007000"


javiergn
Super Champion

Yes, that's epoch time and that's what Splunk uses internally to represent dates and times.

http://www.epochconverter.com
https://en.wikipedia.org/wiki/Unix_time

What were you planning to do with that timestamp after that?
Now that it's in the right format, you can use strftime and plenty of other functions to work with it. For example:

| eval date = strftime(timestamp_epoch, "%Y-%m-%d")
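The reverse step can likewise be checked outside Splunk. A minimal Python sketch (the UTC time zone here is an assumption; Splunk renders times in the search-time time zone):

```python
from datetime import datetime, timezone

# Epoch value produced by the strptime step above
timestamp_epoch = 1463714162.007

# Format epoch seconds back into a date string, like Splunk's eval strftime.
# Rendering in UTC for a deterministic result.
date = datetime.fromtimestamp(timestamp_epoch, tz=timezone.utc).strftime("%Y-%m-%d")
print(date)  # 2016-05-20
```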

nicocin
Path Finder

Ahh ok, it's working great like this.

Thank you!
