Splunk Search

Convert timestamp in event to specific timezone

leftrightleft
Explorer

Hey Splunk Gurus-

I'm attempting to calculate the duration between when an event was first identified (stored in the event field "alert.created_at") and the "_time" timestamp.

I'm able to calculate this timestamp difference using strptime('alert.created_at'), but the conversion of that time to epoch is relative to the viewer's timezone: the duration changes depending on how the Splunk UI timezone preference is configured.

The "_time" field is set to "current" in props.conf

Here's my current search:

 

index=* alert.tool.name=* action="fixed" 
| eval create_time=strptime('alert.created_at', "%Y-%m-%dT%H:%M:%SZ")  
| eval duration = _time - create_time 

 

 

Here's a sample of the log:

 

{
    "action": "fixed",
    "alert": {
        "number": 2,
        "created_at": "2021-11-22T23:49:19Z"
    }
}

 

 

When I execute this search while my UI preference is set to "GMT", the result is 1183959, which is the correct duration.  When I set that preference to "PST", the result is 1155159.  That number is off by exactly 8 hours (1183959 - 1155159 = 28800 seconds), which matches PST's UTC-8 offset.

Any suggestions on how to deal with this?  I'm fine with either a search-time solution or a config change in props.conf if that's best.

Thanks!  

1 Solution

PickleRick
SplunkTrust

It seems it is able to recognize it; your strptime just doesn't use the format code for the timezone.

Check out my example: I render a local timestamp (I'm in CET; you might be somewhere else), but instead of my local timezone offset I append a literal "Z". Then I reparse the string with %Z, which interprets the timezone properly as UTC.

| makeresults 
| eval time=strftime(_time, "%Y-%m-%d %H:%M:%SZ")
| eval _time=strptime(time, "%Y-%m-%d %H:%M:%S%Z")
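
If I'm reading the question's format string right, the same change drops straight into the original search: replace the literal Z with %Z so the ISO 8601 suffix is parsed as UTC instead of being treated as viewer-local time (sketched against the question's own field names):

index=* alert.tool.name=* action="fixed"
| eval create_time=strptime('alert.created_at', "%Y-%m-%dT%H:%M:%S%Z")
| eval duration = _time - create_time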

 


johnhuang
Motivator

Try this:

index=* alert.tool.name=* action="fixed"
| eval create_time=strptime('alert.created_at'."+00","%Y-%m-%dT%H:%M:%SZ%z")
| eval duration=_time-create_time

 

 


PickleRick
SplunkTrust

Unfortunately (or not, depending on how you look at it), there is no way to set a timezone different from your user's configured timezone for a single search. That's one.

And two: if there is no timezone information within the parsed time string, it is parsed according to your local timezone, which makes sense.

So if you want to make sure the time string is parsed according to a particular timezone, make sure the timezone is included in the string and format your timespec string accordingly.

I'm not sure (I'd have to check, but I don't have access to my Splunk at the moment) whether "Z" is recognized or whether you have to change it on the fly to UTC or GMT.
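
Assuming %Z does accept the trailing "Z" (worth testing on your instance), a quick way to see the skew is to parse the question's sample timestamp both ways; the field names here are just for illustration. For a viewer in a non-UTC timezone, naive and aware will differ by that timezone's UTC offset (8 hours for PST):

| makeresults
| eval raw="2021-11-22T23:49:19Z"
| eval naive=strptime(raw, "%Y-%m-%dT%H:%M:%SZ")
| eval aware=strptime(raw, "%Y-%m-%dT%H:%M:%S%Z")
| eval skew_hours=(naive - aware) / 3600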


leftrightleft
Explorer

Yeah, that's kinda my hang-up.  The timestamp contains a "Z", which is part of the ISO 8601 definition.  I was really hoping strptime() would be able to recognize it.



leftrightleft
Explorer

It was as simple as getting that "%" in place.  Thanks!
