Improving on the answer above, here is a version that DOES account for daylight saving time, as well as times with microseconds:
| makeresults | eval t=strptime("2017-01-01 14:00", "%F %H:%M"), from_tz="UTC", to_tz="Australia/Melbourne", from_t=strptime(strftime(t, "%c.%6N " . from_tz), "%c.%6N %Z"), to_t=strptime(strftime(t, "%c.%6N " . to_tz), "%c.%6N %Z"), offset=round((from_t-to_t)/60/60), converted=strftime(t + (from_t-to_t), "%c")
It can be simplified for brevity; I've broken it into separate variables to make it easier to follow. The offset variable is unused, but it is there so you can experiment by changing dates and timezones to see that the offset does actually change depending on whether daylight saving is in effect for the date. Note that if you put in an explicit DST timezone name like 'AEDT', it will ALWAYS be treated as daylight saving time, regardless of the date. But region timezone names (like Australia/Melbourne used in the example), which are sometimes DST and sometimes not, will change offset according to the input date. The list of timezone names appears to be the standard list from Java.
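The same trick can be sketched outside Splunk. Here is a hypothetical Python equivalent (using the standard zoneinfo module; the function name is mine, not from the answer) showing that the from_t - to_t difference flips with daylight saving, just as the offset variable does in the SPL above:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def offset_hours(naive_str, from_tz, to_tz):
    """Hours to add to a wall-clock time in from_tz to get the wall clock in to_tz."""
    t = datetime.strptime(naive_str, "%Y-%m-%d %H:%M")
    # Tag the same wall-clock digits with each zone; the difference between
    # the two absolute instants is the conversion offset (mirrors from_t - to_t).
    from_t = t.replace(tzinfo=ZoneInfo(from_tz))
    to_t = t.replace(tzinfo=ZoneInfo(to_tz))
    return (from_t - to_t).total_seconds() / 3600

# January is summer (AEDT, UTC+11) in Melbourne; July is winter (AEST, UTC+10).
print(offset_hours("2017-01-01 14:00", "UTC", "Australia/Melbourne"))  # 11.0
print(offset_hours("2017-07-01 14:00", "UTC", "Australia/Melbourne"))  # 10.0
```

Changing the input date across the DST boundary is enough to see the offset move, which is the behaviour the answer describes for region names like Australia/Melbourne.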
This is a perfect solution. I modified it slightly for my use case, which is to ensure that the user's timezone matches the timezone of the source data. So the from_tz is done this way:
Thanks, this seems to be working great. However, there is one issue with daylight saving time: when converting from EDT to the Asia/Shanghai timezone, I get an offset of 12 when it should be 13, depending on the time of year. Any idea how to fix this?
Inspired by the excellent answers provided here, it is possible to convert to an arbitrary timezone, provided that the timezone always has a fixed UTC offset (i.e. does not observe daylight saving time).
eval _timezone = "AEST" | eval _time_AEST = _time - (strptime("2000-01-01 +00:00", "%F %:z") - strptime("2000-01-01 " . strftime(_time, "%:z"), "%F %Z")) + (strptime("2000-01-01 +00:00", "%F %:z") - strptime("2000-01-01 " . _timezone, "%F %Z")) | eval time_in_AEST = strftime(_time_AEST, "%F %T " . _timezone)
The above works by first subtracting the UTC offset of the timezone configured in the user's preferences, then adding the UTC offset of the timezone you specify (in this example AEST, but it could be -05:30 or any valid Splunk timezone identifier). The UTC offset of each of the two timezones is calculated by taking a reference date (I chose 2000-01-01 arbitrarily) and parsing it twice, first in the UTC timezone and then in the queried timezone; the difference of the two results yields the timezone offset.
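The reference-date trick can be illustrated in Python (a sketch with a hypothetical helper name, not the author's code): parse the same reference date once as UTC and once with the target offset, and the difference of the two epoch values is the zone's UTC offset.

```python
from datetime import datetime

def utc_offset_seconds(offset_str):
    """UTC offset of a fixed-offset zone, via the reference-date trick:
    parse an arbitrary reference date (2000-01-01) once as UTC and once
    in the target zone; the difference of the two instants is the offset."""
    ref_utc = datetime.strptime("2000-01-01 +0000", "%Y-%m-%d %z")
    ref_tz = datetime.strptime("2000-01-01 " + offset_str, "%Y-%m-%d %z")
    return (ref_utc - ref_tz).total_seconds()

print(utc_offset_seconds("+1000") / 3600)  # 10.0 (e.g. AEST)
print(utc_offset_seconds("-0530") / 3600)  # -5.5
```

Subtracting the user's offset and adding the target's offset, as the SPL above does, then moves an epoch value between the two fixed-offset zones.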
A couple of things to remember:
- Splunk treats the _time_AEST variable as seconds since epoch (1970-01-01 00:00:00 UTC), so technically Splunk is interpreting it as a different 'real world' time. If you attempt to print the timezone of the date, it will incorrectly report the user's configured timezone. Instead, whenever printing this date, always include the timezone manually and don't use the %Z timezone formats (as is done in the last line of the example).
- (Note: quotes are stripped when passed as arguments to a macro.)
- For our use case, we upload the 'real world' time as seconds since epoch (assigned to the _time property), and additionally upload a timestamp field formatted as a date in local time, which Splunk interprets as a string. This gives Splunk enough information to assign the correct time to the event, but also allows us to run queries against the local time where the data is sourced from.
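As a sketch of that ingestion approach (field names are illustrative; the original post does not show its upload code), an event payload carrying both the epoch time and a local-time string could be built like this:

```python
import json
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Hypothetical event payload: epoch seconds to drive Splunk's _time, plus a
# local-time string field for querying in the source's own timezone.
local_tz = ZoneInfo("Australia/Melbourne")
now = datetime.now(timezone.utc)
event = {
    "time": now.timestamp(),                                        # epoch seconds
    "timestamp": now.astimezone(local_tz).strftime("%Y-%m-%d %H:%M:%S"),
    "message": "example event",
}
print(json.dumps(event))
```

The epoch value fixes the absolute instant, while the formatted string preserves the source-local wall-clock time as plain text for local-time queries.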
The two functions are strptime and strftime. Here is an example to convert _time (which is stored as UTC) to EST within a search:
... | eval EST_time=strptime(strftime(_time,"%m/%d/%Y %H:%M:%S EST"),"%m/%d/%Y %H:%M:%S %Z") | eval date_month=strftime(EST_time,"%m") | eval date_mday=strftime(EST_time,"%d") | eval date_hour=strftime(EST_time,"%H")| stats count by date_month,date_mday,date_hour | sort count desc
I downvoted this post because it is not correct. This query says "pretend this UTC date is actually an EST date", which adds 5 hours (rather than subtracting them). What you want is to ask "given this UTC date, what would it be in EST?", and that is not possible to do simply in Splunk.
Downvoting users' answers/comments in this forum should be reserved for suggestions that could potentially do harm to someone's environment. People are just trying to help each other out and learn. Simply commenting on @woodcock's answer would have sufficed. Please review how voting etiquette works on Splunk Answers for providing positive, constructive engagement within the community:
Hi @rock7177 and @woodcock, thank you for pointing out that my comment was not fully explained. Apologies for incorrectly using down vote, it will not happen again (unfortunately I can't remove the vote now - "Sorry, but you can't cancel a vote after more than 1 day").
There are two problems with the provided answer: it converts not from UTC to EST but from EST to UTC (the wrong direction), and it does not take the timezone configured in the user's preferences into account. An example follows.
index=* | head 1 | eval time_in = strptime("2016-10-17 00:00:00 UTC", "%F %T %Z") | eval time_adjusted = strptime(strftime(time_in,"%F %T EST"),"%F %T %Z") | eval time_in_printed = strftime(time_in, "%F %T %Z") | eval time_adjusted_printed = strftime(time_adjusted, "%F %T EST") | table time_in_printed, time_adjusted_printed
Assuming my user preferences are set to UTC timezone, then I get the following results:
time_in_printed          time_adjusted_printed
2016-10-17 00:00:00 UTC  2016-10-17 05:00:00 EST
This is incorrect; the correct answer would be 2016-10-16 19:00:00 EST (see https://savvytime.com/converter/utc-to-est/00-00). The problem is that the query replaces the timezone information (+00:00) with EST (-05:00) rather than converting between them. However, if my timezone is set to +10:00 Brisbane:
time_in_printed          time_adjusted_printed
2016-10-17 10:00:00 AEST 2016-10-18 01:00:00 EST
This is a different, but also incorrect, answer, because the conversion method does not account for the timezone adjustment introduced by setting the user preferences to anything other than UTC.
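The relabel-versus-convert distinction described above can be demonstrated in Python (a sketch assuming a fixed EST offset of UTC-5, matching the savvytime comparison):

```python
from datetime import datetime, timezone, timedelta

EST = timezone(timedelta(hours=-5), "EST")
t = datetime(2016, 10, 17, 0, 0, tzinfo=timezone.utc)

# Relabelling: keep the wall-clock digits, swap the zone tag. This is what
# the criticized query does, and it moves the actual instant by +5 hours.
relabelled = t.replace(tzinfo=EST)
# Converting: keep the instant, recompute the wall clock for the new zone.
converted = t.astimezone(EST)

print(relabelled.strftime("%Y-%m-%d %H:%M:%S %Z"))  # 2016-10-17 00:00:00 EST
print(converted.strftime("%Y-%m-%d %H:%M:%S %Z"))   # 2016-10-16 19:00:00 EST
```

Only the second result matches the correct conversion given above; the first reproduces the 5-hours-in-the-wrong-direction error.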