Hello everyone!
In my company, we have Splunk (version 6.0) recording log information about data sent by remote devices (surveillance cameras) to our server.
Both our servers and the cameras are configured to send and record data with UTC timestamps. All the times shown in the log messages (the ones that are recorded by Splunk) are in UTC. Splunk's _time is UTC... Everything UTC! Yay!
BUT... when I perform a search, the times are converted to US/Eastern, which is MY timezone. Is there a way to force (through the Search string, not altering Splunk's configuration files or the user settings) the times to be displayed in UTC?
Let me show you an example:
Let's say the log message shows something like:
hostname=server01 server_time=2015-03-27_16:19:00 Time mismatch error: camera_time=1970-01-01_11:00:00 sent by camera_id='123foo' is out of date.
I know for a fact that both server_time and camera_time in the message above are in UTC. But when I run a search to show a table with all the cameras that have outdated timestamps, those times are converted to my timezone (US/Eastern).
hostname=server* earliest=-24h "Time mismatch error" | sort -_time | dedup 1 camera_id | eval server_time=strptime(server_time, "%Y-%m-%d_%H:%M:%S") | eval camera_time=strptime(camera_time, "%Y-%m-%d_%H:%M:%S") | eval server_time=strftime(server_time,"%Y-%m-%d %T %Z") | eval camera_time=strftime(camera_time,"%Y-%m-%d %T %Z") | table camera_id server_time camera_time
Shows:
+--------------+-------------------------+-------------------------+
| camera_id | server_time | camera_time |
+--------------+-------------------------+-------------------------+
| 123foo | 2015-03-27 12:19:00 EDT | 1970-01-01 06:00:00 EST |
+--------------+-------------------------+-------------------------+
I'd like to keep the two time values in UTC, if possible, without applying the -5:00 (or -4:00, now that the East Coast is on Daylight Saving Time) correction, through... something in the search string, so my table would look something like:
+--------------+-------------------------+-------------------------+
| camera_id | server_time | camera_time |
+--------------+-------------------------+-------------------------+
| 123foo | 2015-03-27 16:19:00 UTC | 1970-01-01 11:00:00 UTC |
+--------------+-------------------------+-------------------------+
The most "desperate" thing I've tried is "fooling" strptime
to "artifically" stick a UTC string in it:
eval camera_time=strptime(strftime(strptime(camera_time, "%Y-%m-%d_%H:%M:%S"), "%Y-%m-%d_%H:%M:%S UTC") "%Y-%m-%d_%H:%M:%S %Z")
Read as: take the camera_time, make a date out of it using strptime, then convert that back to a string, this time ending it with 'UTC', and make a date out of that string again.
Didn't work. I still see Eastern in the table.
As I mentioned before, I saw this other thread (http://answers.splunk.com/answers/41585/display-time-in-utc.html) in which the only (and accepted) answer talks about changing the user settings, but I'd like to avoid that. I want to show the times in UTC only for this particular search. Is that possible? It seems like something that should be very doable, since all of Splunk's times are UTC, but I haven't been able to figure out how.
Thank you in advance.
Splunk stores an event's time as an epoch time value, i.e. as the number of seconds since 1/1/1970, and no timezone information is stored with it at all. Before that, as the event is indexed, when a string-formatted time is encountered in the raw data, Splunk of course relies on its configuration to tell it which timezone it should interpret that string in, before it converts it to an epoch time value.
Then, much later, when the event is displayed in the Splunk UI, Splunk will at that moment convert the _time value from epoch time (a big number of seconds) to a string-formatted time. Here, of course, it needs to pick a timezone again, and what it picks is the timezone of the search head.
I think it's worth noting that UTC can sometimes be misinterpreted as a synonym for "epoch time", which it is not. UTC is a timezone, basically GMT with no daylight saving time, ever. Sometimes you'll also come across the idea that "epoch time is in UTC", which is nonsensical because an epoch time is just a number of seconds.
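A quick way to see that distinction for yourself is to copy _time into an ordinary field, which exposes the raw number it holds (a minimal sketch, not from the original answer; the field names are just illustrative):
<your search> | head 1 | eval epoch_value=_time | eval displayed_local=strftime(_time, "%Y-%m-%d %H:%M:%S %Z") | table _time epoch_value displayed_local
In the results table, _time is rendered in your timezone while epoch_value shows the same instant as a bare number of seconds.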
Anyway, it's not uncommon for a whole Splunk deployment to have everything, including the search heads, living in the UTC timezone. In my experience this is extremely confusing for many users, but it does work as advertised; all displayed times will be in the UTC (i.e. GMT) timezone.
And if you just want this to work for this one search, then knowing the above, you can see that the kind of hack you've already discovered is pretty much what you need. Doing it automatically for that one search, and doing it reliably throughout the year as the DST offset changes, is the challenge.
What you could do, at least for timezones west of GMT (out to the date line), is use search language like this, possibly hiding it in a macro called something like "utc_timezone_hack":
<your search > | eval hourAndMinuteOffset=split(strftime(_time, "%k:%M"),":") | eval hourOffset=mvindex(hourAndMinuteOffset,0) | eval minuteOffset=mvindex(hourAndMinuteOffset,1) | eval hourOffset=24-hourOffset | eval offsetSeconds=hourOffset*60 + minuteOffset | eval _time=_time-offsetSeconds
What that search language does is a bit of a hack, but it works out the number of seconds by which your timezone differs from UTC and manually adds them to your events' _time values, so the search head's local rendering ends up showing the UTC wall-clock time. For others reading this whose locations are east of GMT, you'd want to subtract offsetSeconds in that last eval instead of adding it.
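If you do wrap it in a macro, a macros.conf stanza along these lines might do it (a sketch only, untested; the name utc_timezone_hack is just the one suggested above):
[utc_timezone_hack]
definition = eval hourAndMinuteOffset=split(strftime(_time, "%:z"), ":") | eval hourOffset=abs(tonumber(mvindex(hourAndMinuteOffset, 0))) | eval minuteOffset=tonumber(mvindex(hourAndMinuteOffset, 1)) | eval offsetSeconds=hourOffset*3600 + minuteOffset*60 | eval _time=_time+offsetSeconds
iseval = 0
You would then invoke it as <your search> | `utc_timezone_hack` before your table command.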
With the help of the excellent answers from @sideview and @jmccr78, I came up with what I believe to be an improved solution (simplified and robust).
eval _time_UTC = _time
- (strptime("2000-01-01 +00:00", "%F %:z") - strptime("2000-01-01 " . strftime(_time, "%:z"), "%F %Z"))
| eval time_in_UTC = strftime(_time_UTC, "%F %T UTC")
This works by computing the UTC offset of the timezone configured in the user preferences and subtracting it from the time (similar to the other answers). However, the timezone offset is calculated simply by taking a reference date (I chose 2000-01-01 arbitrarily) and parsing it twice, first with the UTC offset and then with the user's configured timezone offset; the difference between the two yields the timezone offset.
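To see what the expression is actually computing, you can pull the offset out into its own field (a sketch reusing the exact strptime expressions from above; the field names utc_offset_seconds and time_in_UTC are just illustrative):
<your search> | head 1 | eval utc_offset_seconds = strptime("2000-01-01 +00:00", "%F %:z") - strptime("2000-01-01 " . strftime(_time, "%:z"), "%F %Z") | eval time_in_UTC = strftime(_time - utc_offset_seconds, "%F %T UTC") | table _time utc_offset_seconds time_in_UTC
For a user in US/Eastern during DST, utc_offset_seconds should come out as -14400, so _time - utc_offset_seconds shifts the value forward four hours and the local rendering then shows the UTC wall-clock time.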
This method can easily be extended to support converting to an arbitrary timezone.
eval _timezone = "AEST"
| eval _time_AEST = _time
- (strptime("2000-01-01 +00:00", "%F %:z") - strptime("2000-01-01 " . strftime(_time, "%:z"), "%F %Z"))
+ (strptime("2000-01-01 +00:00", "%F %:z") - strptime("2000-01-01 " . _timezone, "%F %Z"))
| eval time_in_AEST = strftime(_time_AEST, "%F %T " . _timezone)
This works by first subtracting the UTC offset of the timezone configured in the user's preferences, then adding the UTC offset of the timezone you specify (in this example AEST, but it could be -05:30 or any valid Splunk timezone identifier).
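So, swapping in a fixed numeric offset instead of a named zone might look like this (a sketch; "-05:30" is just the example offset mentioned above, the field names _time_offset and time_in_offset are illustrative, and the expression is otherwise unchanged):
eval _timezone = "-05:30"
| eval _time_offset = _time
- (strptime("2000-01-01 +00:00", "%F %:z") - strptime("2000-01-01 " . strftime(_time, "%:z"), "%F %Z"))
+ (strptime("2000-01-01 +00:00", "%F %:z") - strptime("2000-01-01 " . _timezone, "%F %Z"))
| eval time_in_offset = strftime(_time_offset, "%F %T " . _timezone)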
A couple of things to remember:
- Splunk still treats the _time_AEST variable as seconds since the epoch (1970-01-01 00:00:00 UTC), so technically Splunk is interpreting it as a different 'real world' time. If you attempt to print the timezone of this value it will incorrectly report the user's configured timezone, so whenever printing it always include the timezone manually and don't use the %Z timezone formats (as is done in the last line of the example).
- If you wrap this in a macro, be aware that arguments may have their quotes stripped when passed to the macro.
- None of this changes the event's actual time (the _time property). For our use case, we upload the 'real world' time as seconds since the epoch (assigned to the _time property), and additionally upload a timestamp formatted as a date in local time, which Splunk interprets as a string. This gives Splunk enough information to assign the correct time to the event, but also allows us to run queries against the local time where the data is sourced from.

The problem with this solution is that the difference between timezones is not static, given that DST applies at different moments in time.
I was faced with the same problem recently and I solved it by writing the following macro:
[strftime_utc(2)]
args = field, format
definition = "strftime($field$ - (strptime(strftime($field$, \"%Y-%m-%dT%H:%M:%SZ\"), \"%Y-%m-%dT%H:%M:%S%Z\")-strptime(strftime($field$, \"%Y-%m-%dT%H:%M:%S\"), \"%Y-%m-%dT%H:%M:%S\")), \"$format$\")"
iseval = 1
So you can now write a search that looks like this:
index=main | eval utc_time=`strftime_utc(_time, "%Y-%m-%dT%H:%M:%SZ")`
Regardless of what the timezone is on each event, this will cause the output to be in UTC.
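A quick sanity check, run against any single event (a sketch; local_time and utc_time are just illustrative field names):
index=main | head 1 | eval local_time=strftime(_time, "%Y-%m-%dT%H:%M:%S%z") | eval utc_time=`strftime_utc(_time, "%Y-%m-%dT%H:%M:%SZ")` | table _time local_time utc_time
local_time should show the search head's offset in its %z suffix, while utc_time should show the same instant shifted to UTC.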
Thanks to @richgalloway for the initial suggestion that led to this.
I kept having the same issue, so the following is what I put together:
<your search> | eval time_splunk=strftime(_time, "%A, %B %e, %Y %l:%M:%S.%3Q %p %Z (%:z)") | eval time_offset=strftime(_time, "%:z") | rex field=time_offset ".(?<time_offset_seconds>\d{2}:\d{2})" | eval time_offset_seconds=time_offset_seconds.":00" | convert dur2sec(time_offset_seconds) | eval time_utc_epoch=strftime(_time, "%s") | convert num(time_utc_epoch) | eval time_utc_epoch=if(time_offset_seconds==0, time_utc_epoch, if(substr(time_offset, 1, 1)=="+", time_utc_epoch-time_offset_seconds, time_utc_epoch+time_offset_seconds)) | eval time_utc=strftime(time_utc_epoch, "%A, %B %e, %Y %l:%M:%S.%3Q %p UTC (-00:00)")
This will display as (for example):
time_utc="Friday, December 18, 2015 7:15:43.000 AM UTC (-00:00)"
time_splunk="Friday, December 18, 2015 12:15:43.000 AM MST (-07:00)"
I hope it helps some of you as well!
Thank you for your excellent answer. I removed | eval time_utc_epoch=strftime(_time, "%s") (and the | convert num(time_utc_epoch) that follows it) and instead used _time directly in the last | eval time_utc_epoch= expression (as _time is already in epoch time, there is no need to convert it again).
I also removed the outer if from the last | eval time_utc_epoch= expression, as adding or subtracting zero has no effect.
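Putting those two changes together, the UTC part of the pipeline might look something like this (a sketch based on the description above, not tested):
<your search> | eval time_offset=strftime(_time, "%:z") | rex field=time_offset ".(?<time_offset_seconds>\d{2}:\d{2})" | eval time_offset_seconds=time_offset_seconds.":00" | convert dur2sec(time_offset_seconds) | eval time_utc_epoch=if(substr(time_offset, 1, 1)=="+", _time-time_offset_seconds, _time+time_offset_seconds) | eval time_utc=strftime(time_utc_epoch, "%A, %B %e, %Y %l:%M:%S.%3Q %p UTC (-00:00)")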