i want to log what a script is doing into an index.
i'm creating the current time with a datetime object in python:
import datetime

now = datetime.datetime.now()
now.strftime("%Y-%m-%d %H:%M:%S %Z")  # %Z comes out empty here because now is a naive datetime
now, i do know strftime has limited support for timezones... but without a timezone the log message might not get parsed correctly AFAIK, and leaving it off is probably not a Good Thing.
how can i access what Splunk thinks the host server's timezone is (independent of individual user settings, etc)? whether it's a REST endpoint in Splunk, an SDK variable / function, or some native Python function, it doesn't matter to me.
If you have done the right thing and configured a TZ= value in your props.conf for every source (except for some scripted inputs from TA apps), then you can check the date_zone field to see what Splunk thinks the TZ is for each event. If you see local, that means the TZ of the indexer (or heavy forwarder) was used (which is usually bad/wrong). You are mistaken in thinking that Splunk "has a TZ", because it doesn't; all Splunk times are stored as GMT-based epoch values (TZ=0), so you should never need to do any TZ manipulation if you are dealing with the _time field.
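
For reference, an explicit TZ= setting for a source or sourcetype looks something like this (the stanza name and zone below are just placeholders, not from your setup):

[my_scripted_input]
TZ = America/New_York

With that in place, date_zone on those events should show the offset Splunk applied rather than local.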
thanks, but i'm mostly just looking for a pythonic way to access the local system's timezone. i'm writing logs to ~/var/log/splunk/foo.log, which lands in the server's _internal index. it might be redundant to log a TZ, but i'm still curious what the answer is.
i should also clarify that i'd like to do it natively as much as possible, without requiring a 3rd party lib like pytz to be bundled with my add-on.
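
something like this stdlib-only sketch seems to cover it (assuming Python 3.6+; the exact zone name returned is platform dependent): datetime.now().astimezone() attaches the server's local UTC offset, so %z and tzname() work without pytz.

import datetime
import time

# aware "now" carrying the server's local UTC offset (stdlib only, Python 3.6+)
now = datetime.datetime.now().astimezone()

# %z is a numeric offset like -0500, which parses unambiguously;
# %Z / tzname() give a platform-dependent zone name such as 'EST'
now.strftime("%Y-%m-%d %H:%M:%S %z")
now.tzname()

# time module equivalents, if only the names/offsets are needed
time.tzname      # e.g. ('EST', 'EDT')
time.timezone    # non-DST offset west of UTC, in seconds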