Getting Data In

Convert multiple fields as timestamp in csv file

chintan_shah
Path Finder

I have a CSV file with multiple fields containing timestamps in the format "Oct 2 2017 1:22:21:000PM". Can someone suggest how to convert them into timestamps so that I can perform logical operations on them?
Can I configure props/transforms once and reuse the same format for all fields?

1 Solution

yannK
Splunk Employee

If you want to define the event timestamp based on those fields at index time, configure the "very first" forwarder monitoring those CSV files with a sourcetype definition.

See http://docs.splunk.com/Documentation/Splunk/latest/Data/Extractfieldsfromfileswithstructureddata
Something like:

# in props.conf
[customcsv]
INDEXED_EXTRACTIONS = csv
TIMESTAMP_FIELDS = myfieldcolumnname
# or, if the timestamp spans several columns:
# TIMESTAMP_FIELDS = fieldhour,fielddate,fieldyear

# in inputs.conf
[monitor://path/to/my/csv/file.csv]
sourcetype = customcsv
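
If the timestamp is not recognized automatically, you can also pin the parsing format with TIME_FORMAT in the same stanza. A minimal sketch, assuming the "Oct 2 2017 1:22:21:000PM" format from the question (the stanza and column names are placeholders):

# in props.conf
[customcsv]
INDEXED_EXTRACTIONS = csv
TIMESTAMP_FIELDS = myfieldcolumnname
TIME_FORMAT = %b %d %Y %I:%M:%S:%3N%p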

If your events are already indexed and you just want to generate a timestamp on the fly at search time from other fields, you can:
1 - extract the field (a rex command can do the trick, or the field may already be extracted as a string) -- see the sketch after the links below.
2 - use an eval or convert function to parse it as a timestamp (and convert it to epoch time or any format you want).

see http://docs.splunk.com/Documentation/Splunk/7.0.0/SearchReference/Convert
and details on timeformat options http://docs.splunk.com/Documentation/Splunk/7.0.0/Data/Configuretimestamprecognition#Enhanced_strpti...
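
For step 1, a minimal sketch of pulling the raw string into a field with rex and parsing it with eval (the field name EFF_DT and the regex are assumptions about the data, not taken from the question):

    | rex field=_raw "(?<EFF_DT>\w{3} \d{1,2} \d{4} \d{1,2}:\d{2}:\d{2}:\d{3}[AP]M)"
    | eval epochtime=strptime(EFF_DT, "%b %d %Y %I:%M:%S:%3N%p")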

Example with fake event:

| stats count | eval mytimefield="1:22:21:000PM" | eval mydatefield="Oct 2 2017"
| eval mycombinedtime=mydatefield." ".mytimefield
| convert timeformat="%b %d %Y %I:%M:%S:%3N%p" mktime(mycombinedtime) AS myepochtime
| convert timeformat="%Y/%m/%d %H:%M:%S.%3N %Z" ctime(myepochtime) AS anotherformatedtime
| table mytimefield mydatefield mycombinedtime myepochtime anotherformatedtime

It will return something like:
mytimefield          1:22:21:000PM
mydatefield          Oct 2 2017
mycombinedtime       Oct 2 2017 1:22:21:000PM
myepochtime          1506975741.000
anotherformatedtime  2017/10/02 13:22:21.000 PDT
(we converted to a more explicit format, with a 24-hour clock and a timezone)


chintan_shah
Path Finder

@yannK,

Currently I am not able to convert the time "Oct 2 2017 1:22:21:000PM" to epoch time. I am trying the query below, but it is not giving the desired results:
convert timeformat="%b %d %Y %h:%M:%S:%3N%p" mktime(EFF_DT) as epochtime | table epochtime


yannK
Splunk Employee

Your hour is on a 12-hour clock, so use %I instead of %H, which is for the 24-hour clock.

    | stats count | eval EFF_DT="Oct 2 2017 1:22:21:000PM"
    | convert timeformat="%b %d %Y %I:%M:%S:%3N%P" mktime(EFF_DT) AS epochtime
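
With %I, mktime should return an epoch value such as 1506975741.000 for this event (the exact value depends on your search head's timezone, as in the earlier example).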

chintan_shah
Path Finder

@yannK,

Could you please tell me if it is possible to convert this epoch time to a readable format in the same query, so that I can use the string in calculated fields?
I have the query below, which works fine, but I can't use it for creating calculated fields:
| convert mktime(EFF_DT) as epoch_EFF_DT timeformat="%b %d %Y %I:%M:%S:%3N%P" | eval EFF_DATE=strftime(epoch_EFF_DT,"%m/%d/%y %H:%M:%S")


yannK
Splunk Employee

Calculated fields require "eval" expressions.
Fortunately, there are eval equivalents to convert's ctime and mktime: strftime(X,Y) and strptime(X,Y).

see http://docs.splunk.com/Documentation/Splunk/7.0.0/SearchReference/DateandTimeFunctions
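
For example, a minimal sketch of the calculated-field definition in props.conf, assuming the sourcetype is customcsv and EFF_DT is already extracted as a string (both names are placeholders):

# in props.conf on the search head
[customcsv]
EVAL-EFF_DATE = strftime(strptime(EFF_DT, "%b %d %Y %I:%M:%S:%3N%p"), "%m/%d/%y %H:%M:%S")

The same eval expression can also be entered in Splunk Web under Settings > Fields > Calculated fields.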
