Unable to convert epoch format to human-readable format

kapoorsumit2020
Loves-to-Learn Everything

When I convert the following timestamp to a human-readable format, I get "12/31/9999 23:59:59" instead of "01/04/22 06:03:47":

"timestamp": 1641294227243

I'm using the strftime(timestamp,"%m/%d/%Y %H:%M:%S") function for the conversion.

Could you please help me find the right conversion method?

Thanks in advance! 


johnhuang
Motivator

Splunk assumes the UNIX time value is in seconds, but your value is in milliseconds. Interpreted as seconds, 1641294227243 falls tens of thousands of years in the future, which is why strftime caps out at 12/31/9999 23:59:59. When this happens, simply divide by 1000:

 

| makeresults
| eval timestamp="1641294227243"
| eval event_time=strftime((timestamp/1000),"%m/%d/%Y %H:%M:%S")
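
For reference, strftime renders the epoch in the timezone configured for your Splunk user, so the expected 01/04/22 06:03:47 corresponds to US Eastern, while a UTC user would see 01/04/2022 11:03:47 for the same value. If the scale of the incoming epoch can vary, a small sketch like this (the threshold and the ts_sec field name are assumptions for illustration, not anything built in) normalizes milliseconds to seconds before formatting:

| makeresults
| eval timestamp=1641294227243
``` assumption: anything above ~1e11 is milliseconds, so scale it to seconds ```
| eval ts_sec=if(timestamp > 100000000000, timestamp/1000, timestamp)
| eval event_time=strftime(ts_sec, "%m/%d/%Y %H:%M:%S")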

 


kapoorsumit2020
Loves-to-Learn Everything

Thank you @johnhuang 

I tried as suggested, but somehow the value of event_time comes out blank in our use case:

Here is the sample SPL I'm using:

index=cloud cloudaccount="account_id" sourcetype=lambda:project_name-sftp-logs-to-splunk "\"logGroup\": \"/aws/transfer/s-serverid\"" "CLOSE Path=\/projectname-bucketname-prd" "a.b.c.d.e.T0600420"
| rex field=_raw max_match=0 "(?<Sent_At>\d+)[\r\n ,]+\"message\": \"[\w+.]+ CLOSE Path=\/projectname-bucketname-prd[-secure]*\/landing\/[prd\/]*(?<user_id>\w+)\/(?<FILE_NAME>[\w+.#]+) BytesIn=(?<Bytes_In>\d+)"
| eval event_time=strftime((Sent_At/1000),"%m/%d/%Y %H:%M:%S")
| table FILE_NAME user_id Bytes_In Sent_At event_time

Result:

FILE_NAME           user_id    Bytes_In  Sent_At        event_time
------------------  ---------  --------  -------------  ----------
a.b.c.d.e.T0600420  user_name  1 0       1641294227380

Sample log events:

2022-01-04T11:04:02.452Z 9e7cf6d7-006a-4aea-8053-2aae3068f5b0 INFO Decoded payload: {
    "messageType": "DATA_MESSAGE",
    "owner": "account_id",
    "logGroup": "/aws/transfer/s-serverid",
    "logStream": "eft_nonprod.7eba87e510990497",
    "subscriptionFilters": [
        "onepi-sftp-logs-to-splunk-filter"
    ],
    "logEvents": [
        {
            "id": "36602084357565361730142322725669422549468033150221549568",
            "timestamp": 1641294227243,
            "message": "message-1"
        },
        {
            "id": "36602084360419857115554242487785994488367023422987042817",
            "timestamp": 1641294227371,
            "message": "message-2"
        },
        ....
    ]
}
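
Not what was tried here, but since the payload is JSON, a sketch along these lines (the payload field name and the rex isolating the JSON object are illustrative assumptions) would let spath pull every logEvents timestamp without a hand-written regex:

| rex field=_raw "(?s)Decoded payload: (?<payload>\{.+\})"
| spath input=payload path=logEvents{} output=events
| mvexpand events
| eval Sent_At=spath(events, "timestamp")
| eval event_time=strftime(Sent_At/1000, "%m/%d/%Y %H:%M:%S")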


johnhuang
Motivator

Your data contains multiple timestamps: with max_match=0, rex extracts Sent_At as a multivalue field, and eval arithmetic on a multivalue field returns null, which is why event_time comes out blank. Let's clean up the field a bit by keeping just the first value:

 

| eval Sent_At=trim(mvindex(Sent_At,0))
| eval event_time=strftime((Sent_At/1000),"%m/%d/%Y %H:%M:%S")
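
Here is a self-contained way to see what the fix does (the split() line just fabricates a multivalue Sent_At from sample data):

| makeresults
| eval Sent_At=split("1641294227243 1641294227380", " ")
``` Sent_At is now multivalue; mvindex(Sent_At, 0) keeps only the first match ```
| eval Sent_At=trim(mvindex(Sent_At,0))
| eval event_time=strftime(Sent_At/1000, "%m/%d/%Y %H:%M:%S")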

 
