
Unable to convert epoch format to human readable format

kapoorsumit2020
Loves-to-Learn Everything

When I convert the following timestamp to a human-readable format, I get "12/31/9999 23:59:59" instead of "01/04/22 06:03:47":

"timestamp": 1641294227243

I'm using the strftime(timestamp, "%m/%d/%Y %H:%M:%S") function for the conversion.

Could you please help me find the right conversion method?

Thanks in advance! 


johnhuang
Motivator

Splunk assumes the UNIX time value is in seconds, and your value is in milliseconds: interpreted as seconds, 1641294227243 lands tens of thousands of years in the future, which is why you see the clamped 12/31/9999 23:59:59. When this happens, simply divide by 1000:

 

| makeresults
| eval timestamp="1641294227243"
| eval event_time=strftime((timestamp/1000),"%m/%d/%Y %H:%M:%S")
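
With this sample value, event_time comes out as 01/04/2022 11:03:47 when the searching user's Splunk time zone preference is UTC, or 01/04/2022 06:03:47 for US Eastern, which matches the expected value in the question (strftime renders in the user's configured time zone).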

 


kapoorsumit2020
Loves-to-Learn Everything

Thank you @johnhuang 

I tried as suggested, but somehow the value for event_time comes back blank in our use case.

Here is the sample SPL I'm using:

index=cloud cloudaccount="account_id" sourcetype=lambda:project_name-sftp-logs-to-splunk "\"logGroup\": \"/aws/transfer/s-serverid\"" "CLOSE Path=\/projectname-bucketname-prd" "a.b.c.d.e.T0600420"
| rex field=_raw max_match=0 "(?<Sent_At>\d+)[\r\n ,]+\"message\": \"[\w+.]+ CLOSE Path=\/projectname-bucketname-prd[-secure]*\/landing\/[prd\/]*(?<user_id>\w+)\/(?<FILE_NAME>[\w+.#]+) BytesIn=(?<Bytes_In>\d+)"
| eval event_time=strftime((Sent_At/1000),"%m/%d/%Y %H:%M:%S")
| table FILE_NAME user_id Bytes_In Sent_At event_time

Result:

FILE_NAME            user_id     Bytes_In   Sent_At          event_time
-------------------  ----------  ---------  ---------------  ----------
a.b.c.d.e.T0600420   user_name   1 0        1641294227380

Sample log events:

2022-01-04T11:04:02.452Z 9e7cf6d7-006a-4aea-8053-2aae3068f5b0 INFO Decoded payload: {
    "messageType": "DATA_MESSAGE",
    "owner": "account_id",
    "logGroup": "/aws/transfer/s-serverid",
    "logStream": "eft_nonprod.7eba87e510990497",
    "subscriptionFilters": [
        "onepi-sftp-logs-to-splunk-filter"
    ],
    "logEvents": [
        {
            "id": "36602084357565361730142322725669422549468033150221549568",
            "timestamp": 1641294227243,
            "message": "message-1"
        },
        {
            "id": "36602084360419857115554242487785994488367023422987042817",
            "timestamp": 1641294227371,
            "message": "message-2"
        },
        ....
    ]
}
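
Side note: since the decoded payload is JSON, spath can pull the timestamps out without a regex. A minimal, self-contained sketch using a stripped-down stand-in for the payload above (the eval'd _raw and the Sent_At field name are illustrative, not the thread's actual search):

| makeresults
``` A stripped-down stand-in for the decoded payload shown above ```
| eval _raw="{\"logEvents\": [{\"timestamp\": 1641294227243}, {\"timestamp\": 1641294227371}]}"
``` logEvents{}.timestamp collects every timestamp into a multivalue field ```
| spath path=logEvents{}.timestamp output=Sent_At
``` Convert the first timestamp; mvindex sidesteps arithmetic on a multivalue field ```
| eval event_time=strftime((tonumber(mvindex(Sent_At, 0))/1000), "%m/%d/%Y %H:%M:%S")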


johnhuang
Motivator

Your data contains multiple timestamps, so rex with max_match=0 leaves Sent_At as a multivalue field, and dividing a multivalue field returns null, which is why event_time comes back blank. Let's try to clean up the field a bit:

 

| eval Sent_At=trim(mvindex(Sent_At,0))
| eval event_time=strftime((Sent_At/1000),"%m/%d/%Y %H:%M:%S")
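
To make the whole failure mode reproducible without the original data, here is a minimal, self-contained sketch (the split() is just a stand-in for the multivalue field that rex max_match=0 produces; the timestamps are taken from the sample events above):

| makeresults
``` Simulate the multivalue Sent_At that rex max_match=0 produces ```
| eval Sent_At=split("1641294227243,1641294227371", ",")
``` Arithmetic on a multivalue field returns null, so this stays blank ```
| eval broken_time=strftime((Sent_At/1000), "%m/%d/%Y %H:%M:%S")
``` Take the first value only, then convert ```
| eval event_time=strftime((trim(mvindex(Sent_At, 0))/1000), "%m/%d/%Y %H:%M:%S")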

 
