Time Conversion into readable format

ntalwar
New Member

Can someone help me convert 1513554224 into a readable time format? I tried a couple of formats, but none of them worked. I am using this search:

| stats count max(_time),min(_time) by user,dest | eval time = strftime (_time, "%m/%d/%y %H:%M:%S")

1 Solution

niketn
Legend

@ntalwar, once you use max(_time) and min(_time) within a transforming command without aliasing them to other field names, you have to refer to those exact field names in your subsequent search pipes. In your case, the field _time is not available after the stats command.

You can try the following:

<YourBaseSearch>
| stats count max(_time), min(_time) by user, dest 
| fieldformat "max(_time)" = strftime('max(_time)', "%Y/%m/%d %H:%M:%S %p %z")
| fieldformat "min(_time)" = strftime('min(_time)', "%Y/%m/%d %H:%M:%S %p %z")

However, it is better to rename the fields using the as keyword to create aliases for the statistical functions. The following is what you can try:

<YourBaseSearch>
| stats count latest(_time) as Last_Time, earliest(_time) as Earliest_Time by user, dest 
| fieldformat Last_Time = strftime(Last_Time, "%Y/%m/%d %H:%M:%S %p %z")
| fieldformat Earliest_Time = strftime(Earliest_Time, "%Y/%m/%d %H:%M:%S %p %z")

PS: While converting epoch time to string time, I have used the YYYY/MM/DD HH:MM:SS AM/PM Timezone format so that the string values keep the same lexical sort order as the underlying times, but you can use a different format if you need to.
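
To see the sorting point in action, here is a minimal, hypothetical sketch (the two epoch values are made up for illustration: one in December 2016, one in January 2017). The year-first string sorts in the same order as the underlying epoch, while the month-first, two-digit-year string sorts the January 2017 row ahead of the December 2016 row:

| makeresults 
| eval epoch=mvappend("1481554224", "1484554224") 
| mvexpand epoch 
| eval epoch=tonumber(epoch) 
| eval year_first=strftime(epoch, "%Y/%m/%d %H:%M:%S"), month_first=strftime(epoch, "%m/%d/%y %H:%M:%S") 
| sort 0 month_first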

eval can also be used instead of fieldformat; however, the difference, as described in the Splunk documentation, is that fieldformat only applies the formatting for display and retains the original field value (i.e. the epoch time), whereas eval overwrites the underlying value with the string time.
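
As a quick, self-contained way to see that difference, here is a minimal sketch using makeresults and the epoch value from the question (the field names are made up for the example):

| makeresults 
| eval epoch=1513554224 
| eval converted_with_eval=strftime(epoch, "%Y/%m/%d %H:%M:%S %p %z") 
| fieldformat epoch=strftime(epoch, "%Y/%m/%d %H:%M:%S %p %z")

Here converted_with_eval is now a string, while epoch still holds the number 1513554224 underneath and is only rendered as a formatted time in the results table, so any later arithmetic on epoch still works against the numeric value.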

____________________________________________
| makeresults | eval message= "Happy Splunking!!!"

ntalwar
New Member

Thanks a lot. It worked.


test_qweqwe
Builder

Try this:

| fieldformat time=strftime(_time, "%c")


ntalwar
New Member

Nothing changed, same as before. Thanks anyway.
