Getting Data In

Formatting timestamp field extracted from CSV file for post processing

gaurav10
Engager

Hi,

I'm currently ingesting CSV files into Splunk. One of the fields records the actual event timestamp in the format YYYYmmddHHMMSS (e.g. 20240418142025). I need to convert this field's value into something Splunk understands as a timestamp (date, hour, minutes, seconds, etc.). Once this conversion is done, I need to group these timestamps/events by second (e.g. bucket span=1s Event_Time). Note that Event_Time here is the formatted value derived from the original event timestamp field.

So far, I've tried this:

index=test1 sourcetype=test2 | eval Event_Time=strftime(strptime(SUBMIT_TIME,"%Y%m%d%H%M%S"), "%m/%d/%y %H:%M:%S") | table Event_Time

Above command gives me decent output such as 04/18/24 14:20:25.

But when I try to group values of Event_Time using "bucket span=1s Event_Time", it does not do anything. Note that "bucket span=1s _time" works, since _time is Splunk's default time field.

I'd appreciate any help to make this formatting work for post-processing Event_Time.

Thank you in advance.


PickleRick
SplunkTrust

Adding to @ITWhisperer 's answer - ideally you should have your main time field for a given event parsed at ingestion into the _time field, so that Splunk can search your data efficiently and "organize it" timewise.
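
For example, a minimal props.conf sketch for an indexed CSV extraction might look something like this (the sourcetype "test2" and field name "SUBMIT_TIME" are assumptions taken from this thread - adjust to your actual config):

# props.conf - sketch only, not tested against your environment
[test2]
INDEXED_EXTRACTIONS = csv
# tell Splunk which CSV column carries the event timestamp
TIMESTAMP_FIELDS = SUBMIT_TIME
# and how to parse it
TIME_FORMAT = %Y%m%d%H%M%S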

ITWhisperer
SplunkTrust

In order to bin Event_Time, you need to keep it as a number (after parsing it with strptime). You can format it as a string later, or use fieldformat for display purposes.

 

index=test1 sourcetype=test2 | eval Event_Time=strptime(SUBMIT_TIME,"%Y%m%d%H%M%S") | table Event_Time
``` This next line is redundant since you only have Event_Time to the nearest second anyway ```
| bin Event_Time span=1s
| sort 0 Event_Time
| fieldformat Event_Time=strftime(Event_Time, "%m/%d/%y %H:%M:%S")

 

gaurav10
Engager

Thank you @ITWhisperer . I used your recommended query as below but am unable to get any output:

index=test1 sourcetype=test2 EVENT A | bin Event_Time span=1s | sort 0 Event_Time | fieldformat Event_Time=strftime(Event_Time, "%m/%d/%y %H:%M:%S")


Please see below my old Splunk query, which uses Splunk's default "_time" field.

index=test1 sourcetype=test2 EVENT A | bucket span=1s _time | stats count AS EventPerSec by _time | timechart span=1d max(EventPerSec)

Ultimately, in this query, I want to replace "_time" with "Event_Time", which is more accurate than "_time".

Note that there can be multiple events in my data occurring at the exact same time (down to the second). So basically, my query finds the peak "EventPerSec" value per day.

Hope this explanation helps.


P_vandereerden
Splunk Employee

@gaurav10 Note that in @ITWhisperer 's solution, Event_Time is handled in 2 steps, with the binning in between:

First, convert SUBMIT_TIME to a time field using strptime.

Now you can bin based on a time span. Do your binning in this in-between phase.

Second, convert the new Event_Time to a string using strftime.
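
Putting the two steps together with your peak-per-day search, a sketch might look like this (assuming SUBMIT_TIME is still available at search time, as in your first search - not tested against your data):

index=test1 sourcetype=test2 EVENT A
``` step 1: parse the raw timestamp into epoch seconds ```
| eval Event_Time=strptime(SUBMIT_TIME,"%Y%m%d%H%M%S")
| bin Event_Time span=1s
| stats count AS EventPerSec by Event_Time
``` step 2: copy the parsed time into _time so timechart can use it ```
| eval _time=Event_Time
| timechart span=1d max(EventPerSec)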

Paul van der Eerden,
Breaking software for over 20 years.

ITWhisperer
SplunkTrust

That doesn't appear to be what I recommended - perhaps that's why you are not getting any results?

It would help if you could share some sample anonymised events so we can see what you are dealing with and try to figure out a search that will work for you - discussing searches without knowing the data they apply to is often fruitless.
