Splunk Search

How to extract and find the difference in time between two fields using regex?

rrkollip
New Member

Hi ,

I have 2 events like the ones below, and I need to find the time difference between them. There may be a lot of other events between them.
I'm trying to write a regular expression to extract the time in the [] brackets and create ENTRYTIME and EXITTIME fields.

Event 1: Worker#108 [SWAPScheduler-INDIA] - [2018-01-31 04:30:04,340] - DEBUG - AppLogger: SWAPScheduler.executeInternal entry

Event 2: Worker#108 [SWAPScheduler-INDIA] - [2018-01-31 04:30:04,340] - DEBUG - AppLogger: SWAPScheduler.executeInternal exit 

Please help me create a regex to extract this in Splunk. I tried the field extractor, but it allows only one string to be added as required text, so I can't get two fields for the entry time and exit time.

1 Solution

nabeel652
Builder

Try this. You can run it as-is by copying and pasting it into your search, then tweak it a little to fit your needs:

| stats count | eval data = "Event 1: Worker#108 [SWAPScheduler-INDIA] - [2018-01-31 04:30:04,340] - DEBUG - AppLogger: SWAPScheduler.executeInternal entry;Event 1: Worker#115 [SWAPScheduler-INDIA] - [2018-01-31 04:30:09,340] - DEBUG - AppLogger: SWAPScheduler.executeInternal entry;
  Event 2: Worker#115 [SWAPScheduler-INDIA] - [2018-01-31 04:30:29,340] - DEBUG - AppLogger: SWAPScheduler.executeInternal exit;
 Event 2: Worker#108 [SWAPScheduler-INDIA] - [2018-01-31 04:30:34,340] - DEBUG - AppLogger: SWAPScheduler.executeInternal exit" | makemv delim=";" data 
 | mvexpand data | rename data as _raw  | rex field=_raw ".*Worker\#(?<worker_id>[\d+]*).*\-\s(\[(?<entry_time>[^\]]*).*entry|\[(?<exit_time>[^\]]*).*exit)"
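For checking the pattern outside Splunk, the same extraction logic can be sketched with Python's `re` module. This is only an illustration of the alternation idea, not part of the answer above; note Python spells named groups `(?P<name>...)` where Splunk's rex uses `(?<name>...)`:

```python
import re

# Same alternation idea as the rex above: exactly one of the two named
# groups fires per event, so entry lines populate entry_time and exit
# lines populate exit_time (the other group stays None).
PATTERN = re.compile(
    r".*Worker#(?P<worker_id>\d+).*-\s"
    r"(\[(?P<entry_time>[^\]]*).*entry|\[(?P<exit_time>[^\]]*).*exit)"
)

events = [
    "Worker#108 [SWAPScheduler-INDIA] - [2018-01-31 04:30:04,340] - DEBUG"
    " - AppLogger: SWAPScheduler.executeInternal entry",
    "Worker#108 [SWAPScheduler-INDIA] - [2018-01-31 04:30:34,340] - DEBUG"
    " - AppLogger: SWAPScheduler.executeInternal exit",
]

fields = [PATTERN.match(e).groupdict() for e in events]
print(fields[0])  # worker_id '108', entry_time '2018-01-31 04:30:04,340'
```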


mayurr98
Super Champion

Hey, try this run-anywhere search:

| makeresults 
|  eval _raw="Worker#108 [SWAPScheduler-INDIA] - [2018-01-31 04:30:04,340] - DEBUG - AppLogger: SWAPScheduler.executeInternal entry"
 | append
    [ | makeresults 
    |  eval _raw="Worker#108 [SWAPScheduler-INDIA] - [2018-01-31 04:30:04,340] - DEBUG - AppLogger: SWAPScheduler.executeInternal exit"] 
| rex field=_raw "\s-\s\[((?<entry_time>[^\]]+).*entry|(?<exit_time>[^\]]+).*exit)"

In your environment, you would write:

<your_base_search> | rex field=_raw "\s-\s\[((?<entry_time>[^\]]+).*entry|(?<exit_time>[^\]]+).*exit)"

Let me know if this helps!


493669
Super Champion

Are you expecting something like this?

... | rex field=_raw "(.*\]\s\-\s\[(?<entry_time>[^\,]*).*executeInternal\sentry)|(.*\]\s\-\s\[(?<exit_time>[^\,]*).*executeInternal\sexit)"


nabeel652
Builder

Please check the updated answer


rrkollip
New Member

This worked to get the entry and exit times. But when I run a query like the one below to calculate the difference, none of the strptime and strftime functions work with the entry_time and exit_time fields.

My base search | rex field=_raw ".*Worker\#(?<worker_id>[\d+]*).*\-\s(\[(?<entry_time>[^\]]*).*entry|\[(?<exit_time>[^\]]*).*exit)" | eval entry=strptime(entry_time,"%d/%m/%Y-%H:%M:%S") | eval exit=strptime(exit_time,"%d/%m/%Y-%H:%M:%S") | eval diff = exit-entry

Do the strptime and strftime functions work with fields extracted by regex?


nabeel652
Builder

Well, for strptime you'll have to provide a time-format string that matches the date/time in entry_time and exit_time. So you need to do:

| eval exit = strptime(exit_time,"%Y-%m-%d %H:%M:%S")

However, by doing | eval diff = exit - entry you won't get any useful results, because the entry_time corresponding to every exit_time (and vice versa) is NULL: each event carries only one of the two fields.

In order to get time difference between entry and exit you'll have to use the transaction command that I worked out earlier.
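As a cross-check outside Splunk, the parse-and-subtract step looks like this in Python. The `,%f` milliseconds suffix in the format string is this example's assumption (the sample timestamps carry comma-separated milliseconds); it is not part of the answer above:

```python
from datetime import datetime

# Format matching "2018-01-31 04:30:04,340": %f parses the fractional
# part after the comma (Python accepts 1-6 fractional digits there).
FMT = "%Y-%m-%d %H:%M:%S,%f"

entry = datetime.strptime("2018-01-31 04:30:04,340", FMT)
exit_ = datetime.strptime("2018-01-31 04:30:34,340", FMT)

# Difference in seconds, analogous to eval diff = exit - entry on epochs
diff = (exit_ - entry).total_seconds()
print(diff)  # 30.0
```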


nabeel652
Builder

If you still want the duration through the same query, use this:

    | stats count | eval data = "Event 1: Worker#108 [SWAPScheduler-INDIA] - [2018-01-31 04:30:04,340] - DEBUG - AppLogger: SWAPScheduler.executeInternal entry;Event 1: Worker#115 [SWAPScheduler-INDIA] - [2018-01-31 04:30:09,340] - DEBUG - AppLogger: SWAPScheduler.executeInternal entry;
   Event 2: Worker#115 [SWAPScheduler-INDIA] - [2018-01-31 04:30:29,340] - DEBUG - AppLogger: SWAPScheduler.executeInternal exit;
  Event 2: Worker#108 [SWAPScheduler-INDIA] - [2018-01-31 04:30:34,340] - DEBUG - AppLogger: SWAPScheduler.executeInternal exit" | makemv delim=";" data 
  | mvexpand data | rename data as _raw | rex field=_raw ".*Worker\#(?<worker_id>[\d+]*).*\-\s(\[(?<entry_time>[^\]]*).*entry|\[(?<exit_time>[^\]]*).*exit)"  | stats first(entry_time) as entry last(exit_time) as exit by worker_id | eval entry = strptime(entry,"%Y-%m-%d %H:%M:%S") | eval exit = strptime(exit,"%Y-%m-%d %H:%M:%S") | eval diff = round(exit -entry,0)
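The first(entry_time)/last(exit_time)-by-worker pairing above can be sketched in plain Python to verify the arithmetic on the same four sample events. This is an illustration only; the `,%f` milliseconds format is an assumption of this sketch:

```python
import re
from datetime import datetime

PATTERN = re.compile(
    r".*Worker#(?P<worker_id>\d+).*-\s"
    r"(\[(?P<entry_time>[^\]]*).*entry|\[(?P<exit_time>[^\]]*).*exit)"
)
FMT = "%Y-%m-%d %H:%M:%S,%f"  # assumed: comma-separated milliseconds

EVENTS = [
    "Worker#108 [SWAPScheduler-INDIA] - [2018-01-31 04:30:04,340] - DEBUG - AppLogger: SWAPScheduler.executeInternal entry",
    "Worker#115 [SWAPScheduler-INDIA] - [2018-01-31 04:30:09,340] - DEBUG - AppLogger: SWAPScheduler.executeInternal entry",
    "Worker#115 [SWAPScheduler-INDIA] - [2018-01-31 04:30:29,340] - DEBUG - AppLogger: SWAPScheduler.executeInternal exit",
    "Worker#108 [SWAPScheduler-INDIA] - [2018-01-31 04:30:34,340] - DEBUG - AppLogger: SWAPScheduler.executeInternal exit",
]

def durations(events):
    # first(entry_time) / last(exit_time) per worker_id, as in the stats step
    entry, exit_ = {}, {}
    for raw in events:
        m = PATTERN.match(raw)
        wid = m.group("worker_id")
        if m.group("entry_time") and wid not in entry:
            entry[wid] = datetime.strptime(m.group("entry_time"), FMT)
        if m.group("exit_time"):
            exit_[wid] = datetime.strptime(m.group("exit_time"), FMT)
    # eval diff = exit - entry, in seconds, for workers that have both
    return {w: (exit_[w] - entry[w]).total_seconds() for w in entry if w in exit_}

print(durations(EVENTS))  # {'108': 30.0, '115': 20.0}
```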