Splunk Search

How to use the earliest and latest value found (passed from subsearch to outer search) to calculate duration?

namrithadeepak
Path Finder

I have a batch job that may run multiple times per day.
The log format is as follows,

[screenshot of the batch job log format]

I need a table with the columns below for the latest run of the job:

  1. Duration per event in the latest run
  2. No. of events in the latest run - already have query

I have the query to calculate the No. of events in the latest run.

app="myapp" source="*mylogs*" 
    [ search app="myapp" source="*mylogs*" ("*Started job*" OR "*Job Completed*") 
    | eval startTime=strftime(case(like(_raw,"%Started job%"),_time),"%m/%d/%Y:%H:%M:%S") 
    | eval endTime=strftime(case(like(_raw,"%Job Completed%"),_time),"%m/%d/%Y:%H:%M:%S") 
    | eval date=strftime(case(like(_raw,"%Job Completed%") OR like(_raw,"%Started job%"),_time),"%Y-%m-%d") 
    | stats earliest(startTime) as earliest, latest(endTime) as latest by date 
    | sort -date, -latest 
    | head 1 
    | return earliest, latest] 
| chart count

Steps:
INSIDE SUBSEARCH:
1. Treat events containing "Started job" as the start of a run => extract startTime, date
2. Treat events containing "Job Completed" as the end of a run => extract endTime, date
3. Build a table of date, startTime, endTime using the stats command
4. Sort in descending order by date, then latest
5. Take the first entry (using head)
6. Return earliest and latest value so that it gets applied to the outer search

IN THE OUTER SEARCH:
1. After the earliest and latest values are automatically applied as time bounds, use chart count or stats count

What I need:
How do I use the earliest and latest (passed from subsearch to outer search) to calculate duration?
(duration=latest-earliest)
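(One possible approach, not from the original post: once the subsearch's returned earliest/latest have been applied as time bounds, the addinfo command can expose those bounds in the outer search as info_min_time and info_max_time, in epoch seconds. A minimal sketch, reusing the same base search:

```
app="myapp" source="*mylogs*"
    [ search app="myapp" source="*mylogs*" ("*Started job*" OR "*Job Completed*")
    | ... same subsearch as above ...
    | return earliest, latest]
| addinfo
| eval duration = info_max_time - info_min_time
```

Here duration is the width of the applied time window in seconds.)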

Thanks in advance!!

1 Solution

namrithadeepak
Path Finder

I found the answer myself. You can do it using eventstats.

Please close this question.
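The final query isn't shown in the post; a minimal sketch of what an eventstats-based version might look like, assuming the same base search (jobStart, jobEnd, and duration are illustrative field names):

```
app="myapp" source="*mylogs*"
    [ search app="myapp" source="*mylogs*" ("*Started job*" OR "*Job Completed*")
    | ... same subsearch as above ...
    | return earliest, latest]
| eventstats min(_time) as jobStart, max(_time) as jobEnd
| eval duration = jobEnd - jobStart
| table _time, duration
```

Because eventstats attaches its aggregates to every event rather than collapsing them, each event in the latest run carries the run's start/end times, so duration (in seconds) can be computed per event.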
