Splunk Search

Search for a specific pattern and report the earliest and latest occurrence of it each day

sashpdhar
Explorer

Team -

Looking for ideas on how to achieve the scenario below.

Query 1 - get the list of unique patterns for each day

Query 2 - for the above list of patterns, get the earliest occurrence for each day

Query 3 - for the above list of patterns, get the latest occurrence for each day

Report - Day, Pattern, earliest time, latest time

Note - Queries 1, 2, and 3 are all different searches, as they each come from different log lines.
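As a rough sketch of the general shape such a report can take (the index names, search keywords, and the "Pattern:" extraction below are placeholder assumptions, not from the actual environment), the three steps can often be collapsed into one base search that is binned by day and aggregated in a single stats pass:

( index=idx1 "PatternSource" ) OR ( index=idx2 "TimeSource" )
| rex "Pattern: (?<pattern>\S+)"
| bin _time span=1d
| stats earliest(_time) as earliest_time latest(_time) as latest_time by _time, pattern
| convert ctime(earliest_time) ctime(latest_time)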

PickleRick
SplunkTrust

What do you mean by "patterns"?

sashpdhar
Explorer

Trying to achieve something like this, @PickleRick.

I want to report a JSON pattern for each day and grab the event times from different logs for that pattern. I tried something like the query below, but it is not working as expected; suggestions welcome.

Current Query

index="blah" source="*blah*" "Incomming"
| rex "Incomming: (?<s_json>.*])"
| timechart span=1d earliest(_time) as a_time by s_json
| join type=outer s_json
[ search index="blah" source="*blah*" "Internal"
| rex "Internal: (?<s_json>.*])"
| timechart span=1d latest(_time) as c_time by s_json
]
| convert ctime(a_time)
| convert ctime(c_time)
| table _time,s_json,a_time,c_time
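Worth noting about this attempt: after timechart ... by s_json, the individual s_json values become column names rather than a field, so the outer join on s_json has no s_json field left to match on. Keeping s_json as a row field with bin plus stats avoids that; a rough sketch of the first half only, reusing the poster's placeholder index/source values:

index="blah" source="*blah*" "Incomming"
| rex "Incomming: (?<s_json>.*])"
| bin _time span=1d
| stats earliest(_time) as a_time by _time, s_json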

 

Expected -

Date1, j1, a_time, c_time
Date1, j2, a_time, c_time
Date2, j3, a_time, c_time
Date3, j4, a_time, c_time
Date4, j1, a_time, c_time
Date4, j2, a_time, c_time

 

Each day can have its own unique patterns (j1, j2, j3, j4, ...), so I need to dynamically pick those for each day and report a_time and c_time for those patterns.

PickleRick
SplunkTrust

( index=index1 sourcetype=whatever1 Incoming ) OR ( index=index2 sourcetype=whatever2 Internal)
| rex "Incoming: (?<afield>.*])"
| rex "Internal: (?<bfield>.*])"
| eval s_field=coalesce(afield,bfield)
| timechart span=1d earliest_time(afield) as a_time latest_time(bfield) as c_time by s_field

Something like that.

Might need some more tricky conditional field assignment if the Incoming/Internal values aren't mutually exclusive.
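One form such a conditional assignment could take (a sketch only, reusing the placeholder index/sourcetype names from the search above) is to pick the extracted field based on which sourcetype the event came from:

( index=index1 sourcetype=whatever1 Incoming ) OR ( index=index2 sourcetype=whatever2 Internal)
| rex "Incoming: (?<afield>.*])"
| rex "Internal: (?<bfield>.*])"
| eval s_field=if(sourcetype=="whatever1", afield, bfield)
| timechart span=1d earliest_time(afield) as a_time latest_time(bfield) as c_time by s_field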


sashpdhar
Explorer

Thanks @PickleRick, but that did not produce output in the expected format.

Basically, I want to grab the unique patterns from Incomming along with their a_time, and then get the p_time from Internal for all of those patterns (it looks like an outer join of two sets, but it is not working).

Below is the query

 

 

index="Blah" source="Blah"  "Incomming"  
|  rex "Incomming: (?<s_json>.*])" 
|  bin _time span=1d
|  stats earliest(_time) as a_time by _time,s_json
|  table _time,s_json,a_time
|  join type=outer
|  [ search index="Blah" source="Blah"  "Internal"  
     |  rex "Internal: (?<s_json>.*])" 
     |  bin _time span=1d
     |  stats latest(_time) as p_time by _time,s_json
     |  table _time,s_json,p_time
   ] 
| convert ctime(a_time)
| convert ctime(p_time)

 

 

 

Getting the error below:

Error in 'from' command: Option 'type=outer' is invalid.

 

 

But both individual queries independently give the desired result set, like:

 

 

Date1, j1, [a_time|p_time]
Date1, j2, [a_time|p_time]
Date2, j3, [a_time|p_time]
Date2, j1, [a_time|p_time]
...

PickleRick
SplunkTrust

The syntactic error is quite obvious (you have erroneously included a pipe character between the join command and the subsearch).

But I don't quite get how my search differs from what you want to achieve (not _how_ you're trying to do it). Can you show a sample of your data?
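For reference, a corrected form of that join-based attempt (a sketch only, keeping the poster's placeholder index/source values; the stray pipe is removed and the join is done on both _time and s_json so the per-day rows line up) might look like:

index="Blah" source="Blah" "Incomming"
| rex "Incomming: (?<s_json>.*])"
| bin _time span=1d
| stats earliest(_time) as a_time by _time, s_json
| join type=outer _time, s_json
    [ search index="Blah" source="Blah" "Internal"
      | rex "Internal: (?<s_json>.*])"
      | bin _time span=1d
      | stats latest(_time) as p_time by _time, s_json ]
| convert ctime(a_time) ctime(p_time)
| table _time, s_json, a_time, p_time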


sashpdhar
Explorer

Thanks for pointing it out, @PickleRick. This is my first attempt with SPL commands.
