Splunk Cloud Platform

How to find earliest time when there is a gap in events

knanaiah001
Explorer

Hi Team ,

How can I get the output below using a Splunk SPL query, given the input below?

INPUT :

_time     url       scannedissues
1-Feb     abc.com   issue1
1-Feb     abc.com   issue2
1-Feb     abc.com   issue3
1-Feb     abc.com   issue4
5-Feb     abc.com   issue1
5-Feb     abc.com   issue3
5-Feb     abc.com   issue4
7-Feb     abc.com   issue1
7-Feb     abc.com   issue3
10-Feb    abc.com   issue1
10-Feb    abc.com   issue2
10-Feb    abc.com   issue3
14-Feb    abc.com   issue1
14-Feb    abc.com   issue2
14-Feb    abc.com   issue5

 

Expected OUTPUT :

url       scannedissues   LatestTime   EarliestTime
abc.com   issue1          14-Feb       1-Feb
abc.com   issue2          14-Feb       10-Feb
abc.com   issue5          14-Feb       14-Feb

 

Can someone guide me on the SPL command logic to achieve the above output?
Thanks in advance!



ITWhisperer
SplunkTrust
| makeresults 
| eval _raw="_time	url	scannedissues
1-Feb	abc.com	issue1
1-Feb	abc.com	issue2
1-Feb	abc.com	issue3
1-Feb	abc.com	issue4
5-Feb	abc.com	issue1
5-Feb	abc.com	issue3
5-Feb	abc.com	issue4
7-Feb	abc.com	issue1
7-Feb	abc.com	issue3
10-Feb	abc.com	issue1
10-Feb	abc.com	issue2
10-Feb	abc.com	issue3
14-Feb	abc.com	issue1
14-Feb	abc.com	issue2
14-Feb	abc.com	issue5"
| multikv forceheader=1
| table time url scannedissues
``` the lines above set up sample data ```
``` count the scans using the date in the time field ```
| streamstats dc(time) as scan
``` find the range of scans between two consecutive reports by url and issue ```
| streamstats range(scan) as interval global=f window=2 by url scannedissues
``` find start dates of sequences ```
| eval sequence_start=if(interval=1,null(),time)
``` find last start date by url scannedissues ```
| eventstats last(sequence_start) as sequence_start by url scannedissues
``` find last scan date ```
| eventstats last(time) as lasttime
``` keep events from last scan ```
| where time=lasttime
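
If you also want the result laid out in the same columns as the expected output, you could append something like this (a sketch that simply reuses the field names from the search above):

``` present the remaining events in the requested column layout ```
| rename time as LatestTime, sequence_start as EarliestTime
| table url scannedissues LatestTime EarliestTime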

knanaiah001
Explorer

@ITWhisperer  @somesoni2 ,
Thank you for helping me out. With some minor changes I added to the query provided, it worked for us.
Thanks a lot!


somesoni2
Revered Legend

Try something like this

*Updated

 

Your current search with fields _time, url, scannedissues
| eval date=strftime(_time,"%F") 
| eventstats latest(date) as LatestTime
| eventstats min(_time) as EarliestTime by url scannedissues
| where date=LatestTime 
| eval EarliestTime=strftime(EarliestTime,"%F")
| table url scannedissues LatestTime EarliestTime
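
To try this against the sample data from the question (where the timestamps are strings like "1-Feb" rather than real epoch times), one option is to parse them with strptime first. A minimal sketch, reusing the run-anywhere pattern from above, with an assumed year of 2024 and only a few of the sample rows:

| makeresults 
| eval _raw="_time	url	scannedissues
1-Feb	abc.com	issue2
10-Feb	abc.com	issue2
14-Feb	abc.com	issue2
14-Feb	abc.com	issue5"
| multikv forceheader=1
``` the sample dates are day-month strings, so convert them to epoch before doing the date math ```
| eval _time=strptime(time." 2024","%d-%b %Y")
| eval date=strftime(_time,"%F") 
| eventstats latest(date) as LatestTime
| eventstats min(_time) as EarliestTime by url scannedissues
| where date=LatestTime 
| eval EarliestTime=strftime(EarliestTime,"%F")
| table url scannedissues LatestTime EarliestTime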

 

knanaiah001
Explorer

Hi @somesoni2 ,

How are we getting recentDate from the below query?

| where date=recentDate 

 
Thanks!


somesoni2
Revered Legend

I apologize. recentDate was the field name I was using earlier, but I forgot to update the reference. Please try the updated query.


somesoni2
Revered Legend

Could you please add more details on what your logic is for the final output?


knanaiah001
Explorer

Hi @somesoni2 ,

Thank you for the response :)
The goal is to find which issues are reported in the latest scan of a URL across the whole time range (for example, Feb 1 to Feb 14), which here are issue1, issue2, and issue5, and then to determine the earliest scan date for each of those three issues:

for issue1 - the earliest is Feb 1 because it has occurred in every previous scan date (Feb 10, Feb 7, Feb 5, Feb 1).
for issue2 - the earliest is Feb 10 because it was not reported on Feb 7 or Feb 5; that gap in dates means the issue is considered resolved (see the short sketch after this list).
for issue5 - the earliest is Feb 14 because the issue was reported for the first time on Feb 14 itself.
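
As a rough illustration of that gap rule (the same idea as the streamstats approach above), here is a minimal sketch using only issue2's scan dates; the scan numbers 1, 4 and 5 are assumed from numbering the five scan dates in order:

| makeresults 
| eval _raw="scan	time
1	1-Feb
4	10-Feb
5	14-Feb"
| multikv forceheader=1
| eval scan=tonumber(scan)
``` a jump of more than 1 between consecutive scan numbers means issue2 was missing from a scan (a gap) ```
| streamstats range(scan) as interval window=2
``` rows where interval is not 1 start a new sequence; the last such row (10-Feb) is the earliest date to report ```
| eval sequence_start=if(interval=1,null(),time)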

Hope it's clear now. Let me know if you need anything further.

Thanks!



