Splunk Search

How to select yesterday's date from the following search?

damucka
Builder

Hello,

I know it is a simple question but I am somehow struggling with it.
I have the following search:

index=mlbso sourcetype=ISP_abaptraces ( mtx OR mmx OR mm_diagmode OR sigigenaction OR thierrhandle OR mutex OR "ca blocks" ) AND (WARNING OR ERROR) 
earliest=-90d@d latest=now  
| rename comment AS "1.) ------------------------- SPL Search string based on the expert domain knowledge --------------------------"   

| rex field=_raw "^(?<firstLine>.*)\n.(?<remainingLines>[\s\S]*)$"
| eval text=replace(remainingLines,"\d{0}\d+","")
| rename comment AS "2.) ------------------------- Pre-parsing: get rid of first line with date / time and the digits ---------------"

| anomalies field=text
| rename comment AS "3.) ------------------------- Anomalies -----------------------------------------------------------------------"

| table _time _raw text unexpectedness
| sort 3 -unexpectedness
| cluster field=text

The idea is to find unexpected logs from the last 90 days.
Now I would like to return only those that have _time within the last 24 hours.
The important point is that I need to analyze the full 90 days first (as input to the "anomalies" command), but then filter the final output, somehow smartly (the "where" command?), to only the events from the last day.

How would I do it?

Kind Regards,
Kamil


chrisyounger
SplunkTrust

You can use the relative_time function in eval like so:

 index=mlbso sourcetype=ISP_abaptraces ( mtx OR mmx OR mm_diagmode OR sigigenaction OR thierrhandle OR mutex OR "ca blocks" ) AND (WARNING OR ERROR) 
 earliest=-90d@d latest=now  
 | rename comment AS "1.) ------------------------- SPL Search string based on the expert domain knowledge --------------------------"   

 | rex field=_raw "^(?<firstLine>.*)\n.(?<remainingLines>[\s\S]*)$"
 | eval text=replace(remainingLines,"\d{0}\d+","")
 | rename comment AS "2.) ------------------------- Pre-parsing: get rid of first line with date / time and the digits ---------------"

 | anomalies field=text
 | rename comment AS "3.) ------------------------- Anomalies -----------------------------------------------------------------------"

 | table _time _raw text unexpectedness
 | sort 3 -unexpectedness
 | cluster field=text
 | eval time_ago_24 = relative_time(now(), "-24h")
 | where _time > time_ago_24
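
Note that relative_time(now(), "-24h") gives a rolling 24-hour window. If you literally want yesterday's calendar day (as the title suggests), a variation would be to snap to day boundaries with the @d snap-to modifier. This is just a sketch of the same idea, not tested against your data:

 | eval day_start = relative_time(now(), "-1d@d")
 | eval day_end   = relative_time(now(), "@d")
 | where _time >= day_start AND _time < day_end

Either way, the anomalies command still sees the full 90 days, because the filtering happens after it in the pipeline.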


