Splunk Search

Subsearches comparing datasets

adamsmith47
Communicator

Hello all,

I have a search technique I've been using to compare smaller sets of data and find the differences; however, I'm running into the subsearch limit with a new set of data. I'm hoping someone has a good idea for a different way to perform the search that doesn't run into subsearch limits. Here's the situation:

Each night a system dumps a *.csv log into a directory that Splunk is monitoring and indexing. The csv is approximately 50k lines, therefore approximately 50k events indexed by Splunk. I'm being asked to report each morning on events that exist in today's dump but did not exist in the previous day's dump. I've gone to my typical routine below in an attempt to accomplish this, but I'm hitting that 10k subsearch limit. I'm assuming I could up the limit, but I'd rather have a more efficient search, if possible.

| set union
[search index=<index> sourcetype=<sourcetype> earliest=@d-1d latest=@d | eval daysago=1 | stats count by <field1> <field2> <field3> daysago | fields - count]
[search index=<index> sourcetype=<sourcetype> earliest=@d latest=@d+1d | eval daysago=0 | stats count by <field1> <field2> <field3> daysago | fields - count]
| stats max(daysago) as daysago by <field1> <field2> <field3> | where daysago=0
| eval Details="Has been added in the past day."
| table Details <field1> <field2> <field3>

I know the logic is sound (I use it for other things), but here the subsearches are just too big.

Any advice is welcome! Thank you.
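
For reference, the 10k cap being hit here is the subsearch result limit, which lives in the [subsearch] stanza of limits.conf. The snippet below shows the usual default values purely to illustrate where the setting is; as the answers below note, raising it is not really the way out:

# limits.conf (illustrative defaults only, not a recommendation to raise them)
[subsearch]
# maximum number of results a subsearch returns to the outer search
maxout = 10000
# maximum runtime of a subsearch, in seconds
maxtime = 60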


DalJeanis
SplunkTrust
SplunkTrust

How about this -

 index=<index> sourcetype=<sourcetype> earliest=-1d@d
| bin _time span=1d
| stats min(_time) as mintime max(_time) as maxtime by <field1> <field2> <field3>
| eventstats min(mintime) as yesterdayepoch max(maxtime) as todayepoch
| where mintime=maxtime
| eval myflag=case(mintime==todayepoch,"Added Record",maxtime==yesterdayepoch,"Deleted Record", true(),"Nonesuch Record")  
| eval _time = mintime 
| table _time <field1> <field2> <field3> myflag

Updated the case() tests to use == rather than =.
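
A run-anywhere sketch of the same flag logic (my own illustration, not from the post above): makeresults fakes three mintime/maxtime combinations so the where filter and the case() flags can be tried without touching the real index.

| makeresults count=3
| streamstats count as n
| eval yesterdayepoch=relative_time(now(),"-1d@d"), todayepoch=relative_time(now(),"@d")
| eval mintime=case(n==1,todayepoch, n==2,yesterdayepoch, n==3,yesterdayepoch)
| eval maxtime=case(n==1,todayepoch, n==2,yesterdayepoch, n==3,todayepoch)
| where mintime=maxtime
| eval myflag=case(mintime==todayepoch,"Added Record", maxtime==yesterdayepoch,"Deleted Record", true(),"Nonesuch Record")
| table n mintime maxtime myflag

Row 1 (seen only today) comes out as "Added Record", row 2 (seen only yesterday) as "Deleted Record", and row 3 (seen both days) is dropped by the where clause.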


lguinn2
Legend

You can only raise the limit to 10,499, so that isn't going to help. The following technique has no such limit and will run much faster:

search index=<index> sourcetype=<sourcetype> earliest=-1d@d
| eval daysago=if(_time>reltime(now,"@d"),daysago=0,daysago=1)
| stats count by <field1> <field2> <field3> daysago 
| fields - count
| stats max(daysago) as daysago by <field1> <field2> <field3> 
| where daysago=0
| eval Details="Has been added in the past day."
| table Details <field1> <field2> <field3>

This technique searches the data set only once, then categorizes the results before comparing them.
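
To see the categorize-then-compare idea in isolation, here is a small makeresults sketch (mine, not lguinn2's, and written with the corrected relative_time() call that the follow-up comments settle on). A single placeholder field named key stands in for <field1> <field2> <field3>:

| makeresults count=4
| streamstats count as n
| eval _time=if(n<=2, relative_time(now(),"-1d@d")+3600, relative_time(now(),"@d")+3600)
| eval key=case(n==1,"only_yesterday", n==2 OR n==3,"both", n==4,"only_today")
| eval daysago=if(_time>=relative_time(now(),"@d"),0,1)
| stats max(daysago) as daysago by key
| where daysago=0

Only the key that first appears today ("only_today") survives; anything seen yesterday, whether or not it also shows up today, has max(daysago)=1 and is filtered out.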

somesoni2
SplunkTrust
SplunkTrust

I believe we can eliminate the first stats (| stats count ...) altogether. Also, since the earliest time for daysago=0, i.e. @d, is inclusive of events at exactly @d, the comparison _time>relative_time(now(),"@d") should use >= rather than > (and note there is a typo in the search above: it should be relative_time, not reltime).

adamsmith47
Communicator

Thank you lguinn2 and somesoni2, it's working well!

The form I've ultimately gone with is:

index=<index> sourcetype=<sourcetype> earliest=-1d@d
| eval daysago=if(_time>=relative_time(now(),"@d"),0,1)
| stats max(daysago) as daysago by <field1> <field2> <field3>
| where daysago=0
| eval Details="Has been added in the past day."
| table Details <field1> <field2> <field3>
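
Purely as a sketch on top of this (not something posted in the thread): the same single-pass form can also surface records that disappeared, along the lines of DalJeanis's Added/Deleted flags, by keeping both the min and max of daysago:

index=<index> sourcetype=<sourcetype> earliest=-1d@d
| eval daysago=if(_time>=relative_time(now(),"@d"),0,1)
| stats min(daysago) as mindaysago max(daysago) as maxdaysago by <field1> <field2> <field3>
| eval Details=case(maxdaysago==0,"Has been added in the past day.", mindaysago==1,"Has been removed in the past day.", true(),"Present both days")
| where Details!="Present both days"
| table Details <field1> <field2> <field3>

Combinations present in both dumps fall into the last case and are filtered out.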