Splunk Dev

Search Query Optimization

kunalsingh
Engager

Please help me optimize this Splunk query

index::rasp_

NOT [inputlookup Scanners_Ext.csv | fields forwarded_for]

NOT [inputlookup Scanners_Int.csv | rename ip_addr AS forwarded_for | fields forwarded_for]

NOT [inputlookup vz_nets.csv | rename netblock AS forwarded_for | fields forwarded_for]

NOT (forwarded_for="140.108.26.152" OR forwarded_for="" OR forwarded_for="10.*" OR forwarded_for=null) app!="" app!="\"*\"" app!="VASTID*" host!="10.215*" host!="ip-10-*" host!="carogngsa*" host!="carogngta*" host!="carofuzedd*" host!="*ebiz*" host!="echo*" host!="not logged" host!="onm*" host!="tfnm*" host!="voip*" host!="wfm*" category!="Config*" category!="Depend*" category!="Stat*" category!="Large*" category!="Uncaught*" category!="Unvalidated Redirect" category!="License" category!="*Parse*" action=*

| stats count

bowesmana
SplunkTrust

Optimising this will depend on your data. Using subsearches with lookups can be expensive and using NOT with subsearches, even more so.

Depending on the volume of entries in those lookups, you may be better off using the lookup command instead, e.g.

index::rasp_ NOT (
    forwarded_for="140.108.26.152" OR 
    forwarded_for="" OR 
    forwarded_for="10.*" OR 
    forwarded_for=null)
    app!="" app!="\"*\"" app!="VASTID*" 
    host!="10.215*" host!="ip-10-*" host!="carogngsa*" host!="carogngta*" host!="carofuzedd** host!="*ebiz*" host!="echo*" host!="not logged" host!="onm*" host!="tfnm*" host!="voip*" host!="wfm*"
    category!="Config*" category!="Depend*" category!="Stat*" category!="Large*" category!="Uncaught*" category!="Unvalidated Redirect" category!="License" category!="*Parse*" action=*

| lookup Scanners_Ext.csv forwarded_for OUTPUT forwarded_for as found
| where isnull(found)
| lookup Scanners_Int.csv ip_addr as forwarded_for OUTPUT ip_addr as found 
| where isnull(found)
| lookup vz_nets.csv netblock as forwarded_for OUTPUT netblock as found
| where isnull(found)

| stats count

So the static NOT statement and the other != comparisons stay part of the base search; each lookup is then applied in turn, and any event that matches a lookup is discarded by the where isnull(found) test.

Order the three lookups by likely match count: run the lookup expected to discard the most events first, and so on, so that each subsequent lookup processes fewer events.
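For example, if (purely as an illustration) Scanners_Int.csv were the list that matches the most events, you would move that lookup to the front:

| lookup Scanners_Int.csv ip_addr as forwarded_for OUTPUT ip_addr as found
| where isnull(found) ```largest reducer first, so the later lookups see fewer events```
| lookup Scanners_Ext.csv forwarded_for OUTPUT forwarded_for as found
| where isnull(found)
| lookup vz_nets.csv netblock as forwarded_for OUTPUT netblock as found
| where isnull(found)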

Using NOT and all of those != wildcard comparisons at the start of the search will still be somewhat expensive. You can use TERM() to reduce the amount of data scanned, but that requires knowing your data well.
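As a rough sketch only, and assuming 140.108.26.152 appears in the raw events as a discrete term bounded by major breakers (e.g. spaces or commas) and only ever in forwarded_for, that one static exclusion could be pushed down to an indexed-term filter:

index::rasp_ NOT TERM(140.108.26.152) action=*
    ```TERM() is evaluated against indexed terms, so matching events are excluded before field extraction```
| stats count

Check that the event count matches the forwarded_for!="140.108.26.152" version before relying on this, because TERM() matches the term anywhere in the raw event, not just in that field.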

 
