Splunk Search

Search if a field is NOT in the results of a subsearch

OldManEd
Builder

This question is a follow-up to one I submitted previously, "Search if a field is in the results of a subsearch". My search now looks like this:

 index=my_index field1=abc field2=def 
  field3!=aaa
  field3!=bbb
  field3!=ccc
  field3!=ddd
  field3!=eee
  field3!=fff

Is there a way to rewrite the search using an inputlookup table? I've tried using NOT in front of the inputlookup search, but that didn't work:

index=my_index field1=abc field2=def  [ search  NOT | inputlookup <filename.csv> | fields <my_field3> | return 100 field3]

I also tried the Splunk IN() function, but that didn't work either. It appears that the data returned from an inputlookup comes back OR'ed together, where I need it AND'ed, and I can't figure out how to do that.

Any help will be appreciated.

0 Karma
1 Solution

nryabykh
Path Finder

Just place NOT before the subsearch:

index=my_index field1=abc field2=def NOT [ |inputlookup filename.csv | fields field3 ]
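For what it's worth, here is my understanding of what Splunk expands this into (using the example values from the question; the exact formatting of the expanded search may vary by version):

```
index=my_index field1=abc field2=def NOT ( ( field3="aaa" ) OR ( field3="bbb" ) OR ( field3="ccc" ) OR ( field3="ddd" ) OR ( field3="eee" ) OR ( field3="fff" ) )
```

The subsearch results do come back OR'ed together, but the leading NOT negates the whole group, which by De Morgan's law is equivalent to the AND of the individual field3!= filters from the original search.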

OldManEd
Builder

nryabykh,
This doesn't seem to work for me. I altered the search as follows:

  index=my_index field1=abc field2=def  field3=*

   field3!=aaa
   field3!=bbb
   field3!=ccc
   field3!=ddd
   field3!=eee
   field3!=fff

And then ran the search. The "Selected Fields" panel showed that the specific field3 values in the search were filtered out. I then ran this search:

    index=my_index field1=abc field2=def  field3=*
   NOT  [ |inputlookup filename.csv | fields field3 ]

And the "Selected Fields" entry showed that nothing was filtered out.

~Ed

0 Karma

nryabykh
Path Finder

Mmm, do you have a lookup filename.csv with a field named field3? My subsearch is only an example; you should use your own filename and field name.

0 Karma

OldManEd
Builder

Understood. But it appears that the "NOT" section of the search is only keying on the first entry in the lookup table. As an example, I altered the search to look like this:

  index=my_index field1=abc field2=def 
   field3!=aaa

The results did not contain any events where field3 equaled "aaa". Then I altered it again with the search below:

     index=my_index field1=abc field2=def 
    NOT  [ |inputlookup filename.csv | fields field3 ]

I got the exact same results from both - no events with "aaa", but I could still see events with "bbb", "ccc", etc.

The "field3" column in the "filename.csv" file looks like this:

    field3
    aaa
    bbb
    ccc
    ddd
    eee
    fff

Like I said, it appears that the inputlookup sees the first entry and then stops.

Any ideas?

0 Karma

nryabykh
Path Finder

It seems the subsearch should be wrapped in parentheses. I thought Splunk did this automatically, but I was likely mistaken.
index=my_index field1=abc field2=def NOT ([ |inputlookup filename.csv | fields field3 ])

0 Karma

OldManEd
Builder

nryabykh,
Nope. I added the parentheses and it still didn't work. I got the same number of results, including some of the data that should have been filtered out.

~Ed

0 Karma

nryabykh
Path Finder

I've run out of ideas, sorry 😞
I tried to implement this myself, and it seems to work like a charm (link to screenshot: https://photos.app.goo.gl/Oo4WSGJLrbAAyHXk2 ).
I don't know; maybe it depends on the version of Splunk (I use 7.0.2).

0 Karma

OldManEd
Builder

nryabykh,

OMG! I found my problem. ~Operator Error~ After banging my head against a wall on this issue, I took a good look at it. In the CSV file I had this:

field3,
"""aaa""",
"""bbb""",
"""ccc""",
"""ddd""",
"""eee""",
"""fff""",

When I loaded the lookup, the actual data the search was trying to match against had double quotes in the string (it was trying to match "aaa", not just aaa). I removed all the quotes in the CSV file and everything worked.
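For anyone who hits the same thing, the behavior can be reproduced with a few lines of Python using a hypothetical in-memory copy of the broken file (Splunk's lookup loader appears to follow standard CSV quoting rules here):

```python
import csv
import io

# Hypothetical in-memory copy of the broken lookup file. In CSV, a
# doubled quote ("") inside a quoted field is an escaped literal quote,
# so """aaa""" parses to the five-character value "aaa" - quotes included.
broken = 'field3,\n"""aaa""",\n"""bbb""",\n'
rows = list(csv.reader(io.StringIO(broken)))
print(rows[1][0])           # prints "aaa" - with the literal quotes
print(rows[1][0] == 'aaa')  # False: the quotes are part of the value

# The fixed file, with all the extra quotes removed, yields the bare value.
fixed = 'field3\naaa\nbbb\n'
rows_fixed = list(csv.reader(io.StringIO(fixed)))
print(rows_fixed[1][0] == 'aaa')  # True
```

So the search was comparing field3 against the literal string "aaa" (quotes and all), which never matched any events.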

Sorry I dragged you into this one, but thanks a ton for your help. I appreciate it.

~Ed

0 Karma

nryabykh
Path Finder

All right, I'm glad we cleared this up 🙂

0 Karma