
How to fix an inputlookup error?

abazgwa21cz
Explorer

I have an issue with a lookup. I created a lookup table:

(screenshot of the lookup table: abazgwa21cz_0-1675150049307.png)

I want to exclude the paths in the lookup table from my search, so I tried this query:

index="kaspersky" AND etdn="Object not disinfected" p2 NOT ([ inputlookup FP_malware.csv]) | eval time=strftime(_time,"%Y-%m-%d %H:%M:%S")|stats count by time hip hdn etdn p2 | dedup p2

(screenshot: abazgwa21cz_1-1675150128071.png)

It doesn't seem to work. How can I fix this?
Many thanks!

 


gcusello
SplunkTrust

Hi @abazgwa21cz,

Subsearches require that you explicitly define the fields to use as keys, and they must have the same names as the fields in the main search.

In other words, if lookup_path is the path field in the lookup and path is the field in the search:

index="kaspersky" AND etdn="Object not disinfected" p2 NOT [ | inputlookup FP_malware.csv | rename lookup_path AS path | fields path ] 
| eval time=strftime(_time,"%Y-%m-%d %H:%M:%S")
| stats count BY time hip hdn etdn p2 
| dedup p2

Also, the pipe before the inputlookup command was missing in your search.

Finally, in the stats command, why did you use so many fields in the BY clause and then dedup? Why not use only p2 directly in the BY clause?
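
For example, a minimal sketch of that simplification, assuming the lookup column is called lookup_path and the event field is p2 (values() could be added to the stats if the other fields are still needed):

index="kaspersky" etdn="Object not disinfected" NOT [ | inputlookup FP_Malware.csv | rename lookup_path AS p2 | fields p2 ]
| stats count BY p2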

Ciao.

Giuseppe


bowesmana
SplunkTrust

Your basic problem is that your lookup is

FP_Malware.csv 

and your lookup in the search is

FP_malware.csv 

upper/lower case: lookup file names are case sensitive.

However, you do not need 

p2 NOT ...

Just use 

NOT [ | inputlookup ... ]

The response coming back from the subsearch will be p2=x OR p2=y OR p2=z

You can see the format of the subsearch response by doing

| inputlookup FP_Malware.csv | format
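
Putting the two fixes together, the corrected search could look like this (a sketch; it assumes the lookup column is named p2, matching the field in your events):

index="kaspersky" AND etdn="Object not disinfected" NOT [ | inputlookup FP_Malware.csv | fields p2 ]
| eval time=strftime(_time,"%Y-%m-%d %H:%M:%S")
| stats count by time hip hdn etdn p2
| dedup p2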

 


abazgwa21cz
Explorer

My mistake. Thanks a lot, it works now!


gcusello
SplunkTrust

Hi @abazgwa21cz ,

Good for you, see you next time!

Ciao and happy splunking

Giuseppe

P.S.: Karma Points are appreciated by all the contributors 😉


abazgwa21cz
Explorer

Thanks a lot. I have one more question.

I just installed the misp42 app in my Splunk and added a MISP instance to Splunk; it works.

(screenshot: abazgwa21cz_0-1698050821170.png)

 

 

But I want to compare against my firewall logs (index=firewall srcip=10.x.x.x): I want to compare dstip with ip-dst from MISP to detect unusual access activity, for example when dstip=ip-dst (152.67.251.30). How can I search this with misp_instance=IP_Block field=value? I tried this search, but it doesn't work:

index=firewall srcip=10.x.x.x 
| mispsearch misp_instance=IP_Block field=value
| search dstip=ip=dst
| table _time dstip ip-dst value action



It can't get ip-dst from the MISP instance.

Can you help me with this, or point me to a solution?

Many thanks and best regards!
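
One likely issue in the attempted search above is the line | search dstip=ip=dst: besides the typo, the search command compares a field against a literal value, so matching two fields against each other needs where (or eval), and a field name containing a dash has to be wrapped in single quotes there. A minimal sketch, keeping the mispsearch invocation exactly as in the attempt above and assuming it really adds an ip-dst field to each matching event:

index=firewall srcip=10.x.x.x
| mispsearch misp_instance=IP_Block field=value
| where dstip == 'ip-dst'
| table _time dstip ip-dst value action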


bowesmana
SplunkTrust

@abazgwa21cz For a new question, please ask it in a new topic, so that any answers relate to the new question.

 


abazgwa21cz
Explorer

I did, but I received no solution. Can you help me, please?
https://community.splunk.com/t5/Splunk-Search/Error-Search/m-p/665820#M228449
