Splunk Search

How do I run a subsearch within a specified time limit?

santhosh2kece
Engager

Hi,

I am running the search below and getting the error "[subsearch]: Subsearches of a real-time search run over all-time unless explicit time bounds are specified within the subsearch":

index="webproxylogs" [| inputlookup Blacklist_URLs.csv | rename Malicious_URL AS cs_host | dedup cs_host | fields cs_host] NOT [| inputlookup Whitelist_URLs.csv | rename "Non-Malicious_URL" AS cs_host | dedup cs_host | fields cs_host]

In my search, I am trying to get the list of internal hosts accessing the domains listed in Blacklist_URLs.csv, while excluding the whitelisted domains (like google, yahoo, etc.) listed in Whitelist_URLs.csv. That way, if a well-known domain like google.com is accidentally added to Blacklist_URLs.csv, it still gets excluded via Whitelist_URLs.csv.

The search above used to return results, but in the recent past (the last 45 days) it has been giving me the subsearch time-limit error mentioned above. Please help me rectify this issue. Thanks


MuS
SplunkTrust

Hi santhosh2kece,

That's only one of the limitations you can run into when using subsearches.
I would simply set up an automatic lookup http://docs.splunk.com/Documentation/Splunk/6.2.0/Knowledge/Usefieldlookupstoaddinformationtoyoureve... that sets a new field called Malicious_URL to yes for the blacklisted URLs and to no for the whitelisted ones.

This way you can search for this quite easily:

index="webproxylogs" Malicious_URL="yes"

and all is good: no more problems related to any subsearch limits.
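For reference, a rough sketch of what such an automatic lookup could look like. This assumes the CSV has a key column (here called url) holding the domain and a flag column Malicious_URL set to yes or no; the stanza name is illustrative, and the sourcetype is taken from your error message, so adjust both to your environment:

```
# transforms.conf -- define the CSV lookup table (stanza name is a placeholder)
[malicious_url_lookup]
filename = Blacklist_URLs.csv

# props.conf -- apply the lookup automatically at search time
# (bcoat_proxysg taken from your error message; replace with your sourcetype)
[bcoat_proxysg]
LOOKUP-malicious = malicious_url_lookup url AS cs_host OUTPUTNEW Malicious_URL
```

The key point is that the lookup table needs both a column to match against (url AS cs_host) and at least one column to output (Malicious_URL); a CSV containing only the URLs themselves has nothing to OUTPUT.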

hope this helps ...

cheers, MuS


santhosh2kece
Engager

MuS,

I added the blacklisted URLs CSV to Automatic lookups as "Malicious_URL AS 1 Non_Malicious_Url AS 0 OUTPUTNEW". However, when I run the search query

index="webproxylogs" Malicious_URL="yes"
I receive the following error:

Error 'Could not find all of the specified lookup fields in the lookup table.' for conf 'source::tcp:9998|host::nyc-proxy-2.bfm.com|bcoat_proxysg' and lookup table 'FSISAC_Malicious'.

Also, in my original query I used two CSV files: 1. Blacklist_URLs.csv and 2. Whitelist_URLs.csv. Please let me know whether Whitelist_URLs.csv should be ignored.


MuS
SplunkTrust

Regarding the error: you provided either too few or too many fields for the lookup.
Since I don't know your exact use case, I cannot answer that for you. Check whether using the blacklist alone is sufficient for you; if not, set up a second automatic lookup using the whitelist as well.
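To narrow down which field is the problem, you could also run the lookup explicitly in a search instead of relying on the automatic one. The field names below are assumptions pieced together from your earlier posts (Malicious_URL as the key column in the table, cs_host as the event field), so substitute the actual column names from FSISAC_Malicious:

```
index="webproxylogs"
| lookup FSISAC_Malicious Malicious_URL AS cs_host OUTPUTNEW Malicious_URL AS is_blacklisted
| where isnotnull(is_blacklisted)
```

If the explicit lookup throws the same error, the field names in the lookup definition don't match the columns that actually exist in the table.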
