Splunk Search

Attempt to workaround 10k subsearch limit -- how to combine multiple lookup files?

the_wolverine
Champion

I'm breaking up my search and outputting the results into separate files. How can I combine these files into a single file once I'm done? ... Using the Splunk UI, of course 😉

An example would be something like: | inputlookup lookupfile1.csv lookupfile2.csv lookupfile3.csv | dedup fieldname

1 Solution

ziegfried
Influencer

The append flag of the inputlookup command to the rescue (inputcsv supports it as well)

| inputlookup lookupfile1.csv append=1 | inputlookup lookupfile2.csv append=1 | inputlookup lookupfile3.csv append=1 | dedup fieldname
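
If the end goal is one consolidated lookup file rather than just combined results on screen, the deduplicated output can also be written back out with outputlookup. A minimal sketch, assuming a placeholder target name (your_combined_lookup.csv is just an example, use whatever name fits):

| inputlookup lookupfile1.csv append=1 | inputlookup lookupfile2.csv append=1 | inputlookup lookupfile3.csv append=1 | dedup fieldname | outputlookup your_combined_lookup.csv

Keep in mind that outputlookup replaces the contents of the target file by default, so point it at a new file rather than one of the source lookups.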


Ayn
Legend

It would be interesting to hear more about why the 10000 limit is there in the first place. I've run into it loads of times by now (for instance when using inputlookup to load blacklists of IPs that should be matched), and it's of course frustrating to be limited like this. The question is whether there's a good reason for the limit to exist at this specific value, for instance that beyond 10000 search terms any additional terms no longer improve performance anyway?
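
For what it's worth, the cap on subsearch results appears to be configurable in limits.conf rather than hard-wired. A rough sketch of the relevant stanza (setting names and defaults can differ between Splunk versions, so check the limits.conf spec for your release first, and note that raising it can slow searches down):

[subsearch]
# maximum number of results a subsearch may return (default has been 10000 on versions I've seen)
maxout = 50000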
