Splunk Search

What is a good way to match records against a secondary source, say a lookup file? (more than 10k rows)

loganramirez
Path Finder

Hi.

Got some great help using subsearches to match against a directory (CSV or SQL) (https://community.splunk.com/t5/Splunk-Search/What-is-the-fastest-way-to-use-a-lookup-or-match-recor...). However, in some cases it could be hundreds of thousands of records, so I'm hitting the 10k subsearch limit.
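
For context, the subsearch pattern from that thread looks roughly like this (just a sketch, using the same field and file names as my example further down):

index=myindex more-criteria-to-try-and-reduce
    [ | inputlookup directory.csv
      | where list="TEST"
      | fields number ]

The subsearch returns the matching numbers as an OR filter for the outer search, and it gets truncated once it goes past the 10k result limit.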

So the question is:

What is a good way to match records against a secondary source, say a lookup file?

I'm able to use lookup, of course, but it feels like there might be a better way.

To use a specific example, I have an index that has a phone number.

I have a CSV file that puts those phone numbers in lists (like a directory).

If I want 'all numbers in the TEST directory' and I know I'll have more than 10k rows, then I do this:

index=myindex more-criteria-to-try-and-reduce
| lookup directory.csv number OUTPUT list
| search list="TEST"

While this works, it obviously runs pretty long, so I'm just asking if there's a better way!

Thank you!

1 Solution

isoutamo
SplunkTrust

Hi

You should look at the KV Store for this amount of data. Of course lookups are an option too, but which one is better depends on your use case.
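
If you go the KV Store route, a minimal sketch looks something like this (collection, lookup, and field names are just placeholders matching your example, not anything from your environment). Define the collection and a KV Store lookup in an app's collections.conf and transforms.conf:

# collections.conf
[phone_directory]
field.number = string
field.list = string

# transforms.conf
[directory_kv]
external_type = kvstore
collection = phone_directory
fields_list = _key, number, list

Then load the existing CSV into it once, and use it like any other lookup (the lookup command itself has no subsearch row limit):

| inputlookup directory.csv | outputlookup directory_kv

index=myindex more-criteria-to-try-and-reduce
| lookup directory_kv number OUTPUT list
| search list="TEST"

If the matching itself is slow, KV Store collections also support accelerated_fields in collections.conf.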

r. Ismo
