Splunk Search

What is a good way to match records against a secondary source, say a lookup file? (more than 10k rows)

loganramirez
Path Finder

Hi.

Got some great help using a subsearch to match against a directory (CSV or SQL) (https://community.splunk.com/t5/Splunk-Search/What-is-the-fastest-way-to-use-a-lookup-or-match-recor...), however, in some cases it could be hundreds of thousands of records, so I'm hitting the 10k subsearch limit.
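For reference, the subsearch pattern from that thread looks roughly like this (illustrative, assuming the same directory.csv; it caps out because a subsearch returns at most 10,000 rows by default):

index=myindex [ | inputlookup directory.csv where list="TEST" | fields number ]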

So the question is:

What is a good way to match records against a secondary source, say a lookup file?

I'm able to use lookup, of course, but it feels like there might be a better way.

To use a specific example, I have an index that has a phone number.

I have a CSV file that puts those phone numbers in lists (like a directory).

If I want 'all numbers in the TEST directory' and I know I'll have more than 10k rows, then I do this:

index=myindex more-criteria-to-try-and-reduce
| lookup directory.csv number OUTPUT list
| search list="TEST"

While this works, it obviously runs pretty long, so I'm just asking if there's a better way!

Thank you!

Solution

isoutamo
SplunkTrust

Hi,

For this amount of data you should look at the KV store. Of course, CSV lookups are also an option, but it depends on your case which one is better.
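A minimal sketch of what a KV store-backed lookup could look like (the collection and stanza names, directory_kv and directory_kv_lookup, are illustrative, not from the original setup):

collections.conf:

[directory_kv]
field.number = string
field.list = string

transforms.conf:

[directory_kv_lookup]
external_type = kvstore
collection = directory_kv
fields_list = _key, number, list

Populate the collection once from the existing CSV:

| inputlookup directory.csv | outputlookup directory_kv_lookup

The search itself stays essentially the same, just pointed at the KV store lookup instead of the CSV file:

index=myindex more-criteria-to-try-and-reduce
| lookup directory_kv_lookup number OUTPUT list
| search list="TEST"

A large CSV is replicated to the indexers as part of the knowledge bundle, while a KV store collection lives on the search head and can use accelerated fields, which tends to help at hundreds of thousands of rows.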

r. Ismo
