Splunk Enterprise

The lookup command with WILDCARD of a large file doesn't work

tommyJ
Observer

When I tested the lookup command with "WILDCARD" match_type, it stops working once the lookup file grows large.

Does anyone know of any related settings? I'm using Splunk version 8.2.2.1.

The situation is shown below.

  • transforms.conf

[lookup_test]
batch_index_query = 1
case_sensitive_match = 0
filename = lookup_test.csv
match_type = WILDCARD(field)
max_matches = 1
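For context, my understanding of match_type = WILDCARD(field) is that the values in the lookup's "field" column are treated as wildcard patterns and matched against the incoming value. A rough Python sketch of that semantics (fnmatch is only an analogy, not Splunk's implementation; note fnmatchcase is case-sensitive, unlike case_sensitive_match = 0 above):

```python
from fnmatch import fnmatchcase

# Patterns as they would appear in the lookup's "field" column
# (assumption: the same value used in lookup_test.csv below).
patterns = ["*.example.com"]

def wildcard_lookup(value, patterns):
    """Return the first matching pattern, mimicking max_matches = 1."""
    for pattern in patterns:
        if fnmatchcase(value, pattern):
            return pattern
    return None

print(wildcard_lookup("aa.example.com", patterns))  # -> *.example.com
print(wildcard_lookup("example.org", patterns))     # -> None
```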

  • lookup_test.csv

field
"*.example.com"
"*.example.com"
.
.
.

(I repeated the same value just for testing.)
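For anyone trying to reproduce this, I generated lookup_test.csv with a script along these lines (a sketch; n is what I varied between 620,000 and 630,000):

```python
import csv
import os

def write_lookup(path, n):
    """Write a one-column lookup CSV: a 'field' header plus n pattern rows."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f, quoting=csv.QUOTE_ALL, lineterminator="\n")
        writer.writerow(["field"])
        for _ in range(n):
            writer.writerow(["*.example.com"])
    return os.path.getsize(path)

size = write_lookup("lookup_test.csv", 620_000)
print(f"{size} bytes")
```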

  • search query

I just want to match the domain using "WILDCARD":

| makeresults annotate=true
| eval _raw="domain
  aa.example.com"
| multikv forceheader=1
| table domain
| lookup lookup_test field as domain OUTPUTNEW field as field_result

So the expected result is below:

domain          field_result
aa.example.com  *.example.com

If "lookup_test.csv" has 620,000 lines (about 9.5 MB), the WILDCARD match works fine.

But if "lookup_test.csv" has 630,000 lines (about 9.7 MB), the WILDCARD match doesn't work: the "field_result" value becomes blank, and only EXACT matches are returned.

domain          field_result
aa.example.com
*.example.com   *.example.com

I also tried other values (e.g. "*.aexample.com"). Whenever the lookup file grows beyond about 9.5 MB, the WILDCARD match stops working and only EXACT matches are returned.

So I think this is related to the lookup file size, but I couldn't find any documentation about it.
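One hunch: the threshold is suspiciously close to max_memtable_bytes in limits.conf, which (if I read the spec file correctly) defaults to 10000000 bytes and caps the lookup file size that Splunk keeps in an in-memory table; larger files appear to fall back to an index-based lookup that only supports exact matching. A sketch of the setting I would test (this is an assumption on my part, and raising it increases memory use):

```ini
# limits.conf (sketch; verify against your version's limits.conf.spec)
[lookup]
# Default is 10000000 (~10 MB). Lookup files larger than this are not
# held in an in-memory table, which seems to disable WILDCARD matching.
max_memtable_bytes = 20000000
```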
