Run lookup before streaming command (anomaly)
Hello Team,
I need to run the anomalies command on top of the results returned by a lookup.
My lookup is a geo lookup: it enriches my events with country and city based on the IP address. I then need to use country and city as input for the anomalies command, but that does not work because of streaming.
This is working fine:
index=mydata | rename public_ip as ip | lookup maxmind_secone_lookup ip OUTPUT country, region, city
For this:
index=mydata | rename public_ip as ip | lookup maxmind_secone_lookup ip OUTPUT country, region, city | anomalies threshold=0.03 by field1, field2, country
I am getting this error:
Streamed search execute failed because: Error in 'lookup' command: Script execution failed for external search command '/opt/splunk/var/run/searchpeers/splunk-1742240975/apps/myapp/bin/geoip_max_lookup.py'..
How can I fix this? I need to run anomalies based on the geo-enriched country.
Thanks,
Check the file permissions and ensure the script is executable. Run this on the Splunk server:
ls -l /opt/splunk/var/run/searchpeers/splunk-1742240975/apps/myapp/bin/geoip_max_lookup.py
If it’s not executable, make it so:
chmod +x /opt/splunk/var/run/searchpeers/splunk-1742240975/apps/myapp/bin/geoip_max_lookup.py
Also, ensure the file is owned by the user running Splunk (often splunk):
chown splunk:splunk /opt/splunk/var/run/searchpeers/splunk-1742240975/apps/myapp/bin/geoip_max_lookup.py
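If you're not sure which account the Splunk processes run under, a quick check like this should show it (assuming a typical Linux install):
ps -ef | grep '[s]plunkd'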

@kiran_panchavat Have you actually read the question? @MichalG1 explicitly wrote that the lookup does work when it's the last component in the search. It's only when its results are processed further down the search pipeline that it doesn't.
@MichalG1 Have you compared search.log from both of those cases? It is indeed a bit strange that it works one way and not the other.
Hi @PickleRick
Thanks for the help - again 🙂
I do believe it's not working because of this: "Streamed search execute failed"
The reason is that the lookup output is streamed, and it cannot be the input for the anomalies command.
In the search logs I can see:
03-18-2025 17:14:39.482 INFO SearchPhaseGenerator [1439947 searchOrchestrator] - Optimized Search =| search (userid!=null earliest=-1d index=data sourcetype=mydata) | where match(ip,"^\\d{1,3}\\.\\d{1,3}\\.\\d{1,3}\\.\\d{1,3}$") | lookup maxmind_lookup ip OUTPUT country, region, city | anomalies threshold=0.0001 by field1, field2, ip
03-18-2025 17:14:39.482 INFO ScopedTimer [1439947 searchOrchestrator] - search.optimize 0.001056652
03-18-2025 17:14:39.482 INFO FederatedInfo [1439947 searchOrchestrator] - No federated search providers defined.
03-18-2025 17:14:39.482 INFO PhaseNodeGenerationVisitor [1439947 searchOrchestrator] - FallBackReason: Fallback to 2-phase mode because of empty split key of cmd: anomalies
The search fails immediately; it's not even really executed.
I've tried with ChatGPT to change the lookup output to a non-streaming format, but failed (it's not trivial, since my lookup is external, not a CSV file).
I'm still trying to find the right query.
Thanks,
Michal

You can "destream" your pipeline by inserting the table command before your lookup (either with a strict set of files or just wildcard).
