In a distributed search environment, do the python files for scripted lookups have to be placed manually on the indexers?
Or is it ok to just place the files in the search head, and allow it to replicate the scripts onto the indexers?
The lookup scripts will be included in the bundles that are replicated to the search peers, so you won't have to copy those manually.
Just keep track of where the lookup actually runs: the search head is ultimately responsible for bringing the search results together at some point in the pipeline, so if you invoke the lookup after that point it will be the search head running it, not the search peers.
I am getting a "Read Timeout" error for a scripted lookup, like below:
Problem replicating config (bundle) to search peer ' <indexerip>:8089 ', Upload bundle="/opt/splunk/var/run/7CAA9ED1-1C47-433B-83A2-5FBC5B25691E-1593006032.bundle" to peer name=c1idx3taf.cdsys.local uri=https://<indexerip>:8089 failed; error="Read Timeout".
Am I doing something wrong?
Thanks, placing the script on the search head worked perfectly.