Deployment Architecture

creating a lookup with search-head pooling

a212830
Champion

Hi,

We just implemented search-head pooling. I have a lookup file that gets generated via wget. How do people handle situations like this? I don't want to tie the script to a specific server, but I also want to make sure it still runs when one of the servers is down. Is there a way to have Splunk run it without generating indexed data?


ewoo
Splunk Employee

One option: create a Python search command that updates your lookup, then schedule a search that invokes that command on the desired interval (e.g. every night at midnight). The search heads in your pool will coordinate so that only one instance in the pool runs the scheduled search per interval.
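A minimal sketch of what such a command could look like, assuming an old-style Intersplunk script; the app name, lookup name, and URL below are placeholders for your own, and the script replaces the external wget step:

# bin/updatelookup.py -- hypothetical custom search command that refreshes a lookup file.
import os
import urllib2  # Python 2, as bundled with older Splunk versions

import splunk.Intersplunk as si

LOOKUP_URL = "http://example.com/mylookup.csv"   # placeholder for your wget source
LOOKUP_PATH = os.path.join(
    os.environ["SPLUNK_HOME"], "etc", "apps", "myapp", "lookups", "mylookup.csv"
)

try:
    # Fetch the remote CSV.
    response = urllib2.urlopen(LOOKUP_URL, timeout=60)
    data = response.read()

    # Write atomically so a partial download never replaces a good lookup file.
    tmp_path = LOOKUP_PATH + ".tmp"
    with open(tmp_path, "wb") as f:
        f.write(data)
    os.rename(tmp_path, LOOKUP_PATH)

    # Return a single status row; nothing here gets indexed.
    si.outputResults([{"lookup": "mylookup.csv", "status": "updated"}])
except Exception as e:
    si.generateErrorResults("Failed to update lookup: %s" % e)

You would register the script in your app's commands.conf and schedule a search that simply invokes it (for example, "| updatelookup"). Since search-head pooling keeps apps and lookups on shared storage, the refreshed file is visible to every pool member, and because the command only emits a status row, the scheduled search produces no indexed data.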
