Getting Data In

Trigger a script on a Forwarder from an Indexer on another host?

jhedgpeth
Path Finder

I've got a single v4 Splunk Indexer/Search head. Feeding it are multiple Forwarders that have local indexing disabled and get all of their inputs/configuration deployed to them in apps. I'd like a scheduled search on the Indexer to be able to trigger a script I've deployed on the Forwarders.

My limitations are that I don't want the script to run and send data continuously (which would be easy) because 99% of the time it's not relevant, and we're trying to isolate a problem that occurs sporadically. Also, our environment doesn't allow password-less (automated) logins from host-to-host, so the simple ssh solution isn't available.

There doesn't appear to be anything about this in help/answers, but I may be searching for the wrong thing. Can Splunk do this? Am I going about this the wrong way? Thanks.

1 Solution

jrodman
Splunk Employee

Hm, Splunk is not generally designed to have such a channel.

You could perhaps abuse the distributed search functionality: have a distributed search trigger a 'command' implemented by a script, which in turn updates a log file that the nodes monitor. Or something like that. It's almost certainly more work than you'd want, and quite Rube Goldberg.

Personally, I'd have the input scripts hit a URL and, if they get a successful response of, say, an integer, and that integer has been updated, produce the output. Then you can fairly easily update such a number on an HTTP server of your choosing, or by modifying a static asset in a splunkweb directory.

I.e., personally I'd make the trigger mechanism external to Splunk.
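
To make that concrete, here is a minimal sketch of such a scripted input, assuming Python 3.7+ is available on the forwarder. The trigger URL, state-file path, and diagnostic command below are hypothetical placeholders, not anything Splunk ships; the only idea being shown is "poll a number, act only when it changes."

#!/usr/bin/env python3
# Sketch of a scripted input that only produces output when a trigger counter
# published at a URL changes. URL, state file, and command are placeholders.
import os
import subprocess
import sys
import urllib.request

TRIGGER_URL = "http://indexer.example.com:8000/static/trigger.txt"  # hypothetical
STATE_FILE = os.path.join(os.environ.get("SPLUNK_HOME", "/opt/splunk"),
                          "var", "run", "trigger_state.txt")
DIAGNOSTIC_CMD = ["/opt/scripts/collect_diagnostics.sh"]  # hypothetical

def read_last_value():
    try:
        with open(STATE_FILE) as f:
            return f.read().strip()
    except OSError:
        return None

def write_last_value(value):
    os.makedirs(os.path.dirname(STATE_FILE), exist_ok=True)
    with open(STATE_FILE, "w") as f:
        f.write(value)

def main():
    try:
        current = urllib.request.urlopen(TRIGGER_URL, timeout=10).read().decode().strip()
    except Exception:
        return  # trigger not reachable; stay quiet

    if not current.isdigit():
        return  # only act on a well-formed integer

    if current != read_last_value():
        write_last_value(current)
        # Run the diagnostic script and let Splunk index whatever it prints.
        result = subprocess.run(DIAGNOSTIC_CMD, capture_output=True, text=True)
        sys.stdout.write(result.stdout)

if __name__ == "__main__":
    main()

You would deploy this like any other scripted input in the forwarder's app, i.e. a [script://...] stanza in inputs.conf with a short interval, so it polls cheaply and only emits data when the counter moves. The scheduled search on the indexer (or anything else) just has to bump the integer the URL serves.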


jhedgpeth
Path Finder

Thanks, that's a good idea given my restrictions. I have a common NAS share that should serve as a means of communication between the two pieces, and I'm already thinking of other uses for "conditional scripts".
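
Since a NAS share is mentioned, the same pattern works with a flag file instead of a URL. A hedged sketch with hypothetical paths: the indexer-side alert script simply touches a file on the share, and the forwarder-side scripted input reacts when the file's modification time changes.

#!/usr/bin/env python3
# Shared-NAS variant: react when the flag file's timestamp moves forward.
# All paths and the diagnostic command are placeholders.
import os
import subprocess
import sys

FLAG_FILE = "/mnt/shared/splunk_trigger.flag"        # touched by the indexer side
STATE_FILE = "/opt/splunk/var/run/flag_mtime.txt"    # remembers the last mtime seen
DIAGNOSTIC_CMD = ["/opt/scripts/collect_diagnostics.sh"]

def main():
    try:
        mtime = str(os.path.getmtime(FLAG_FILE))
    except OSError:
        return  # flag file not present yet; do nothing

    last = None
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            last = f.read().strip()

    if mtime != last:
        with open(STATE_FILE, "w") as f:
            f.write(mtime)
        # Run the diagnostics and emit their output for Splunk to index.
        result = subprocess.run(DIAGNOSTIC_CMD, capture_output=True, text=True)
        sys.stdout.write(result.stdout)

if __name__ == "__main__":
    main()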
