Getting Data In

Trigger a script on a Forwarder from an Indexer on another host?

jhedgpeth
Path Finder

I've got a single v4 Splunk Indexer/Search head. Feeding it are multiple Forwarders that have local indexing disabled and have all their input configuration deployed to them in apps. I'd like a scheduled search on the Indexer to be able to trigger a script I've deployed on the forwarders.

My limitations are that I don't want the script to run and send data continuously (which would be easy) because 99% of the time it's not relevant, and we're trying to isolate a problem that occurs sporadically. Also, our environment doesn't allow password-less (automated) logins from host-to-host, so the simple ssh solution isn't available.

There doesn't appear to be anything about this in the help or on Answers, but I may be searching for the wrong thing. Can Splunk do this? Am I going about this the wrong way? Thanks.

1 Solution

jrodman
Splunk Employee

Hm, Splunk is not generally designed to have such a channel.

You could perhaps abuse the distributed search functionality: have a distributed search trigger a 'command' implemented by a script, which in turn updates a log file that the forwarder nodes monitor. Or something like that. It's almost certainly more work than you'd want, and quite Rube Goldberg.

Personally, I'd have the input scripts hit a URL and, if they get a successful response of, say, an integer, and that integer has changed since the last check, produce the output. Then you can fairly easily update that number on an HTTP server of your choosing, or by modifying a static asset in a splunkweb directory.

I.e., personally I'd make the trigger mechanism external to Splunk.
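
A rough sketch of what that polling scripted input could look like (purely illustrative; the trigger URL, state-file path, and diagnostic command are placeholders, and the forwarder would need a reasonably recent Python available):

```
#!/usr/bin/env python
# Hypothetical scripted input for the forwarder. It fetches an integer
# "trigger generation" from a URL you control; when that number differs from
# the last value it recorded, it runs the diagnostic and writes the output to
# stdout so Splunk indexes it. Otherwise it exits silently.
import subprocess
import sys
import urllib.request

TRIGGER_URL = "http://ops-host.example.com/splunk-trigger.txt"   # static asset you edit to fire the trigger
STATE_FILE = "/opt/splunkforwarder/var/run/trigger_generation"   # remembers the last value seen
DIAG_CMD = ["/opt/scripts/collect_diagnostics.sh"]               # the script you actually want triggered

def read_last_seen():
    try:
        with open(STATE_FILE) as f:
            return int(f.read().strip())
    except (IOError, ValueError):
        return -1

def main():
    current = int(urllib.request.urlopen(TRIGGER_URL, timeout=10).read().strip())
    if current == read_last_seen():
        return 0  # no new trigger; stay silent so nothing is sent 99% of the time
    # Trigger fired: run the diagnostic and emit its output for Splunk to index.
    result = subprocess.run(DIAG_CMD, capture_output=True, text=True)
    sys.stdout.write(result.stdout)
    with open(STATE_FILE, "w") as f:
        f.write(str(current))
    return 0

if __name__ == "__main__":
    sys.exit(main())
```

On the forwarder side this would just be scheduled as an ordinary scripted input (a [script://...] stanza in inputs.conf with an interval), and "firing" the trigger is nothing more than editing the number in the static file on the web server.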


jhedgpeth
Path Finder

Thanks, that's a good idea given my restrictions. I have a common NAS share that should serve as a means of communication between the two pieces, and I'm already thinking of other uses for "conditional scripts".
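
One way the NAS-share variant could look (again just a sketch, with placeholder paths and a hypothetical diagnostic command): a scheduled search on the indexer fires an alert script that touches a flag file on the shared mount, and a scripted input on each forwarder compares the flag's mtime against the last run it recorded:

```
#!/usr/bin/env python
# Hypothetical scripted input using a shared NAS flag file as the trigger
# channel. The indexer's alert script touches FLAG_FILE; this script runs the
# diagnostic whenever the flag is newer than the last trigger it acted on.
import os
import subprocess
import sys

FLAG_FILE = "/mnt/shared_nas/splunk/run_diagnostics.flag"     # written by the indexer's alert script
STATE_FILE = "/opt/splunkforwarder/var/run/last_flag_mtime"   # last flag mtime this host acted on
DIAG_CMD = ["/opt/scripts/collect_diagnostics.sh"]

def main():
    if not os.path.exists(FLAG_FILE):
        return 0
    flag_mtime = os.path.getmtime(FLAG_FILE)
    try:
        with open(STATE_FILE) as f:
            last_seen = float(f.read().strip())
    except (IOError, ValueError):
        last_seen = 0.0
    if flag_mtime <= last_seen:
        return 0  # no new trigger since the last run
    # New trigger: run the diagnostic and emit its output for Splunk to index.
    result = subprocess.run(DIAG_CMD, capture_output=True, text=True)
    sys.stdout.write(result.stdout)
    with open(STATE_FILE, "w") as f:
        f.write(str(flag_mtime))
    return 0

if __name__ == "__main__":
    sys.exit(main())
```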
