
SHC - push apps without deployer

Builder

Hi,

I want to install the Datasets add-on on the search heads of my search head cluster. Port 8089 is not open between the deployer and the SHC members. Is there any way to push the add-on manually, bypassing the deployer? It's a bit urgent and we can't wait until the port is opened.

If not, is there any other easy way to save my search as a dataset table?

Thanks,
Naresh


Re: SHC - push apps without deployer

Super Champion

Personally, I also don't like the deployer, but it's the only way to ensure bundles are applied consistently across the SHC. Please note, it will be a nightmare if you do these things manually, especially at the next app push, even after you open port 8089 between the deployer and the SHC.

The other things I could suggest are:
1. Is the deployer on the same box/server as the cluster master? If yes, you could reuse the cluster master as the deployer for the time being, unless you are restricted by resources etc.
2. Try making a spare server a standalone search head if it is urgent. You can install the add-on via a deployment server for a standalone SH. Please note this is a temporary workaround.
3. Manually add the configuration items (props.conf, eventtypes.conf etc.) directly via the GUI on the SHC as a temporary measure. That way you can implement your logic as a workaround until you open the firewall ports.
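If you do go the manual route, a rough sketch might look like the following. The hostnames, tarball name, and the /opt/splunk path are all assumptions for illustration; adjust for your environment. Be aware that apps copied by hand are invisible to the deployer and may be overwritten by the next `splunk apply shcluster-bundle`.

```shell
#!/bin/sh
# Sketch only: print the commands needed to hand-copy an app tarball to each
# SHC member, bypassing the deployer. Run as a dry run first, then execute
# the printed commands yourself once you have verified them.

push_app_cmds() {
  tarball="$1"; shift
  for host in "$@"; do
    # Copy the tarball to the member, then extract it into etc/apps and restart.
    echo "scp $tarball $host:/tmp/"
    echo "ssh $host 'tar -xzf /tmp/$tarball -C /opt/splunk/etc/apps && /opt/splunk/bin/splunk restart'"
  done
}

# Dry run against three hypothetical SHC members.
push_app_cmds Splunk_Datasets_Addon.tgz sh1.example.com sh2.example.com sh3.example.com
```

Restart the members one at a time (a rolling restart) so the cluster keeps serving searches while you work through them.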

I couldn't work out what you meant by "dataset table". IMO, you don't need any add-on to put results into a table.
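For example, a plain search with the `table` command already produces a tabular result without any add-on (the index, sourcetype, and fields here are just illustrative):

```
index=web sourcetype=access_combined
| table _time, clientip, status, uri_path
```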


Re: SHC - push apps without deployer

Builder

Thanks for your points @koshyk

Yes, my deployer is the cluster master. So, in this case, how can I push apps if port 8089 is blocked, irrespective of whether it's the cluster master or the deployer?

> 1. is the deployer in same box/server as in "cluster master" ? If yes, you could re-use cluster-master as deployer for time being unless you are not restricted by resources etc.

Dataset table: I install the "Splunk Datasets Add-on" on the search heads so that I can create tables as I need them, instead of running a base search followed by a table command listing the many columns I need. I've become used to this approach.

And I use these tables in my reports/dashboards using "| datamodel".

For now, until the port is open, I am saving my whole search (with the required columns via the table command) and using "| savedsearch" in my dashboards/reports.
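That interim approach looks roughly like this (the search name and fields are made up). First save a search such as:

```
index=web sourcetype=access_combined
| table _time, clientip, status
```

under a name like my_dataset_table, then reference it from a dashboard or report panel with:

```
| savedsearch my_dataset_table
```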
