Deployment Architecture

Proper way of pushing Splunk apps from dev environment to production environment


Hi Experts,

I want to know the proper way of pushing Splunk apps from a dev environment to a production environment. At the moment I am using the deployer to push apps from dev to the prod SH cluster. However, I think there should ideally be an automated way of doing this with some DevOps tooling. My current procedure is manual and I want to automate it:

  1. I manually copy the modified views/lookups from the dev environment to the deployer, and then push the cluster bundle from the deployer to the search heads.
  2. Every time there is even a minor change to dashboards, lookups, alerts, etc., I have to manually copy each file to the deployer and then push it to the search head cluster.

Please let me know how I can automate this, and whether there are tools available to do it.




There are different approaches and tools you can use (Ansible, Chef, scripting, etc.).

One simple way, assuming your dev server is connected to source control such as GitHub, is to keep a branch for your prod code and baseline your changes there. Do your development on the dev server, create a tarball of all the apps, and move it to your staging/test server, where you can deploy and test it. Once you are happy, create a tarball from test, untar it on the deployer, and deploy it to the SHC.

This way, all of your changes are in version control, and a simple script plus tar handles all of the apps, giving you control over how they are deployed and reducing the manual work.
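As a rough sketch of that tarball flow, something like the script below could work. The app name, directories, and search head host are placeholders I've assumed (the script creates a throwaway app so it runs end to end locally); the `splunk apply shcluster-bundle` command in the comment is the standard deployer push command.

```shell
#!/bin/sh
# Sketch: package apps from a dev checkout, stage them, and (on the
# deployer) push the bundle to the SHC. Paths/hosts are assumptions.
set -eu

APPS_SRC=dev_apps      # stand-in for the apps checked out from your prod branch
STAGE_DIR=stage        # stand-in for the staging/test (or deployer) host
BUNDLE=my_apps.tgz

# Create a throwaway app so this sketch is runnable end to end
mkdir -p "$APPS_SRC/my_app/local"
printf '[install]\nstate = enabled\n' > "$APPS_SRC/my_app/local/app.conf"

# 1. Package all apps from the dev checkout into one tarball
tar -czf "$BUNDLE" -C "$APPS_SRC" .

# 2. Ship the tarball to the next environment and unpack it
#    (scp to the real host in practice; a local directory here)
mkdir -p "$STAGE_DIR"
tar -xzf "$BUNDLE" -C "$STAGE_DIR"

# 3. On the deployer, untar into $SPLUNK_HOME/etc/shcluster/apps and push:
#    splunk apply shcluster-bundle -target https://sh1.example.com:8089 -auth admin:changeme
echo "staged app files from $BUNDLE"
```

Wrapping steps 1 and 2 in a script like this means a minor dashboard or lookup change is just a commit plus one script run, instead of copying files one by one.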

Ultra Champion

That is not really a Splunk question, I guess. There are all kinds of configuration management and deployment automation tools available for this sort of thing.

You could set up scripts on the deployer to pull from a code repository, use deployment mechanisms from tools like Microsoft VSTS to push code out to the deployer, or use tools like Ansible to automate deployments...
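The "scripts on the deployer that pull from a repository" idea could look roughly like this. The repo URL, paths, and target host are assumptions; `splunk apply shcluster-bundle` with `--answer-yes` is the real CLI call. The sketch defaults to a dry run that only logs the commands it would execute, so you can inspect it before wiring in real credentials.

```shell
#!/bin/sh
# Sketch: deployer-side script that pulls the prod branch of the apps repo
# into the shcluster staging dir and pushes the bundle to the SHC.
# REPO/SHC_APPS/TARGET are assumed placeholders; override via environment.
set -eu

SHC_APPS=${SHC_APPS:-/opt/splunk/etc/shcluster/apps}
TARGET=${TARGET:-https://sh1.example.com:8089}
DRY_RUN=${DRY_RUN:-1}   # 1 = just log commands; 0 = actually run them

: > deploy_dryrun.log   # fresh log so each dry run can be inspected

run() {
    if [ "$DRY_RUN" = 1 ]; then
        echo "+ $*" | tee -a deploy_dryrun.log   # log instead of executing
    else
        "$@"
    fi
}

# Pull the latest baselined prod code into the deployer's staging dir
run git -C "$SHC_APPS" pull --ff-only

# Push the bundle out to the search head cluster members
run /opt/splunk/bin/splunk apply shcluster-bundle -target "$TARGET" --answer-yes
```

A CI job (VSTS, Jenkins, or an Ansible playbook wrapping the same two commands) could trigger this on every merge to the prod branch, which removes the manual copy step entirely.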



Thanks for the reply FrankVI.

I agree this is not strictly a Splunk question, but I wanted to hear from the people here what the best tools are to automate this and what others are doing in this situation. If someone can suggest tools that are already working in their environment, along with the initial configuration, it would be easy for me to deploy the same in mine.


@pgadhari, did you find a solution or approach for this? We are also planning to implement something similar.