Best practices to deploy the S.o.S app in a distributed search environment

Splunk Employee

I have a Splunk deployment with one search-head and multiple indexers, and I would like to install the Splunk on Splunk app to analyze and troubleshoot problems on these instances. Where should I install the app? Are there other components that need to be installed and configured?

Re: Best practices to deploy the S.o.S app in a distributed search environment

Splunk Employee

Starting with version 2.0, the Splunk on Splunk app (a.k.a. SoS) leverages distributed search to allow introspective analysis and troubleshooting of your deployment from a centralized location.

Let's cover the details of deploying the SoS app, FAQ-style:

- Where should SoS be installed?

The SoS app should be installed on your search-head.

- What if I have more than one search-head?

SoS can only analyze and report on Splunk instances that it can reach by means of distributed search. Unless you have a federating search-head for which the other search-heads are declared as search peers, you should install SoS on each search-head that you wish to gain introspection on.

- What about search-head pooling? Any extra steps I need to take to install SoS on a search-head pool?

Yes. After installing SoS from the user interface on one pool member, you will almost certainly need to manually restart Splunk on the other pool members in order for all of them to become aware of the app.

- Are there any requirements for SoS to work?

The SoS app depends on the UI module library of the Sideview Utils app. Make sure you install SVU (minimum version required: 1.17) prior to SoS.
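
For example, both packages can be installed from the command line with Splunk's built-in app installer. This is a sketch; the package file names are placeholders for whatever you actually downloaded:

$SPLUNK_HOME/bin/splunk install app sideview_utils.tgz
$SPLUNK_HOME/bin/splunk install app sos.tgz
$SPLUNK_HOME/bin/splunk restart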

- Does anything need to be installed on the search peers?

Optionally, you can install the SoS technology add-on on your Unix and Linux search peers. This add-on provides scripted inputs that track the resource usage (CPU, memory, file descriptors) of Splunk.
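
As a sketch, installing the add-on on a single peer from the command line could look like this; the package file name is a placeholder, and the commands assume the tarball unpacks to one app directory under etc/apps:

tar -xzf TA-sos.tgz -C $SPLUNK_HOME/etc/apps
$SPLUNK_HOME/bin/splunk restart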

- Do I also need to install the SoS technology add-on on the search-head?

No, that would be redundant. The SoS app ships with the same data inputs as the technology add-on.

- What do I need to do for the SoS data inputs to track Splunk resource usage?

By default, the data inputs for SoS and SoS-TA are disabled. They must be enabled in the UI (look for ps_sos.sh and lsof_sos.sh in Manager > Data Inputs > Scripts) or from the command line (refer to the README file for more information) before they can report resource usage information.
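
For the command-line route, a minimal sketch is to override the script stanzas in a local inputs.conf and then restart splunkd. The stanza paths below are assumptions; copy the exact names from the app's default/inputs.conf:

# $SPLUNK_HOME/etc/apps/sos/local/inputs.conf
# Stanza paths are assumptions -- check default/inputs.conf for the real ones.
[script://$SPLUNK_HOME/etc/apps/sos/bin/ps_sos.sh]
disabled = 0

[script://$SPLUNK_HOME/etc/apps/sos/bin/lsof_sos.sh]
disabled = 0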

- What if my search-head is configured to forward data back to the search peers? Any action required for the SoS app to work in that scenario?

If the search-head where the SoS app is installed forwards its own events to the indexers it queries (typically, to spray summarized events back to the indexers), you have to make sure that events for the _internal index are also forwarded. This is not the case by default due to the forwardedindex filters in outputs.conf, which prevent internal events from being forwarded. To correct this, add the following configuration to $SPLUNK_HOME/etc/apps/sos/local/outputs.conf:

[tcpout]
forwardedindex.3.whitelist = _internal
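
To confirm that forwarding works, one quick check (a sketch; <search-head> is a placeholder for your search-head's host name) is to run the following from the SoS search-head:

index=_internal host=<search-head> | stats count by splunk_server

If the internal events are being forwarded and indexed remotely, the splunk_server values should be your indexers rather than the search-head itself.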

- What about my heavy forwarder? Can SoS query and report on it?

Yes. Please see this Splunk Answer for details on how to achieve this.

- How can I make sure that SoS reports accurate license usage information in the Metrics view?

1. Make sure to install SoS on the search-head that acts as a license master, or at least make sure that the license master's _internal index can be reached by distributed search.

2. [Splunk 4.2.x only] Make sure to enable the scheduled search named sos_summary_volume_daily in Manager > Searches and Reports. This search populates the sos_summary_daily summary index, aggregating your license usage on a daily basis.

3. [Splunk 4.2.x only] Backfill the sos_summary_daily summary index by running the following command (a sanity-check search follows the note below):

$SPLUNK_HOME/bin/splunk cmd python $SPLUNK_HOME/bin/fill_summary_index.py -app sos -name sos_summary_volume_daily -et -28d -lt -1d -j 10 -owner admin

Note: You do not need to perform steps #2 and #3 if you are running Splunk 4.0.x/4.1.x or 4.3.x.
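
Once the backfill completes, a quick sanity check (a sketch, assuming the summary index name above) is to confirm that the summary index is populated:

index=sos_summary_daily | stats count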

Re: Best practices to deploy the S.o.S app in a distributed search environment

Path Finder

Hi,

Any special instructions for upgrading the App and the TA? I am currently using 2.1.

  • just untar the file over the old version?
  • delete the old version and install the new one? Can I carry over the settings from the previous version somehow?
  • I have 3 SHs and 18 IDXs, so please don't tell me to use the GUI to upgrade... 🙂 An automated process is highly preferred.

Regards,
Bartosz

Re: Best practices to deploy the S.o.S app in a distributed search environment

Splunk Employee

@tzhmaba2: Untarring over the old version is a perfectly fine way to upgrade both the app and its TA, as long as you restart splunkd everywhere and Splunk Web on the search-heads to complete the upgrade.
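
To automate this across several search-heads and indexers, a minimal sketch could look like the following; the host names, package file name, and install path are all assumptions to adapt to your environment:

for host in sh1 sh2 sh3 idx01 idx02; do   # hypothetical host names
  scp sos-2.x.tgz $host:/tmp/             # hypothetical package file name
  ssh $host "tar -xzf /tmp/sos-2.x.tgz -C /opt/splunk/etc/apps && /opt/splunk/bin/splunk restart"
done

Since Splunk app packages typically ship only default/ files, any settings you have made under the app's local/ directory survive the untar.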

Re: Best practices to deploy the S.o.S app in a distributed search environment

Ultra Champion

I have the SoS app installed on my search head pool.

I also have the search-head's Splunk server logs (from var/log/splunk, which get indexed into _internal) being forwarded into the indexer cluster.

Sideview Utils is required.

All working great, awesome insights to be had.

Re: Best practices to deploy the S.o.S app in a distributed search environment

Explorer

Care to share your configuration? I'm trying to figure out how to index the logs from my search head pool on my indexers.

Re: Best practices to deploy the S.o.S app in a distributed search environment

Splunk Employee

@mikehale0: Did my suggestion of adding the following configuration to $SPLUNK_HOME/etc/apps/sos/local/outputs.conf not work?

[tcpout]
forwardedindex.filter.disable = true

Note that this setting disables all forwardedindex filtering, so every internal index is forwarded; it is a broader alternative to the _internal whitelist shown above.

Re: Best practices to deploy the S.o.S app in a distributed search environment

Explorer

@hexx I did try that. Upon further investigation it looks like the search heads are having trouble connecting to the indexers:

01-26-2012 18:19:40.635 +0000 WARN  TcpOutputProc - Applying quarantine to idx=x.x.x.x:8089 numberOfFailures=5

Re: Best practices to deploy the S.o.S app in a distributed search environment

Splunk Employee

@mikehale0: Has forwarding from the search-head to the indexers ever worked, or did this failure appear when you added the suggested configuration change?

Re: Best practices to deploy the S.o.S app in a distributed search environment

Explorer

No, it never was working. I needed to add the following stanza to inputs.conf on the indexers and open up the firewall:

[splunktcp://9997]
