
Best practices to deploy the S.o.S app in a distributed search environment

hexx
Splunk Employee

I have a Splunk deployment with one search-head and multiple indexers, and I would like to install the Splunk on Splunk app to analyze and troubleshoot problems on these instances. Where should I install the app? Are there other components that need to be installed and configured?

1 Solution

hexx
Splunk Employee

Starting with version 2.0, the Splunk on Splunk app (a.k.a. SoS) leverages distributed search to allow introspective analysis and troubleshooting of your deployment from a centralized location.

Let's cover the details of deploying the SoS app, FAQ-style:

- Where should SoS be installed?

The SoS app should be installed on your search-head.

- What if I have more than one search-head?

SoS can only analyze and report on Splunk instances that it can reach by means of distributed search. Unless you have a federating search-head on which the other search-heads are declared as search peers, you should install SoS on each search-head that you wish to gain introspection on. (The example below sketches how to declare a search peer from the command line.)
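For reference, an indexer is made reachable by declaring it as a search peer of the search-head. Here is a command-line sketch; the host name, management port, and credentials are placeholders for your own:

$SPLUNK_HOME/bin/splunk add search-server https://indexer1.example.com:8089 -auth admin:changeme -remoteUsername admin -remotePassword peerpassword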

- What about search-head pooling? Any extra steps I need to take to install SoS on a search-head pool?

If SoS was installed from the user interface on one member of the pool, you will almost certainly need to manually restart Splunk on the other pool members in order for all of them to become aware of the app, as shown below.
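For example, on each pool member that needs it (assuming a standard $SPLUNK_HOME):

$SPLUNK_HOME/bin/splunk restart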

- Are there any requirements for SoS to work?

The SoS app depends on the UI module library of the Sideview Utils app. Make sure you install Sideview Utils (version 1.17 or later) before installing SoS.
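If you would rather install from the command line than from the app manager in the UI, something along these lines should work; the package path is a placeholder for wherever you saved the Sideview Utils download:

$SPLUNK_HOME/bin/splunk install app /tmp/sideview_utils.tgz -auth admin:changeme
$SPLUNK_HOME/bin/splunk restart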

- Does anything need to be installed on the search peers?

Optionally, you can install the SoS technology add-on on your Unix and Linux search peers. This add-on provides scripted inputs that track the resource usage (CPU, memory, file descriptors) of Splunk.
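Installing the add-on on a peer comes down to extracting it into the peer's apps directory and restarting Splunk. As a sketch (the archive name is a placeholder; keep the directory name the add-on ships with):

tar -xzf sos_technology_addon.tgz -C $SPLUNK_HOME/etc/apps
$SPLUNK_HOME/bin/splunk restart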

- Do I also need to install the SoS technology add-on on the search-head?

No, that would be redundant. The SoS app ships with the same data inputs as the technology add-on.

- What do I need to do for the SoS data inputs to track Splunk resource usage?

By default, the data inputs for SoS and the SoS technology add-on are disabled. They must be enabled in the UI (look for ps_sos.sh and lsof_sos.sh in Manager > Data Inputs > Scripts) or from the command line (refer to the README file for details) before they can report resource usage information.
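As a sketch of the command-line route, enabling a scripted input comes down to overriding its disabled setting in a local inputs.conf and restarting Splunk. The script paths below are illustrative; match them to the stanza names in the app's default/inputs.conf:

# $SPLUNK_HOME/etc/apps/sos/local/inputs.conf
[script://$SPLUNK_HOME/etc/apps/sos/bin/ps_sos.sh]
disabled = 0

[script://$SPLUNK_HOME/etc/apps/sos/bin/lsof_sos.sh]
disabled = 0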

- What if my search-head is configured to forward data back to the search peers? Any action required for the SoS app to work in that scenario?

If the search-head where the SoS app is installed forwards its own events to the indexers it queries (typically, to spray summarized events back to the indexers), you have to make sure that events for the _internal index are also forwarded. This is not the case by default due to the forwardedindex filters in outputs.conf, which prevent _internal events from being forwarded. To correct this, add the following configuration to $SPLUNK_HOME/etc/apps/sos/local/outputs.conf:

[tcpout]
forwardedindex.3.whitelist = _internal
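Note that forwardedindex filters are evaluated in order of their index number, with later rules overriding earlier ones. Using 3 assumes the three default filters (forwardedindex.0 through forwardedindex.2) from the system-level outputs.conf are still in place; if you have defined additional filters of your own, use the next free number instead.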

- What about my heavy forwarder? Can SoS query and report on it?

Yes. Please see this Splunk Answer for details on how to achieve this.

- How can I make sure that SoS reports accurate license usage information in the Metrics view?

1. Make sure to install SoS on the search-head that acts as the license master, or at least ensure that the license master's _internal index can be reached by distributed search.

2. [Splunk 4.2.x only] Make sure to enable the scheduled search named sos_summary_volume_daily in Manager > Searches and Reports. This search populates the sos_summary_daily summary index, aggregating your license usage on a daily basis (a configuration-file alternative is sketched after this list).

3. [Splunk 4.2.x only] Backfill the summary index "sos_summary_daily" by running the following command:

$SPLUNK_HOME/bin/splunk cmd python $SPLUNK_HOME/bin/fill_summary_index.py -app sos -name sos_summary_volume_daily -et -28d -lt -1d -j 10 -owner admin

Note: You do not need to perform steps #2 and #3 if you are running Splunk 4.0.x/4.1.x or 4.3.x.
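For step #2, if you prefer configuration files over Manager, enabling the schedule of the saved search in a local savedsearches.conf should be equivalent; the stanza name below assumes the search name shipped with SoS:

# $SPLUNK_HOME/etc/apps/sos/local/savedsearches.conf
[sos_summary_volume_daily]
enableSched = 1

As for the backfill command in step #3, -et and -lt bound the time range to backfill (here, the last 28 days up to yesterday) and -j caps the number of concurrent searches that the script dispatches.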

