Getting Data In

Why am I getting incorrect results from btool during diagnostics for a Splunk 6.3.1 Windows universal forwarder?


When running btool against the inputs.conf files on a Windows universal forwarder (v6.3.1), the results appear to be incorrect, which is making it difficult to find the root cause of my original issue. The purpose of the diagnostics is to disable unused inputs such as MonitorNoHandle, RegMon, etc.

The issue appears to be reproducible if you install a completely fresh universal forwarder and then run btool --debug. Two examples of the problem are:
1. The majority of the standard Windows stanzas are listed as coming from the splunk_httpinput app (which is installed by default with the universal forwarder).
2. The inputs.conf in splunk_httpinput only contains the [http] stanza, but all the values from that stanza are listed as if they were global default settings.

For example, take the following btool output for the MonitorNoHandle stanza. The first line suggests this stanza comes from the splunk_httpinput app, but it is actually defined in the system default inputs.conf. Also, the line port = 8088 shows as coming from the splunk_httpinput app, which is correct, but that setting is only defined under the [http] stanza in that file, so it is strange that it is being treated as a global setting for another stanza.

Is there any explanation for this, or is btool just not 100% accurate? I can reproduce the issue by taking a completely standard universal forwarder MSI and installing it on any Windows server, so it does not appear to be an issue specific to my servers.
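To illustrate the behavior I would expect, here is a minimal sketch (not Splunk's actual code) of how layered .conf merging should scope keys: a key defined under a named stanza such as [http] should only appear under [http], and only [default] keys should be promoted into every stanza.

```python
# Minimal sketch of scoped .conf layering (illustration only, not Splunk's code).
# A key defined inside a named stanza such as [http] should stay in that stanza;
# only keys under [default] are global and merge into every other stanza.

def parse_conf(text):
    stanzas = {"default": {}}
    current = "default"
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        if line.startswith("[") and line.endswith("]"):
            current = line[1:-1]
            stanzas.setdefault(current, {})
        elif "=" in line:
            key, _, value = line.partition("=")
            stanzas[current][key.strip()] = value.strip()
    return stanzas

def effective_settings(stanzas, stanza):
    """Global [default] keys first, then stanza-specific keys layered on top."""
    merged = dict(stanzas.get("default", {}))
    merged.update(stanzas.get(stanza, {}))
    return merged

conf = """
[http]
port = 8088
disabled = 1

[MonitorNoHandle]
interval = 60
"""

stanzas = parse_conf(conf)
# port is scoped to [http]; it must NOT leak into [MonitorNoHandle]
print(effective_settings(stanzas, "MonitorNoHandle"))  # {'interval': '60'}
```

Under this expected scoping, port = 8088 would never surface under [MonitorNoHandle], which is why the btool output below looks wrong.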

c:\Program Files\SplunkUniversalForwarder\etc\apps\splunk_httpinput\default\inputs.conf        [MonitorNoHandle]
c:\Program Files\SplunkUniversalForwarder\etc\system\default\inputs.conf                        _rcvbuf = 1572864
c:\Program Files\SplunkUniversalForwarder\etc\system\default\inputs.conf                        baseline = 0
c:\Program Files\SplunkUniversalForwarder\etc\apps\splunk_httpinput\default\inputs.conf         dedicatedIoThreads = 2
c:\Program Files\SplunkUniversalForwarder\etc\apps\splunk_httpinput\default\inputs.conf         disabled = 1
c:\Program Files\SplunkUniversalForwarder\etc\apps\splunk_httpinput\default\inputs.conf         enableSSL = 1
c:\Program Files\SplunkUniversalForwarder\etc\system\default\inputs.conf                        evt_dc_name = 
c:\Program Files\SplunkUniversalForwarder\etc\system\default\inputs.conf                        evt_dns_name = 
c:\Program Files\SplunkUniversalForwarder\etc\system\default\inputs.conf                        evt_resolve_ad_obj = 0
host = win2k8r2
index = default
c:\Program Files\SplunkUniversalForwarder\etc\system\default\inputs.conf                        interval = 60
c:\Program Files\SplunkUniversalForwarder\etc\apps\splunk_httpinput\default\inputs.conf         maxSockets = 0
c:\Program Files\SplunkUniversalForwarder\etc\apps\splunk_httpinput\default\inputs.conf         maxThreads = 0
c:\Program Files\SplunkUniversalForwarder\etc\apps\splunk_httpinput\default\inputs.conf         port = 8088
c:\Program Files\SplunkUniversalForwarder\etc\apps\splunk_httpinput\default\inputs.conf         useDeploymentServer = 0

Splunk Employee

If you think btool is broken, I recommend you log a support case and get this looked at.

The most common reason for this to happen is an inconsistent configuration that btool can't properly interpret.
We commonly see this where users are switching between Windows and Linux systems.
I recommend reviewing your .conf files for the presence of characters like ^M (carriage return).
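As a sketch of that check (assuming plain-text .conf files; the helper name is mine, not a Splunk utility), you could scan for stray carriage returns with something like:

```python
# Sketch: flag .conf files that contain carriage returns (^M), which typically
# indicates Windows CRLF line endings left behind after editing on Windows.
from pathlib import Path

def files_with_carriage_returns(root):
    flagged = []
    for path in Path(root).rglob("*.conf"):
        if b"\r" in path.read_bytes():
            flagged.append(path)
    return flagged

# Example usage against a scratch directory:
import tempfile
tmp = Path(tempfile.mkdtemp())
(tmp / "inputs.conf").write_bytes(b"[http]\r\nport = 8088\r\n")
(tmp / "outputs.conf").write_bytes(b"[tcpout]\ndisabled = 0\n")
print([p.name for p in files_with_carriage_returns(tmp)])  # ['inputs.conf']
```

Point it at $SPLUNK_HOME/etc and clean up any flagged files before re-running btool.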



Hi, the original issue occurred with a completely default install of the forwarder, so there were no custom configurations. I haven't checked since, but I noticed that several btool bugs were fixed in version 6.4.6, so this issue may be resolved in a newer version.

2017-01-26 SPL-119992, SPL-130547, SPL-130548 btool returns incorrect output for inputs.conf with certain modular and scripted input combinations



Btool is broken - it has been displaying incorrect (random) paths to config entries since at least 6.3. Not sure why Splunk has not fixed this, since it is a rather critical tool...

Anyone from Splunk care to comment?


Path Finder

A Linux heavy forwarder (v6.3.4 and v6.4.1) also shows this issue.

Commenting out the entire stanza in splunk_httpinput appears to resolve the issue. (Luckily I don't use splunk_httpinput at this time, so that's a viable option, along with possibly moving the file to an alternate name, though I haven't tested that.)

Not sure how to update a fleet of over 1000 servers cleanly and neatly with this change. I don't really want to put a default app on the deployment server (which would appear to be the easiest answer); I've been warned that this is not best practice...



don't really want to put a default app in Deployment Server

What do you mean by the term default app?

Why should it be a problem to use a default directory inside the app? The deployment server will overwrite everything, i.e. the entire app folder on the client.
Basically, the client computes a hash of its copy of the app folder, and the deployment server computes a hash of its own copy. If the hashes differ, the server sends the app folder to the client.

This works differently if you look at the deployment within a SHC.
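That hash-and-compare flow can be sketched roughly like this (illustration only; the function names are mine and Splunk's actual checksum scheme may differ):

```python
# Rough sketch of a deployment-server-style app sync: hash the whole app folder
# on both sides and resend the folder when the hashes differ. Illustration only;
# this is not Splunk's real checksum implementation.
import hashlib
from pathlib import Path

def app_checksum(app_dir):
    digest = hashlib.sha256()
    for path in sorted(Path(app_dir).rglob("*")):
        if path.is_file():
            digest.update(str(path.relative_to(app_dir)).encode())
            digest.update(path.read_bytes())
    return digest.hexdigest()

def needs_redeploy(server_app_dir, client_app_dir):
    return app_checksum(server_app_dir) != app_checksum(client_app_dir)

# Example: identical folders -> no redeploy; change one file -> redeploy.
import tempfile, shutil
server = Path(tempfile.mkdtemp()) / "myapp"
(server / "default").mkdir(parents=True)
(server / "default" / "inputs.conf").write_text("[monitor:///var/log]\n")
client = Path(tempfile.mkdtemp()) / "myapp"
shutil.copytree(server, client)
print(needs_redeploy(server, client))  # False
(server / "default" / "inputs.conf").write_text("[monitor:///var/log]\ndisabled = 1\n")
print(needs_redeploy(server, client))  # True
```

The key point is that the comparison covers the entire app folder, so any local edit on the client gets overwritten on the next poll.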


Path Finder

Because of the App Management Issues section in the Create deployment apps documentation.

Basically, the splunk_httpinput app is installed with Splunk itself, and if it ends up containing version-dependent code, then my environment is sunk (as serverclass.conf doesn't filter on Splunk version at this time). And given that adding the app to the deployment server is irreversible, the 'damage' can't be undone.
