Deployment Architecture

deployment questions - outputs.conf

edchow
Explorer

Hi there,

I am using Splunk for free at the moment whilst learning.

I am setting up a deployment server and clients, but I am having some problems. Here is the outputs.conf I am deploying:

[tcpout]
defaultGroup = splunk

[tcpout:splunk]
server = indexer1:9997,indexer2:9997

I can see the client has successfully downloaded the outputs.conf file from the deployment server:

/opt/splunkforwarder/etc/apps/secure/outputs.conf

I am noticing the following errors and warnings in splunkd.log:

08-27-2012 12:50:00.342 +0000 ERROR IniFile - Cannot open ini file for parsing: No such file or directory
08-27-2012 12:50:00.296 +0000 ERROR ConfObjectManagerDB - Cannot initialize: /opt/splunkforwarder/etc/apps/secure/metadata/local.meta: No such file or directory
08-27-2012 12:50:00.274 +0000 WARN DeployedApplication - Installing app: webservers to location: /opt/splunkforwarder/etc/apps/secure

08-27-2012 12:49:41.907 +0000 INFO TailingProcessor - Could not send data to output queue (parsingQueue), retrying...
08-27-2012 12:49:36.779 +0000 ERROR TcpOutputProc - LightWeightForwarder/UniversalForwarder not configured. Please configure outputs.conf.

08-27-2012 12:49:36.158 +0000 INFO LMTracker - setting masterGuid='78A0ADBD-7476-4C9D-9ABF-66BEB98670D6'
08-27-2012 12:49:36.158 +0000 INFO LMTracker - setting feature=SyslogOutputProcessor state=ENABLED (fs=1)
08-27-2012 12:49:36.158 +0000 INFO LMTracker - setting feature=SplunkWeb state=ENABLED (fs=1)
08-27-2012 12:49:36.158 +0000 INFO LMTracker - setting feature=SigningProcessor state=ENABLED (fs=1)
08-27-2012 12:49:36.158 +0000 INFO LMTracker - setting feature=ScheduledSearch state=DISABLED_DUE_TO_LICENSE (fs=2)
08-27-2012 12:49:36.158 +0000 INFO LMTracker - setting feature=ResetWarnings state=DISABLED_DUE_TO_LICENSE (fs=2)
08-27-2012 12:49:36.158 +0000 INFO LMTracker - setting feature=RcvSearch state=DISABLED_DUE_TO_LICENSE (fs=2)
08-27-2012 12:49:36.158 +0000 INFO LMTracker - setting feature=RcvData state=ENABLED (fs=1)
08-27-2012 12:49:36.158 +0000 INFO LMTracker - setting feature=LocalSearch state=DISABLED_DUE_TO_LICENSE (fs=2)
08-27-2012 12:49:36.158 +0000 INFO LMTracker - setting feature=FwdData state=ENABLED (fs=1)
08-27-2012 12:49:36.158 +0000 INFO LMTracker - setting feature=DistSearch state=DISABLED_DUE_TO_LICENSE (fs=2)
08-27-2012 12:49:36.157 +0000 INFO LMTracker - setting feature=DeployServer state=DISABLED_DUE_TO_LICENSE (fs=2)
08-27-2012 12:49:36.157 +0000 INFO LMTracker - setting feature=DeployClient state=ENABLED (fs=1)
08-27-2012 12:49:36.157 +0000 INFO LMTracker - setting feature=CanBeRemoteMaster state=DISABLED_DUE_TO_LICENSE (fs=2)
08-27-2012 12:49:36.157 +0000 INFO LMTracker - setting feature=Auth state=ENABLED (fs=1)
08-27-2012 12:49:36.157 +0000 INFO LMTracker - setting feature=AllowDuplicateKeys state=DISABLED_DUE_TO_LICENSE (fs=2)
08-27-2012 12:49:36.157 +0000 INFO LMTracker - setting feature=Alerting state=DISABLED_DUE_TO_LICENSE (fs=2)
08-27-2012 12:49:36.157 +0000 INFO LMSlaveInfo - new slave='78A0ADBD-7476-4C9D-9ABF-66BEB98670D6' created
08-27-2012 12:49:36.157 +0000 INFO LMTracker - attempting to ping master=self from slave=78A0ADBD-7476-4C9D-9ABF-66BEB98670D6

I am wondering if this has anything to do with the fact that it is a free version of Splunk, or if there is something wrong with my outputs.conf configuration.

Any ideas?

1 Solution

lguinn2
Legend

Good catch on the local.meta, but the problem may be larger than that - yet easy to fix.

Splunk expects the deployment apps to follow the same directory structure as "regular" apps. This is especially true because the deployment apps become "regular" apps on the target machines.

The easiest way to make sure that you have the structure and the basic files that you need is to do the following (a rough sketch of the resulting layout follows the steps):

1. Go to Splunk Manager. Under Apps, create a new app. Give it a name (like "secure").
2. In the menu choices, make it not visible. Choose the "barebones" template.
3. Now, under /etc/apps, you will have a subdirectory called secure. It will have the structure that you need, plus a valid local.meta file and an app.conf file.
4. Add your outputs.conf under /etc/apps/secure/default or /etc/apps/secure/local (either will work).
5. Move /etc/apps/secure to /etc/deployment-apps/secure
6. splunk reload deploy-server
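
As a rough sketch (the exact files created by the barebones template may differ slightly), the app on the deployment server would then end up looking something like this:

$SPLUNK_HOME/etc/deployment-apps/secure/
    default/
        app.conf
        outputs.conf        <-- your forwarding settings
    local/
    metadata/
        local.meta

When the deployment client downloads it, the same structure lands under /opt/splunkforwarder/etc/apps/secure/, and the forwarder can then actually read the outputs.conf.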

I think this will solve your problem. Or at least most of it!

edchow
Explorer

Thanks! I was not putting them in the correct directory structure. Thank you very much for your help.

kristian_kolb
Ultra Champion

Hi,

You should be able to have a forwarder send data to a 'free' indexer. As you can see in the splunkd.log above (08-27-2012 12:49:36.158 +0000 INFO LMTracker - setting feature=FwdData state=ENABLED (fs=1)), the forwarder has no licensing issue with forwarding data. Actually, I believe that all Splunk instances can send and receive data, though in the case of forwarders, they cannot index it locally and hence must send it further along to an indexer.

As for the errors regarding local.meta, I believe this can be solved by creating this file (on the DS) in the right directory:

$SPLUNK_HOME/etc/deployment-apps/your-app/metadata/local.meta

It does not have to contain anything, but it must exist.
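
For example, something like this on the DS should do it (the app name here is just a placeholder for whatever your deployment app is called):

mkdir -p $SPLUNK_HOME/etc/deployment-apps/your-app/metadata
touch $SPLUNK_HOME/etc/deployment-apps/your-app/metadata/local.meta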


UPDATE:

Clearly lguinn has a more 'proper' and structured approach for creating apps in general, but then again, she works for the company 🙂

I was wondering, however, about:

a) Your outputs.conf - is that the full outputs.conf you pasted above? Could you verify that the runtime settings on the forwarder are as intended? To check this, type (on the forwarder):

$SPLUNK_HOME/bin/splunk btool outputs list --debug

This instructs Splunk to 'compile' the various outputs.conf files on your system according to the configuration file precedence rules. If the result shows configuration parameter values that you did not expect or deploy, it means that they are overridden by another outputs.conf. The first column will tell you where to look (etc/system/local or etc/system/default for system-level settings, etc/apps/secure for your new application).
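
For illustration only (the paths and values below are made up based on the config you posted; your actual output will differ), the --debug output looks roughly like this, with the originating file in the first column:

/opt/splunkforwarder/etc/apps/secure/default/outputs.conf  [tcpout]
/opt/splunkforwarder/etc/apps/secure/default/outputs.conf  defaultGroup = splunk
/opt/splunkforwarder/etc/apps/secure/default/outputs.conf  [tcpout:splunk]
/opt/splunkforwarder/etc/apps/secure/default/outputs.conf  server = indexer1:9997,indexer2:9997

If, say, defaultGroup showed up with a different value and a path like etc/system/local/outputs.conf in the first column, that file would be the one overriding your deployed settings.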

b) The third entry in your splunkd.log (installing app: webservers to .../etc/apps/secure) looks... well... odd, since the app name and the target directory do not match. Are you sure that your serverclass.conf on the DS is in order?
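
For comparison, a minimal serverclass.conf on the DS that deploys an app called secure to a class of forwarders might look roughly like this (the server class name and whitelist are placeholders, not taken from your setup):

[serverClass:secure_forwarders]
whitelist.0 = *

[serverClass:secure_forwarders:app:secure]
restartSplunkd = true

If the class/app mapping in that file does not line up with what you expect, that could explain the mismatch between the app name and the directory in the log.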

Hope this helps,

Kristian

kristian_kolb
Ultra Champion

see update /k

edchow
Explorer

Thanks for that. It is great to know that licensing is not part of the problem, and that got rid of the local.meta error.

However, I still cannot see any data being indexed, and the following error is still present:

08-27-2012 13:39:25.513 +0000 INFO TailingProcessor - Could not send data to output queue (parsingQueue), retrying...
08-27-2012 13:39:20.482 +0000 ERROR TcpOutputProc - LightWeightForwarder/UniversalForwarder not configured. Please configure outputs.conf.
