Deployment Architecture

Problem trying to use deployment-apps

smellpit
Explorer

I am new to Splunk Enterprise, so I must be missing something simple. To summarize: what does a deployment app actually do? I see the "deployment" part: the files are copied from the search head/deployment server to the specified forwarder(s). Everything about "Forwarder management" appears to be working as expected, whether configured through the web GUI or the CLI/conf files. I have separate Linux 6.5.1 instances for the search head, indexers, and forwarders, and I have been through the Updating Splunk Enterprise Instances 6.5.1 PDF several times.

To rephrase: the ./deployment-apps files are deployed, but then what? The descriptions mention deployment of arbitrary content, scripts, and configuration files. Setting aside the if/when/how of running script files, configuration files are the concern here. I've seen documentation and other Answers posts implying that deploying, say, an inputs.conf file to a forwarder will affect the data it monitors and ingests. I can see how that kind of configuration deployment makes sense when there are many forwarders that all need to do roughly the same thing.
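
For reference, forwarder management on the deployment server writes its mappings to serverclass.conf; mine ends up looking roughly like the sketch below (the class name and whitelist pattern here are only examples):

Deployment server
cat $SPLUNK_HOME/etc/system/local/serverclass.conf
[serverClass:qwerty_class]
whitelist.0 = forwarder-host*

[serverClass:qwerty_class:app:qwerty-app]
restartSplunkd = true
stateOnClient = enabled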

Details here:
* Deployment server & clients show as connected with ./splunk CLI show/list deploy commands.
* Automated deployment and manual "reload deploy-server" have both been done.
* As said earlier, the files are being deployed to the clients.
* Fresh "qwerty" index.
* An inputs.conf is set up & deployed to a forwarder for an existing log file:

Deployment server
cat $SPLUNK_HOME/etc/deployment-apps/qwerty-app/default/inputs.conf
[monitor:///opt/log/junk]
disabled = false
index = qwerty
whitelist = *.log

Forwarder
[splunk qwerty-app]$ cat /opt/splunkforwarder/etc/apps/qwerty-app/default/inputs.conf
[monitor:///opt/log/junk]
disabled = false
index = qwerty
whitelist = *.log

Log file on forwarder
[splunk]$ ls -al /opt/log/junk/*
-rw------- 1 splunk splunk 864674 Jan 13 22:35 /opt/log/junk/crunch.log

Shouldn't the new "qwerty" index now have crunch.log data? The forwarder's splunkd process was set to restart following the app deployment; restarting it and the search head again made no difference. I've also tried the conf with and without the 'host' parameter (I didn't think it would matter).
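
For what it's worth, a couple of checks on the forwarder should show whether the deployed input and the output to the indexers were actually picked up (universal forwarder paths assumed):

Forwarder
/opt/splunkforwarder/bin/splunk btool inputs list monitor --debug
/opt/splunkforwarder/bin/splunk list forward-server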

Any assistance would be greatly appreciated.

1 Solution

smellpit
Explorer

The problem was something simple: once I started using ./local instead of ./default, the .conf files have been working as expected. I know about the precedence rules and the general "don't use ./default" rule of thumb, but I still thought ./default would be OK since it is in the precedence hierarchy, and there are Splunk PDF deployment-app examples that use ./default. I'd also have expected that at least the first ./default deployment would do something rather than apparently nothing (at least for .conf files).
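
For anyone hitting the same thing, the change on the deployment server was roughly this (paths as in the original post):

Deployment server
mkdir -p $SPLUNK_HOME/etc/deployment-apps/qwerty-app/local
mv $SPLUNK_HOME/etc/deployment-apps/qwerty-app/default/inputs.conf \
   $SPLUNK_HOME/etc/deployment-apps/qwerty-app/local/inputs.conf
$SPLUNK_HOME/bin/splunk reload deploy-server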

Thanks everyone for your help & input.

gcusello
SplunkTrust

Hi smellpit,
I agree with the other friends; to better understand the process, see http://docs.splunk.com/Documentation/Splunk/6.5.1/Troubleshooting/Cantfinddata.
In any case, the first check to do is to verify whether you receive logs from that server:

index=_internal host=yourhost

If not, verify that routes and ports are open (telnet IP_SPLUNK_SERVER 9997 from the forwarder).
If yes, check that the monitor stanza is correct (/opt/log/junk with whitelist *.log) and whether you receive anything with index=* host=yourhost over an "All time" time range.
I also noted that you didn't set a sourcetype, so verify that events are acquired with the correct timestamp (maybe you received them but with a wrong timestamp, so you don't see them).
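
For example, something like this in the deployed inputs.conf would set an explicit sourcetype (the sourcetype name is just an example; note that whitelist is interpreted as a regular expression):

[monitor:///opt/log/junk]
disabled = false
index = qwerty
sourcetype = junk_log
whitelist = \.log$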
Bye.
Giuseppe

smellpit
Explorer

I had seen "I can't find my data!" before, but this time it (almost) fits. Briefly (well, not so briefly, but frustration wants to answer everything):

  • Are you running Splunk Free? No, trial license. And the indexers aren't set up as license slaves of the master, but that shouldn't matter (see the next item's "Example case").
  • Was the data added to a different index? The "Example case" (a Splunk example) delivers to "main", but my test cases haven't delivered to other indexes.
  • Do your permissions allow you to see the data? All the Linux sample log files are splunk:splunk (the Splunk processes' user:group).
  • What about issues related to time? I can't see a relation: "All time" works, but only(!?) for the "Example case" /var/log/messages file.
  • Are you using forwarders? Yes, and they work fine with the "Example case(!)" log file.
  • Are you using search heads? Yes, and searches do see new and existing indexer data.
  • Are you still logged in and under your license usage? Yes.
  • Are you using a scheduled search? No.
  • Check your search query; are you using NOT, AND, or OR? I don't need to: 'index="whatever"' is enough to check for (un)available data.
  • Are you extracting fields? It doesn't need to get that far.

This must be something simple or (very) esoteric; thanks in advance.

ddrillic
Ultra Champion

It's good to start with the "I can't find my data!" troubleshooting page.

In general, instead of SPLUNK_HOME/etc/deployment-apps/qwerty-app/default/inputs.conf, please use SPLUNK_HOME/etc/deployment-apps/qwerty-app/local/inputs.conf.

Do you have the corresponding outputs.conf?
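
A minimal outputs.conf on the forwarder would look roughly like this (indexer host and port are placeholders):

[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = your-indexer:9997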

About the log file on the forwarder:
[splunk]$ ls -al /opt/log/junk/*
-rw------- 1 splunk splunk 864674 Jan 13 22:35 /opt/log/junk/crunch.log

Does tail /opt/log/junk/crunch.log work?

Do you see any errors on the forwarder at /opt/splunkforwarder/var/log/splunk/splunkd.log?
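
For example, something like this should surface tailing or output errors (component names can vary by version):

grep -iE "ERROR|TailingProcessor|TcpOutputProc" /opt/splunkforwarder/var/log/splunk/splunkd.log | tail -50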
