Getting Data In

Universal Forwarder not displaying data on SplunkWeb on another server

atewari
Path Finder

We have two Linux servers running Splunk 5.0.1 (64-bit).

  1. A full Splunk install (splunkd and SplunkWeb). We configured a receiving data port, 8001, on this server.
  2. A Universal Forwarder on the second Linux server. We added a forward-server using the command
    
    splunk add forward-server server1:8001 -auth admin:somepassword
    
    

The forward-server was added successfully, and we restarted the forwarder on server2. We then ran:

splunk list forward-server

This showed that server1:8001 had been added but was not active. When we ran the list command again, it reported that the file was locked. However, the metrics.log file says it connected successfully.

But how do we view server2's data in SplunkWeb running on server1? We installed the *nix app, but server2 does not appear as a selection anywhere; we only see server1's information. Is there another step to activate the forwarder on server2, or something to enable on server1, so we can view server2's logs?
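For context, receiving forwarder traffic on the indexer is normally defined with a splunktcp input stanza; this is only a sketch, assuming port 8001 and otherwise default settings:

local/inputs.conf on server1 (sketch)
=====================================
[splunktcp://8001]
disabled = 0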

The forwarder deployment documentation is confusing: it gives a few commands and then asks you to test the deployment without explaining what to test.

Can anyone point us to the next steps - links, answers, anything?

Thanks


atewari
Path Finder

Thanks Drainy! Here is a complete rundown of the conf files.


Universal Forwarder on uf.xyz.com:8001

local/inputs.conf
==================
[default]
host = uf.xyz.com

local/outputs.conf
==================
[tcpout]
defaultGroup = default-autolb-group
disabled = false

[tcpout:default-autolb-group]
server = fullinstall.xyz.com:8001

[tcpout-server://fullinstall.xyz.com:8001]



Splunk full install on fullinstall.xyz.com:8001

local/inputs.conf
==================

[default]
host = fullinstall.xyz.com
disabled = 0
index = summary

[tcp://uf.xyz.com:8001]
disabled = 0
index = summary

local/outputs.conf
==================

[tcpout]
indexAndForward = 0
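One thing worth noting: the [tcp://...] stanza above defines a raw TCP input, while data from a universal forwarder is normally received with a splunktcp stanza. A sketch of the alternative, assuming the same port 8001:

[splunktcp://8001]
disabled = 0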

Here is the splunkd.log from uf.xyz.com. It does connect to fullinstall.xyz.com, and uf.xyz.com also sends data to it:



11-30-2012 02:00:49.012 -0600 INFO TailingProcessor - Parsing configuration stanza: batch://$SPLUNK_HOME/var/spool/splunk/...stash_new.
11-30-2012 02:00:49.012 -0600 INFO TailingProcessor - Parsing configuration stanza: monitor://$SPLUNK_HOME/etc/splunk.version.
11-30-2012 02:00:49.012 -0600 INFO TailingProcessor - Parsing configuration stanza: monitor://$SPLUNK_HOME/var/log/splunk.
11-30-2012 02:00:49.012 -0600 INFO TailingProcessor - Parsing configuration stanza: monitor://$SPLUNK_HOME/var/log/splunk/splunkd.log.
11-30-2012 02:00:49.012 -0600 INFO BatchReader - State transitioning from 2 to 0 (initOrResume).
11-30-2012 02:00:49.028 -0600 INFO TcpOutputProc - Connected to idx=xxx.xxx.xxx.xxx:8001


Any suggestions would be much appreciated.

Thanks!


aneeshkjm123
Path Finder

Check whether port 8001 is blocked between the forwarder and the indexer. If you cannot unblock the port, disable the firewall temporarily and try again.


Drainy
Champion

OK, let's rewind a little. Could you post your inputs.conf from the indexer and the outputs.conf from the forwarder?

Something worth remembering: you need to define inputs (in inputs.conf) before the forwarder will send anything. Also, if you define a non-default index in outputs.conf on the forwarder, the Summary page won't show any detail from it.
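As an illustration, a minimal forwarder inputs.conf might look like this (a sketch; the monitored path and index are examples, not taken from your setup):

local/inputs.conf on the forwarder (example)
[monitor:///var/log/messages]
disabled = 0
index = main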


DaveSavage
Builder

Atewari, note Drainy's comment here about using a different index in outputs.conf; you currently don't have that in the rundown below. Note especially that the *nix app uses the index 'os', and all of its canned searches are predisposed to that index!
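For example, a monitor stanza that lands data where the *nix app's searches look (a sketch; the path and sourcetype here are illustrative):

[monitor:///var/log/secure]
index = os
sourcetype = linux_secure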
Best regards and good luck, Dave


atewari
Path Finder

We cannot see the host that runs the Universal Forwarder on the right side of the Summary page. On server1 (the full install with SplunkWeb), we edited local/inputs.conf and changed the host = setting under the [default] stanza; it then showed two servers, but it still collects only server1's data.

Is there anything else we need to do? We basically changed outputs.conf on server2 (Universal Forwarder) and inputs.conf on server1 (full install). We know the connection is established because metrics.log shows it is receiving data.
