All Apps and Add-ons

Why is one Linux universal forwarder host and data not showing up in Splunk Light?

rlorenzon
Explorer

I have a Splunk Light v6.2.3 instance with the Add-on for Nix v5.1.2 running. I have two universal forwarders v6.2.1, identically configured, on two different Red Hat servers running the same operating system. I want to send all the logs from `/var/log/` to Splunk. One server works and one doesn't. I have verified connectivity from both servers. The one that doesn't work is slated to become the production counterpart of the one that does.

etc/system/local/inputs.conf:

[default]
host = host1
index = syslog
disabled = false
[monitor:///var/log]

etc/system/local/outputs.conf:

[tcpout]
defaultGroup = linux-group
disabled = 0
[tcpout:linux-group]
server = ##.##.##.32:514
[tcpout-server://##.##.##.32:514]

Pretty basic config. I have a Windows forwarder which also works fine.

I think I've read every Splunk doc there is and run every diagnostic I could. I've seen posts from others with a similar issue here and have verified every one of the answers marked as correct. There is some mention of associating the host with an index, but those steps only apply to the Enterprise version. I can see data coming from both hosts in the receiver logs; it just doesn't show up in the interface. I've tried using different ports, reinstalling, and various versions just in case. Any help would be greatly appreciated!
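For reference, one way to see why forwarded events never surface is to grep the forwarder's splunkd.log for output-processor messages. A minimal, self-contained sketch (the log lines below are hypothetical samples written to a temp file; on a real install you would read /opt/splunkforwarder/var/log/splunk/splunkd.log instead):

```shell
#!/bin/sh
# Hypothetical sample of forwarder log lines, for illustration only.
LOG=/tmp/splunkd-sample.log
cat > "$LOG" <<'EOF'
01-01-2015 12:00:00.000 INFO  TcpOutputProc - Connected to idx=10.0.0.32:514
01-01-2015 12:00:05.000 WARN  TcpOutputProc - Forwarding to indexer group linux-group blocked
EOF

# TcpOutputProc is the component that ships events to the receiver;
# connection and "blocked" messages from it usually explain missing data.
grep -i 'TcpOutputProc' "$LOG"
```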

1 Solution

rlorenzon
Explorer

Followed this great step-by-step document again:

http://answers.splunk.com/answers/50082/how-do-i-configure-a-splunk-forwarder-on-linux.html

Was failing on step 7 as others have had. I manually created the directory:
/opt/splunkforwarder/etc/apps/search/local

and the inputs.conf file in it with:
[monitor:///var/log]
disabled = false

and it worked!
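The manual fix above can be scripted. A sketch, assuming the stock universal forwarder layout (the demo below writes under /tmp so it can run anywhere; on a real host set SPLUNK_HOME to /opt/splunkforwarder):

```shell
#!/bin/sh
# Demo path; use /opt/splunkforwarder on an actual forwarder host.
SPLUNK_HOME="${SPLUNK_HOME:-/tmp/splunkforwarder}"

# Step 7 workaround: create the search app's local directory by hand
mkdir -p "$SPLUNK_HOME/etc/apps/search/local"

# Drop in the monitor stanza from the answer above
cat > "$SPLUNK_HOME/etc/apps/search/local/inputs.conf" <<'EOF'
[monitor:///var/log]
disabled = false
EOF

# The forwarder only picks up the new input after a restart:
# "$SPLUNK_HOME/bin/splunk" restart
```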



aljohnson_splun
Splunk Employee
Splunk Employee

@rlorenzon glad to hear you figured it out. Go ahead and accept your own answer to close the question. Thanks!

rlorenzon
Explorer

Also, the hostnames are different, as are their GUIDs in etc/instance.cfg - they were clean installations.


rlorenzon
Explorer

Also, just noticed this message in the splunk interface with the hostname of the one that I can't see:

received event for unconfigured/disabled/deleted index='syslog' with source='source::/var/log/dmesg.old' host='host::infoleaf' sourcetype='sourcetype::backup_file' (1 missing total)


aljohnson_splun
Splunk Employee
Splunk Employee

So it sounds like your syslog index is garbled - is there any way you can delete the syslog index and create a new one? That's just what I would do, not necessarily a solution 😛
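For what it's worth, the "received event for unconfigured/disabled/deleted index='syslog'" message above means the receiver simply has no index by that name, so those events are dropped. Splunk Light creates indexes through the web UI; on Splunk Enterprise the equivalent indexes.conf stanza would look roughly like this (the paths shown are the conventional defaults, not taken from this setup):

```
[syslog]
homePath   = $SPLUNK_DB/syslog/db
coldPath   = $SPLUNK_DB/syslog/colddb
thawedPath = $SPLUNK_DB/syslog/thaweddb
```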


satishsdange
Builder

I suspect a problem with your inputs.conf stanza [monitor:///var/log].
Shouldn't it be [monitor:///var/log/*]?


aljohnson_splun
Splunk Employee
Splunk Employee

If you look at the example here, it should be valid without the *.
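For context, a bare directory monitor watches the directory recursively by default, so the wildcard is optional here:

```
# Recursively watches /var/log and all of its subdirectories
[monitor:///var/log]
disabled = false
```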
