Getting Data In

Forwarder not indexing

rcovert
Path Finder

I have one Linux indexer and 2 Linux forwarders. I just moved my indexer to a new server and have everything set up again. I changed the receiving server in both of my forwarders in /opt/splunkforwarder/etc/system/local/outputs.conf to point to the new IP address.
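For reference, a minimal outputs.conf pointing a forwarder at a receiver looks roughly like this (the group name, IP, and port here are placeholders, not the actual config from this setup):

```conf
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = 192.0.2.10:9997
```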

In the Deployment monitor app, I see both forwarders and it looks like data is coming in from both of them. But, when I look in the search app, it is not showing data coming from one of the forwarders under hosts. Any ideas?
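One quick way to confirm which hosts are actually landing in the indexes is a generic per-host count search (sketch only, run over a recent time range):

```spl
index=* | stats count by host
```

If the missing forwarder's hostname doesn't appear at all, the data is being dropped before indexing rather than just hidden in the search app.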

0 Karma
1 Solution

rcovert
Path Finder

I found the answer. Grr..

I had an extra space between the ":" and the IP address of the indexer in outputs.conf.
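For anyone hitting the same error: a stray space in the server value keeps it from parsing as host:port. Illustrative only, since the actual stanza and IP weren't posted:

```conf
[tcpout:primary_indexers]
# Broken -- stray space around the ":" in the host:port value
server = 192.0.2.10: 9997

# Fixed
server = 192.0.2.10:9997
```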

View solution in original post

0 Karma


rcovert
Path Finder

This is being repeated in the splunkd.log on the forwarder:

06-05-2012 14:22:35.044 -0400 ERROR pipeline - Runtime exception in pipeline: parsing, processor: tcp-output-light-forwarder, error: vector::_M_range_check
06-05-2012 14:22:35.044 -0400 ERROR splunklogger - Uncaught exception in pipeline execution (tcp-output-light-forwarder) - getting next event

index="_internal" source="/Applications/Splunk/splunk/var/log/splunk/splunkd.log" shows 0 results.

0 Karma

sdaniels
Splunk Employee

Are you seeing anything in your splunkd log?
/var/log/splunk

or in the UI via this search

index="_internal" source="/Applications/Splunk/splunk/var/log/splunk/splunkd.log"
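Side note: that source path is the macOS default. If the forwarders are Linux universal forwarders installed in the default location, their internal logs are forwarded with the Linux path as the source, so a search like this may match instead (path assumes the default /opt/splunkforwarder install):

```spl
index="_internal" source="/opt/splunkforwarder/var/log/splunk/splunkd.log" log_level=ERROR
```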

0 Karma