I'm having a problem where I have 5 indexers and 1 search head. All 5 show up as search peers under distributed search. I've verified through metrics.log that the fifth indexer is receiving data. When I perform a search, however, I only see events from 4 of the indexers. I ran "index=* | stats count by splunk_server" and again only 4 indexers plus the search head showed up. Has anyone seen this issue before? Thank you in advance for any help or direction that can be provided.
Well, I feel like a goof. It turns out the missing peer is actually a heavy forwarder (it has an outputs.conf). I don't understand why it shows up as a search peer and in SoS, but you learn something every day. Thanks to everyone who took the time to respond.
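For anyone else hitting this: the presence of a tcpout stanza in outputs.conf is what turns a full Splunk instance into a heavy forwarder — parsed data is shipped on to the indexing tier rather than kept in local indexes, so the box holds no events for searches to return. A minimal sketch (hostnames and group name here are hypothetical, not from this thread):

```ini
# outputs.conf on the "indexer" that was really a heavy forwarder.
# A tcpout group like this sends all data onward; with no local
# indexing enabled, searches against this peer find nothing.
[tcpout]
defaultGroup = indexers

[tcpout:indexers]
server = idx1.example.com:9997
```

That explains the symptom exactly: the instance still joins distributed search as a peer and reports to SoS, but it has no indexed data of its own.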
If you run this search, how many peers return a count?
index=_internal earliest=-5m@m | stats count by splunk_server
This should return events from all your indexers, and, if your search head and other component boxes are configured to forward their internal logs, from those as well.
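For reference, forwarding a search head's internal logs to the indexers is done with an outputs.conf on the search head; a minimal sketch, assuming hypothetical indexer hostnames and ports:

```ini
# outputs.conf on the search head: send _internal (and all other local
# data) to the indexing tier instead of indexing it locally.
[tcpout]
defaultGroup = primary_indexers
indexAndForward = false

[tcpout:primary_indexers]
server = idx1.example.com:9997, idx2.example.com:9997
```

With that in place, the search head itself shows up in the "stats count by splunk_server" results alongside the indexers.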
If you're missing an indexer, try running the same search from the cluster master. If that doesn't return results, I would try rejoining the indexer to the cluster.
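Rejoining a peer can be done from the Splunk CLI on the indexer itself; a sketch, assuming a hypothetical cluster master URI and pass4SymmKey ("slave" is the legacy mode name for an indexer cluster peer):

```shell
# On the problem indexer: point it back at the cluster master, then restart.
splunk edit cluster-config -mode slave \
    -master_uri https://cluster-master.example.com:8089 \
    -secret yourclustersecret
splunk restart
```

Check splunkd.log afterwards to confirm the peer registers with the master.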
Are there any license errors recorded for this indexer? Check your license server console.
A couple of questions:
Is the indexer in question up and running?
Has it ever reported data back?
If not, is it actually receiving data from your forwarders? Check outputs.conf on the forwarders to confirm the indexer's name/IP is listed correctly.
It is up and running. I'm not sure whether it has reported in the past, as I'm new to this system. It is receiving about 20% of the data; I verified this through metrics.log and SoS.