Deployment Architecture

Why am I getting errors on a search head with all their indexer IPs?

smdasim
Explorer

Hi Team,

I have configured a simple Splunk cluster environment with 1 search head, 1 cluster master, 1 license master/DS/MC, 3 indexers, and 2 forwarders.

I am not able to get data from the forwarders to the search head. All of the configuration looks fine to me.

Search query:
index=_internal log_level=ERROR

10-08-2018 11:51:50.723 +0000 ERROR TcpInputProc - Message rejected. Received unexpected message of size=1195725856 bytes from src=83.65.96.175:50378 in streaming mode. Maximum message size allowed=67108864. (::) Possible invalid source sending data to splunktcp port or valid source sending unsupported payload.

10-08-2018 11:47:19.866 +0000 ERROR TcpInputProc - Message rejected. Received unexpected message of size=1195725856 bytes from src=177.188.19.79:46014 in streaming mode. Maximum message size allowed=67108864. (::) Possible invalid source sending data to splunktcp port or valid source sending unsupported payload.

Any help will be appreciated.

Thanks and regards,
smdasim


hettervik
Builder

Hi smdasim! Did you find out the exact reason for the error messages? I've encountered the same issue when trying to send uncooked data from my deployment server to my indexer. It works fine when sending the data cooked, but when I add "sendCookedData=false" to outputs.conf, the errors appear immediately.
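
For reference, a rough sketch of the outputs.conf change that triggers it on my side (the group name, hostname, and port are placeholders):

[tcpout]
defaultGroup = my_indexers

[tcpout:my_indexers]
server = idx1.example.com:9997
# Raw (uncooked) data sent to a splunktcp receiving port gets rejected;
# a plain [tcp://<port>] input on the indexer would be needed for raw data.
sendCookedData = false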


ssadanala1
Contributor

Hi,

Please look at the answer below; I hope it answers your question:

https://answers.splunk.com/answers/438218/error-tcpinputproc-message-rejected-received-unexp.html

Please check the TCP listening ports on the indexers and validate that the output config on the forwarder points to the same port.

Make sure that the receiving TCP port is enabled on each indexer.
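
As a rough sketch (the hostnames and port number are just examples), each indexer needs a listening splunktcp input and the forwarder's outputs.conf must point at that same port:

# On each indexer, inputs.conf (or run: splunk enable listen 9997)
[splunktcp://9997]
disabled = 0

# On each forwarder, outputs.conf
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = idx1.example.com:9997,idx2.example.com:9997,idx3.example.com:9997

If the errors persist, also confirm that nothing other than Splunk forwarders (for example an HTTP client or a load balancer health check) is sending to the splunktcp port, since the error message itself points to a possible invalid source on that port.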
