Hello Splunk team...
I am facing this issue whenever we run searches on my Splunk setup. Can you help me figure out how we can fix this?
02-29-2024 06:58:53.370 ERROR DispatchThread [4125 phase_1] - code=10 error=""
02-29-2024 06:58:53.370 ERROR ResultsCollationProcessor [4125 phase_1] - SearchMessage orig_component=ResultsCollationProcessor sid=1709189933.399443_**** message_key=DISPATCHCOMM:PEER_PIPE_EXCEPTION__%s message=Search results might be incomplete: the search process on the peer: ended prematurely. Check the peer log, such as $SPLUNK_HOME/var/log/splunk/splunkd.log and as well as the search.log for the particular search.
Thank you..
@kiran_panchavat...
Sorry for the late reply... we tried checking all the steps: peer logs, license, OOM issues. We could not find anything wrong; everything looked good.
So we tried a rolling restart of the SH cluster... that fixed the issue and the errors were gone 🙂
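(For anyone landing here later: on a search head cluster the rolling restart is typically kicked off via the CLI, roughly as sketched below; check the docs for your version and your install path.)
# Typically run on the SHC captain; assumes a standard $SPLUNK_HOME
$SPLUNK_HOME/bin/splunk rolling-restart shcluster-members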
Thank you for your time helping with this.
The error message you provided indicates that search results might be incomplete because the search process ended prematurely on the peer. Here are some things to check:
Check Peer Logs:
Look into the peer log files, specifically:
$SPLUNK_HOME/var/log/splunk/splunkd.log
The search.log for the particular search.
Examine these logs for any relevant error messages or clues about what caused the premature termination.
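As a rough sketch (assuming a Linux peer with $SPLUNK_HOME set and the default dispatch location):
# On the affected peer: recent errors around the time of the failed search
grep -iE "error|fatal|crash" $SPLUNK_HOME/var/log/splunk/splunkd.log | tail -50
# The per-search search.log sits in the dispatch directory, named after the sid
# from the error message (replace <sid> with your sid; on an indexer peer the
# directory may be prefixed, e.g. remote_<searchhead>_<sid>)
less $SPLUNK_HOME/var/run/splunk/dispatch/<sid>/search.log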
Memory and Resource Constraints:
Ensure that the peer has sufficient resources (CPU, memory, disk space) to handle the search workload.
Sometimes, insufficient resources can lead to premature search process termination.
Consider monitoring system resource usage during search execution.
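A minimal sketch with standard Linux tools (assumes $SPLUNK_HOME is set in your shell):
# Top memory consumers while the search is running
ps aux --sort=-%mem | head -15
# Free disk space on the Splunk volume(s)
df -h $SPLUNK_HOME
# Memory/CPU pressure sampled every 5 seconds, 3 samples
vmstat 5 3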
License Considerations:
If you’re using a trial Splunk Enterprise distributed deployment, each instance must use its own self-generated Enterprise Trial license.
In contrast, a distributed deployment running a Splunk Enterprise license requires configuring a license master to host all licenses.
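A quick, hedged way to check for license trouble on an instance (command names per the Splunk CLI docs; adjust if your version differs):
# License-related warnings/errors in splunkd.log
grep -i licen $SPLUNK_HOME/var/log/splunk/splunkd.log | grep -iE "warn|error" | tail -20
# List the licenses this instance knows about
$SPLUNK_HOME/bin/splunk list licenses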
Check for OOM Killer Events:
Review /var/log/messages on the peer for any Out-of-Memory (OOM) Killer events.
Insufficient memory can cause processes to terminate unexpectedly.
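For example, on the peer (log file name varies by distro: /var/log/messages on RHEL/CentOS, /var/log/syslog on Debian/Ubuntu):
# Look for OOM killer activity around the time of the failed search
grep -iE "out of memory|oom-killer" /var/log/messages
# Or check the kernel ring buffer with human-readable timestamps
dmesg -T | grep -iE "out of memory|oom"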
Increase ulimits for Open Files:
If you haven’t already, consider increasing the ulimits for open files on the indexers.
For example, set the ulimit to the recommended 64000 (initially it might be set to 4096).
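To check and raise the limit (a sketch assuming Splunk runs as a "splunk" user on Linux; under systemd, set LimitNOFILE in the unit file instead):
# Limit actually seen by the running splunkd process
cat /proc/$(pgrep -o splunkd)/limits | grep -i "open files"
# Persist 64000 in /etc/security/limits.conf (then restart Splunk):
#   splunk  soft  nofile  64000
#   splunk  hard  nofile  64000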
Review Configuration:
Verify that the configuration of your search head, indexers, and forwarders is correct.
Ensure that the search head can communicate with the peer properly.
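Two quick hedged checks from the search head (replace <peer-host> with your indexer; 8089 is the default management port):
# Effective distributed search settings and which .conf file each comes from
$SPLUNK_HOME/bin/splunk btool distsearch list --debug
# Verify the search head can reach the peer's management port
nc -vz <peer-host> 8089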
Remember to investigate the specific details in the logs to pinpoint the root cause. If you encounter any specific error messages or need further assistance, feel free to share additional details.