All Apps and Add-ons

Errno 32 Broken pipe in Hydra worker log

Smile172
Explorer

Hi,

I have set up the app with a heavy forwarder, but data only comes in intermittently: sometimes it arrives and sometimes it doesn't.
I found these errors in the Hydra worker's log:
ERROR [ta_ontap_collection_worker://gamma:6012] [ProcessorPerfHandler] Problem collecting processor from server=netapp.domain : [Errno 32] Broken pipe
Traceback (most recent call last):
File "/opt/splunk/etc/apps/Splunk_TA_ontap/bin/ta_ontap/handlers.py", line 93, in runPerf
time=datetime.datetime.fromtimestamp(int(results[object_name]['timestamp'])))
File "/opt/splunk/etc/apps/SA-Hydra/bin/hydra/init.py", line 226, in sendData
self.out.flush()
IOError: [Errno 32] Broken pipe
ERROR [ta_ontap_collection_worker://gamma:6012] [MegaPerfHandler] failed sub job run on sub_handler= server=netapp.domain
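
For context, [Errno 32] means the worker flushed buffered data into a pipe or socket whose other end had already closed. The snippet below is a minimal sketch (not the TA's actual code) that reproduces the same class of error with a local TCP connection:

    import socket
    import time

    # Local listener standing in for the receiving end of the worker's output.
    listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener.bind(("127.0.0.1", 0))
    listener.listen(1)
    port = listener.getsockname()[1]

    sender = socket.create_connection(("127.0.0.1", port))
    conn, _ = listener.accept()
    conn.close()                      # the receiving side goes away

    out = sender.makefile("w")        # buffered writer, like self.out in sendData()
    out.write("event data\n")
    out.flush()                       # usually still succeeds
    time.sleep(0.2)                   # give the kernel time to see the peer's reset
    out.write("more event data\n")
    out.flush()                       # raises IOError/BrokenPipeError: [Errno 32] Broken pipe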

Forwarder's OS version: Ubuntu 14.04 LTS

Thank you!

1 Solution

Smile172
Explorer

halr9000, thank you for the answer!

After I disabled IPv6 on the heavy forwarder, the error messages disappeared and the data now flows in seamlessly.
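
For reference (not part of the original answer): on Ubuntu 14.04, a common way to disable IPv6 system-wide is through sysctl, using the standard kernel settings below; adapt them to your own environment before applying.

    # add to /etc/sysctl.conf
    net.ipv6.conf.all.disable_ipv6 = 1
    net.ipv6.conf.default.disable_ipv6 = 1
    net.ipv6.conf.lo.disable_ipv6 = 1

    # then reload the kernel settings without a reboot
    sudo sysctl -p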


Smile172
Explorer

Sometimes I saw data in the NetApp app and sometimes not. At first the heavy forwarder had two addresses, one IPv4 and one IPv6. I disabled IPv6 to force it to bind to the IPv4 address, and the connection was restored 🙂
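
To illustrate why this matters (a sketch, not from the thread): when a host name resolves to both an IPv6 and an IPv4 address, many clients try the IPv6 result first, so a host that is only reachable over IPv4 can fail intermittently. You can check what a name resolves to with a few lines of Python; the host name and port below are placeholders, so substitute your own:

    import socket

    # List every address family the name resolves to; on a dual-stack setup
    # you will typically see an IPv6 entry ahead of the IPv4 one.
    for family, socktype, proto, canonname, sockaddr in socket.getaddrinfo(
            "netapp.domain", 443, 0, socket.SOCK_STREAM):
        label = "IPv6" if family == socket.AF_INET6 else "IPv4"
        print(label, sockaddr[0])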


halr9000
Motivator

Glad that worked out. What led you to that solution? I think some more detail in this answer would be helpful to others who may come after you with the same issue. TIA


halr9000
Motivator

Can you try this on RHEL or CentOS? Those are the only supported platforms for a DCN according to the docs (http://docs.splunk.com/Documentation/NetApp/latest/DeployNetapp/Platformandhardwarerequirements#Splu...). If it's still broken there, then open a support case so that we can get your diag logs.
