SNMP message serialization error snmp_stanza

jadengoho
Builder

Just want to ask: why am I getting this error using the SNMP modular input, and how can I solve it?
I am trying to poll the data.
Is it really not getting data, or is it an error in my configuration of the testingpolling_* stanzas?

    05-09-2018 22:19:04.989 -0400
    ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/snmp_ta/bin/snmp.py" SNMP message serialization error snmp_stanza:snmp://testingpolling_1
        host = splunk_hf1
        source = /opt/splunk/var/log/splunk/splunkd.log
        sourcetype = splunkd

    05-09-2018 22:18:01.756 -0400
    ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/snmp_ta/bin/snmp.py" No SNMP response received before timeout snmp_stanza:snmp://testingpolling_2

    05-09-2018 22:18:04.891 -0400
    ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/snmp_ta/bin/snmp.py" SNMP message serialization error snmp_stanza:snmp://testingpolling_2
        host = splunk_hf1
        source = /opt/splunk/var/log/splunk/splunkd.log
        sourcetype = splunkd
1 Solution

xpac
SplunkTrust

It simply means that you're not getting any response to your poll attempt.
This can be due to a bad config on your side or on the other side, a firewall issue, any other networking issue, and so on.

You might be able to troubleshoot this using tcpdump and watching for any kind of response from the target system(s), but this is almost certainly not a Splunk issue (unless you configured the wrong IPs/hostnames, of course).
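To reproduce the "No SNMP response received before timeout" symptom outside of Splunk, you can also send a single SNMP request by hand and see whether anything comes back. Below is a rough stdlib-only Python sketch; it is not the snmp_ta code, and the hand-encoded SNMPv2c GetRequest for sysDescr.0 with the "public" community string is an assumption — adjust the host, port, and community for your environment:

```python
import socket

def build_get_request(community=b"public"):
    """Hand-encode a minimal SNMPv2c GetRequest for sysDescr.0 (1.3.6.1.2.1.1.1.0)."""
    oid = bytes([0x06, 0x08, 0x2B, 0x06, 0x01, 0x02, 0x01, 0x01, 0x01, 0x00])
    varbind = bytes([0x30, len(oid) + 2]) + oid + bytes([0x05, 0x00])  # OID + NULL value
    varbindlist = bytes([0x30, len(varbind)]) + varbind
    pdu_body = (bytes([0x02, 0x01, 0x01])     # request-id = 1
                + bytes([0x02, 0x01, 0x00])   # error-status = 0
                + bytes([0x02, 0x01, 0x00])   # error-index = 0
                + varbindlist)
    pdu = bytes([0xA0, len(pdu_body)]) + pdu_body                  # GetRequest-PDU
    body = (bytes([0x02, 0x01, 0x01])                              # version = 1 (v2c)
            + bytes([0x04, len(community)]) + community
            + pdu)
    return bytes([0x30, len(body)]) + body                         # outer SEQUENCE

def probe(host, port=161, timeout=5.0):
    """Return True if *any* UDP datagram comes back before the timeout expires."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        sock.sendto(build_get_request(), (host, port))
        sock.recvfrom(4096)
        return True
    except OSError:
        # Timeout or unreachable network -- the same symptom the TA logs as
        # "No SNMP response received before timeout".
        return False
    finally:
        sock.close()
```

If `probe("your.device.ip")` returns False while snmpwalk from another box works, the problem is almost certainly a firewall or routing issue between this host and the device, not the Splunk config.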


Damien_Dallimor
Ultra Champion

I'd guess that it's most likely a network/firewall blocking issue.


jadengoho
Builder

Yes, I think so, because it's a secured network and some ports/addresses are not allowed to be accessed. Thanks for the answer 🙂



jadengoho
Builder

Can you tell me how to do the tcpdump?


xpac
SplunkTrust
SplunkTrust

Start with tcpdump -nn -i yourinterfacehere host 1.2.3.4, where 1.2.3.4 is the IP address of the system you're trying to poll.
Try it with that; there are also plenty of docs on tcpdump available on the net. Good luck!


jadengoho
Builder

Thank you very much. Let me run tcpdump and snmpwalk to check what the real problem is. Thanks 🙂


jadengoho
Builder

Now I'm getting more of this message:

    ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/snmp_ta/bin/snmp.py" No SNMP response received before timeout snmp_stanza:snmp://testingpolling_2

pgadhari
Builder

@jadengoho - were you able to resolve the above issue regarding "No response"? I am getting the same issue - can you tell me what you did to resolve it?


jadengoho
Builder

Yes - apparently this was a network firewall and connectivity issue.
I asked our network team to solve it, and they adjusted the network so that the server and the Splunk forwarder could communicate.
