SNMP Splunk MA App for Netcool is not sending traps

HansK
Path Finder

We have installed "SNMP Splunk MA App for Netcool" on a new search head and linked the search head to the indexers.

An alert with the trigger action "Netcool Custom Modular Alert" has been created, and all fields have been filled in.

We see a log entry in /opt/splunk/var/log/splunk/netcool_custom_modular_alert.log:
2019-01-28 07:59:02,230 INFO START
2019-01-28 07:59:02,230 INFO splunkapp:search, splunksearch:test_xxxxxx, snmp_serverip:172.22.171.164, snmp_port:162, snmp_community:xxxxx, snmp_hostname:, snmp_alertmessage:More than 1 release cause 5XX in last 30 minutes for customer xxxx xxxxx, snmp_severity:5, splunk_escalation:xxxxxxxx, splunk_payload:{u'configuration': {u'hostname': u'', u'enterpriseSNMPSpecificObjectID': u'9', u'customtext': u'', u'AlertKey': u'123456789', u'community': u'public', u'alertmessage': u'More than 1 release cause 5XX in last 30 minutes for customer xxxxxxxxxt', u'enterpriseSNMPObjectID': u'1.2.3.4.5.6.7.8', u'enterpriseSNMPSpecificTrapID': u'10', u'serverip': u'172.22.171.164:162', u'escalation': u'xxxxxxxxx', u'severity': u'5'}, u'results_link': u'http://xxxxxxxxx:8000/app/search/search?q=%7Cloadjob%20scheduler__admin__search__RMD510cd368a33d67d8...', u'server_uri': u'https://127.0.0.1:8089', u'results_file': u'/opt/splunk/var/run/splunk/dispatch/scheduler_adminsearchRMD510cd368a33d67d83_at_1548658740_9899/per_result_alert/tmp_0.csv.gz', u'result': {u'count(Q21_sip_dest_respcode)': u'2'}, u'sid': u'scheduleradminsearch_RMD510cd368a33d67d83_at_1548658740_9899', u'search_name': u'test_xxxxxxxl', u'server_host': u'xxxxxxxxxxxxx', u'search_uri': u'/servicesNS/nobody/search/saved/searches/test_xxxxxxx', u'session_key': u'd0X5lh5dXc7S9uTW^82E4eQ1l9z6jQpVjKhTm3xczVgILSEZjkVRvHf6z2QXOvv9MR197IjzD5_50uJ0anuIwvZwuYFGTcSmBuuI^L9QsYNPmwZKFplYgJPy8VbVPC^i1W82Gfvt8FY', u'app': u'search', u'owner': u'admin'}, splunk_customtext:
2019-01-28 07:59:02,272 INFO STOP

We receive nothing at the destination IP 172.22.171.164:162.

Also, nothing is seen on the wire with tcpdump (other SNMP-sending processes on this server do work and are seen on the wire):
tcpdump -i any -s 0 host 172.22.171.164 and port 162

HansK
Path Finder

Well, I found the answer:
In the alert it says "Specify Your Unique Enterprise OID. Example: 1.2.3.4.5.6.7.8".

When using this sample OID, nothing gets sent out; when you change it to a valid OID such as 1.3.6.1.2.1.43.18.2.0.1, it starts working.
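
In case anyone wants to reproduce this outside the app: below is a minimal Python sketch using pysnmp (just a guess that the app does something pysnmp-like internally; the target, community, and OID are the values from this thread). It sends a single SNMPv2c trap you can watch for with the tcpdump line from the question.

from pysnmp.hlapi import (
    SnmpEngine, CommunityData, UdpTransportTarget, ContextData,
    NotificationType, ObjectIdentity, sendNotification,
)

# Send one SNMPv2c trap (mpModel=1 selects v2c) to the Netcool probe.
errorIndication, errorStatus, errorIndex, varBinds = next(
    sendNotification(
        SnmpEngine(),
        CommunityData('public', mpModel=1),
        UdpTransportTarget(('172.22.171.164', 162)),
        ContextData(),
        'trap',
        # The valid OID from above; the sample 1.2.3.4.5.6.7.8 was the one
        # that never left the box when used in the alert configuration.
        NotificationType(ObjectIdentity('1.3.6.1.2.1.43.18.2.0.1')),
    )
)
if errorIndication:
    print(errorIndication)  # local send errors only; traps are unconfirmed

If this trap shows up in tcpdump while the alert still sends nothing, the problem is inside the app rather than in the host's network stack.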

wluca
New Member

Well done, HansK!

petom
Path Finder

Is there a firewall between your search head and the Netcool SNMP probe?
Either a local OS firewall (iptables) or an external one in the path between the search head and the probe.
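
If you want to rule out the path without SNMP in the mix at all, a quick sketch like the one below (plain Python, just firing one UDP datagram at the probe port from the search head) should show up in the same tcpdump capture; the IP and port are the ones from your post.

import socket

# Fire a single UDP datagram at the probe. UDP is connectionless, so a clean
# send only proves the packet left this host's stack - check tcpdump on both ends.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(b'path-test', ('172.22.171.164', 162))
sock.close()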

HansK
Path Finder

There are no firewalls.
But the problem is that nothing is leaving the machine:
If I send an snmptrap from the CLI (snmptrap -c public -v 2c 172.22.171.164 "" 1.3.6.1.2.1.43.18.2.0.1), it gets caught by: tcpdump -i any -s 0 host 172.22.171.164 and port 162

None of the alerts using the Netcool app send anything; although log entries do appear in /opt/splunk/var/log/splunk/netcool_custom_modular_alert.log, tcpdump locally sees nothing going out.

HansK
Path Finder

OS: CentOS 7
Splunk version: 7.2.3
