I have used other NMS systems to pipe data into Splunk before for historical reporting. I am sure it can be done with Tivoli as well, but I am not sure where to start.
Right now, the use case is to index interface statistics into Splunk for reporting.
If anyone can help it would be AWESOME.
The Splunk for Tivoli Netcool App is designed to forward all data from the ObjectServer into Splunk. Polls, traps, syslog, CORBA, socket, TL1, and the myriad other probes that feed the ObjectServer can all be Splunk'd. After the events have been indexed, you can set your retention policy and archive old data as outlined here: http://docs.splunk.com/Documentation/Splunk/latest/Indexer/Automatearchiving
In a nutshell, you need to install the app on your Splunk platform, install the Splunk Universal Forwarder (UF) and Technology Add-ons onto your ObjectServer host, and configure the flat file gateway (nco_g_file) to write the events out to a file the forwarder can monitor.
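For the forwarder side of that, a minimal `inputs.conf` monitor stanza on the UF would look something like this. Note the file path, sourcetype, and index names below are assumptions for illustration — use whatever path your nco_g_file gateway actually writes to and the sourcetype the app's documentation specifies:

```
# inputs.conf on the Universal Forwarder running on the ObjectServer host.
# Path and sourcetype are hypothetical -- match them to your nco_g_file
# output file and the Splunk for Tivoli Netcool App's expected sourcetype.
[monitor:///opt/IBM/tivoli/netcool/omnibus/log/netcool_events.log]
sourcetype = netcool
index = netcool
disabled = false
```

Restart the forwarder after adding the stanza, and confirm the events arrive with a search like `index=netcool | head 10` before building any reports on top of them.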