Long story short: is it possible to pass search results through a script to another system (HP's Operations Orchestration, in this case)? I've seen from this link:
That you can pass just about everything through except the actual results. It looks like the best option is to have HP OO read the file that Splunk creates, but that's a gz file, isn't it? I believe OO can read CSV, but I'm not sure about gz.
Any suggestions on how to do this? Here's our use case:
Splunk runs a scheduled search for /32 route-withdrawn messages. This catches circuit bounces at any non-redundant sites and forwards them (currently) to our HP Network Node Manager i-series application. The problem is, all that alert basically says is "Hey, go look at Splunk/your email to see why I alerted."
With HP OO, I can generate an incident directly in HP NNMi, complete with the correct source node and everything. HP OO can also parse the data before it does this, which I'd like it to do, but it needs the data in the first place.
Here is a simple Python script that can be used to process the results:
```python
#!/usr/bin/python
import sys
import gzip
import csv

# Splunk passes eight arguments to an alert script; the eighth is the
# path to the gzipped CSV file containing the search results.
count, search, fq_search, title, reason, url, not_used, result_file = sys.argv[1:9]

with gzip.open(result_file, "rt") as f:
    for row in csv.DictReader(f):
        # process result here - use row['fieldname'] to access the fields
        pass
```
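To wire a script like this up, it is saved under $SPLUNK_HOME/bin/scripts and referenced from the saved search. A sketch of the savedsearches.conf stanza (the stanza name here is hypothetical; match it to your actual saved search):

```
[route_withdrawn_alert]
action.script = 1
action.script.filename = process_results.py
```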
We've done this to get traps to our fault management system - EMC Smarts.
We execute a Perl script that references $8, which is the results file path. We do a zcat to expand the file, then use the Text::xSV module to parse the resulting CSV so we can break the fields into variables that map to varbinds passed as part of the net-snmp sendtrap command.
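The same zcat-and-parse step can be done in pure Python without shelling out, since the results file is just a gzipped CSV. A minimal sketch (the field names `host` and `message` are invented for illustration; the demo writes its own sample file so it runs standalone):

```python
import csv
import gzip
import os
import tempfile

def parse_results(result_file):
    """Read a gzipped Splunk results CSV and return a list of row dicts."""
    # gzip.open in text mode replaces the zcat | Text::xSV pipeline
    with gzip.open(result_file, "rt") as f:
        return list(csv.DictReader(f))

# Self-contained demo: write a fake results file, then parse it back.
sample = "host,message\nrouter1,route withdrawn\n"
tmp = os.path.join(tempfile.mkdtemp(), "results.csv.gz")
with gzip.open(tmp, "wt") as f:
    f.write(sample)

rows = parse_results(tmp)
print(rows[0]["host"])  # -> router1
```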
Your script can open and read the file, take whatever results or fields it needs, and pass them on to HP OO using whatever mechanism is available. Note that you're given a file because there is no way for us to know ahead of time how many results there will be or which fields they will contain.
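If HP OO exposes an HTTP entry point for invoking flows, one such mechanism is a small POST per result row. This is only a sketch: the URL, flow path, and input names below are hypothetical and would need to match your OO installation.

```python
import json
import urllib.request

# Hypothetical endpoint; the real invocation URL and flow inputs depend
# on your HP OO version and flow configuration.
OO_URL = "https://oo-server:8443/flows/invoke/MyIncidentFlow"

def build_payload(row):
    # Map Splunk result fields to flow inputs (names are invented).
    return {"sourceNode": row.get("host", ""), "reason": row.get("message", "")}

row = {"host": "router1", "message": "route withdrawn"}
payload = json.dumps(build_payload(row)).encode("utf-8")
req = urllib.request.Request(OO_URL, data=payload,
                             headers={"Content-Type": "application/json"})
# urllib.request.urlopen(req)  # uncomment to actually send the request
print(payload.decode())
```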