I have a bunch of stored procedures in SQL which need to run at a particular interval and return results (they basically look for issues in different SQL tables). I need to create an alert based on the collated results of all the stored procedures and send it to a group of people. I was thinking of using Splunk DB Connect for this, but it seems it is currently not meant for a distributed architecture (as it runs from the search head directly). Also, since our database tables are high-transaction tables, it might lead to scalability issues.
The other option I could think of is to run the stored procedures in sequence and forward the XML results to the indexers. When firing the alerts, we could use xml-kv to extract fields and use multisearch in Splunk 5 to collate the different results and fire alerts based on timestamp. Is this the correct approach?
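For what it's worth, here is a rough sketch of what that collating alert search might look like, assuming the XML results land in a hypothetical index called `dbchecks` with one sourcetype per stored procedure (all of the index, sourcetype, and field names below are made up for illustration; the SPL command for XML field extraction is `xmlkv`):

```
| multisearch
    [ search index=dbchecks sourcetype=sproc_orders    | xmlkv ]
    [ search index=dbchecks sourcetype=sproc_inventory | xmlkv ]
| bin _time span=15m
| stats count(eval(status="ERROR")) AS errors BY _time, sourcetype
| where errors > 0
```

Saved as a scheduled search with an alert condition of "number of results > 0", this would collate the different procedures' results by timestamp as described. Note that each `multisearch` subsearch must use only streaming commands, which `xmlkv` is.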
Is there any other example/app out of the box which I can use for my requirement? Any other approach would also be helpful, preferably one that is not expensive, as I have read that xml-kv is an expensive (load-wise) operation.
I realize this post hasn't been active for a while, but in case anyone else is looking for similar answers, try this: http://apps.splunk.com/app/1538. There are explanations on how to use the Oracle UTL_TCP functions to send data directly to Splunk.
In this particular case, if generating alerts is the only objective, then it may make sense to generate them directly out of Oracle using the UTL_SMTP package. If there are other reasons to index the data (like reporting on historical trends in alerts or transaction rates), then generating the alerts out of Splunk makes more sense.
Splunk DB Connect can run on a search head pool and send data to the indexers. We are doing this currently. Also, you do not need to actually index data from a database to create alerts: DB Connect can simply search against a database (as if the data were residing on the indexers) and alert just the same.
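To illustrate the search-without-indexing approach, DB Connect (1.x) provides a `dbquery` search command that queries the database at search time, so a scheduled alert search can run the check directly; the connection name and procedure below are hypothetical, and the exact syntax for invoking a stored procedure depends on your database and DB Connect version:

```
| dbquery my_db_connection "EXEC dbo.usp_check_order_issues"
| where issue_count > 0
```

Schedule this with an alert condition of "number of results > 0" and nothing ever gets indexed; only the query load on the database at each scheduled run matters.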