Using Splunk 5.0.1 on a Windows box.
My experience is that Oracle creates many, many of these little XML audit files. We currently have over 1.3 billion events, accumulated over the course of a year or so. Our Splunk installation has had no problem keeping up with tens of thousands of files per day across multiple database servers.
When there has been a hiccup in service for whatever reason, the system has caught up in less than a day.
On the indexer I have modified inputs.conf with batch stanzas using the "move_policy = sinkhole" option, so that the files don't fill up the audit directory I have configured in Oracle; instead, Splunk reads each XML file and then deletes it (see the sketch below). You will notice lots of error messages in, say, the S.O.S. app, because Splunk will try to access XML files while Oracle is still in the process of writing them. This is normal. We have had a few instances where Splunk successfully removed a file before Oracle was done with it, but that is relatively rare.
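For reference, a minimal sketch of what such a batch stanza can look like in inputs.conf. The path, sourcetype, and index names here are placeholders for illustration, not our actual values; substitute your own Oracle audit dump directory and naming:

    [batch://D:\oracle\admin\ORCL\adump]
    # Sinkhole: Splunk indexes each file once, then deletes it
    move_policy = sinkhole
    # Only pick up the XML audit files
    whitelist = \.xml$
    sourcetype = oracle:audit:xml
    index = oracle_audit
    disabled = 0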
Performance, again, depends on your database activity. We index about 1 GB per day, and reporting can be sluggish. We ended up moving all the logins and logouts to a separate index (a sketch of that routing follows).
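Splunk can do that kind of routing at index time with a props.conf/transforms.conf pair. Below is a minimal sketch, reusing the hypothetical sourcetype and index names from the stanza above and assuming Oracle's standard audit action codes (100 = LOGON, 101 = LOGOFF); verify the codes and element names against your own audit files, and note that the target index must already exist in indexes.conf:

    # props.conf
    [oracle:audit:xml]
    TRANSFORMS-route_logons = route_oracle_logons

    # transforms.conf
    [route_oracle_logons]
    # Match logon/logoff audit records and send them to their own index
    REGEX = <Action>10[01]</Action>
    DEST_KEY = _MetaData:Index
    FORMAT = oracle_logons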
For reporting, we specify the sourcetype in the batch stanza mentioned above. There are a number of ways you can handle XML input; I have simply set up a few custom field extractions (sketched below).
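As a rough illustration of that approach, here are a couple of regex-based extractions in props.conf. The element names (<DB_User>, <Action>, <Returncode>) come from a typical Oracle XML audit record but can vary by Oracle version, so treat them as assumptions and check them against your own files; setting KV_MODE = xml on the sourcetype is another option if you want Splunk's automatic XML field extraction instead:

    # props.conf
    [oracle:audit:xml]
    # Pull a few common fields out of each AuditRecord
    EXTRACT-db_user    = <DB_User>(?<db_user>[^<]+)</DB_User>
    EXTRACT-action     = <Action>(?<action>\d+)</Action>
    EXTRACT-returncode = <Returncode>(?<returncode>\d+)</Returncode>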
In the end I would prefer that Oracle could write to a custom Windows event log, set to overwrite itself when it fills up. Unfortunately, when the Windows log option is set, Oracle just writes to the Application log. At our volume that log overwrites itself so quickly that all you see is Oracle logon/logoff data, and little else of consequence in a Windows environment.
Just my 2 cents.