I have a very large Oracle database table that is being used as a log sink for an application with high transaction throughput. I would like to get the data in this table (not metadata about the table) into Splunk as close to real time as possible. Unfortunately, I do not have access to the application source, so I cannot add a log sink that writes directly to Splunk. I realize I could read the data and move it over in batches, but I'm wondering whether there are any less-intensive options, such as transaction log replication. What is the best practice for moving this type of data?
AFAIK, the only method Splunk has for reading a SQL database is the DB Connect app. It can tail a table using a rising column (a sequence ID or timestamp), polling on an interval and checkpointing the last value it saw, which gets you near real time without repeated full-table reads.
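If DB Connect isn't an option in your environment, you can script the same rising-column pattern yourself. Below is a minimal sketch, not DB Connect itself: it assumes a hypothetical `APP_LOG` table with a monotonically increasing `LOG_ID` column, the `python-oracledb` driver, and a Splunk HTTP Event Collector (HEC) endpoint. All connection details and names are placeholders.

```python
# A minimal sketch of the rising-column pattern, NOT DB Connect itself.
# Assumptions (placeholders, not from the question): a table APP_LOG with a
# monotonically increasing LOG_ID column, the python-oracledb driver, and a
# Splunk HTTP Event Collector (HEC) endpoint with a valid token.
import json
import time

import oracledb  # pip install oracledb (successor to cx_Oracle)
import requests  # pip install requests

ORACLE_DSN = "dbhost:1521/ORCLPDB1"  # placeholder connect string
HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"  # placeholder token


def poll_once(conn, last_id, batch_size=1000):
    """Fetch rows newer than last_id and forward them to Splunk HEC."""
    with conn.cursor() as cur:
        cur.execute(
            "SELECT log_id, log_time, message FROM app_log "
            "WHERE log_id > :last_id ORDER BY log_id "
            "FETCH FIRST :batch_size ROWS ONLY",
            last_id=last_id,
            batch_size=batch_size,
        )
        rows = cur.fetchall()
    if not rows:
        return last_id
    # HEC accepts a batch of newline-delimited JSON events in one request.
    payload = "\n".join(
        json.dumps(
            {
                "time": log_time.timestamp(),
                "event": {"log_id": log_id, "message": message},
            }
        )
        for log_id, log_time, message in rows
    )
    resp = requests.post(
        HEC_URL,
        data=payload,
        headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return rows[-1][0]  # highest log_id shipped becomes the new checkpoint


def main():
    conn = oracledb.connect(user="loguser", password="secret", dsn=ORACLE_DSN)
    last_id = 0  # in practice, persist this checkpoint across restarts
    while True:
        last_id = poll_once(conn, last_id)
        time.sleep(5)  # polling interval: trade latency against DB load


if __name__ == "__main__":
    main()
```

The index on the rising column is what keeps this cheap: each poll should be an index range scan rather than a scan of the whole table. DB Connect's rising-column input works on the same principle, with the checkpointing handled for you.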