I need to output 65 million rows to a database table, and I see the default per transaction is 50K. Is there a good way to do this?
Streaming has no limit, but if you're not in streaming mode you have a 50k row limit.
How much memory do you have? Please open a bug if it's at or over the recommended minimum (12 GB, IIRC).
Export the data from Splunk into a CSV file. (You could split it into 1-million-record chunks for better performance.)
Then load it using native DB commands, as sketched below.
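Something like this, for example (the search, file names, and paths here are just placeholders):
your_search_here | outputcsv giant_export
# the file lands in $SPLUNK_HOME/var/run/splunk/csv/giant_export.csv
split -l 1000000 $SPLUNK_HOME/var/run/splunk/csv/giant_export.csv chunk_
# note: only the first chunk (chunk_aa) keeps the CSV header row; strip it or skip it at load time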
For DB2, the command would be something like:
db2 import from csvTypeFile of del "insert into table1 (c1, c2, c3, ...)"
For Oracle, use SQL*Loader: http://www.orafaq.com/wiki/SQL*Loader_FAQ
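A minimal SQL*Loader sketch, assuming a table1 with columns c1, c2, c3 and the chunk files from above (file names and credentials are placeholders):
-- load.ctl
LOAD DATA
INFILE 'chunk_aa'
APPEND INTO TABLE table1
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(c1, c2, c3)
Then run it, using direct-path loading for speed:
sqlldr userid=scott/tiger control=load.ctl direct=true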
The database does not have the data; I am trying to put it into the database.
Why can't you do it at the database end? Splunk is not as fast as the database.
I have 24 GB allocated to dbx, and 48 GB overall on the server.
Just FYI, I tried streaming and it filled up all of the RAM on my server...