Splunk Search

dboutput with 65 Million Rows

ShaneNewman
Motivator

I need to output 65 million rows to a database table, and I see the default per transaction is 50K. Is there a good way to do this?


koshyk
Super Champion

Export the results from Splunk into a CSV file (you could split it into chunks of 1 million records for better performance; see the splitting sketch below), then load it using the database's native bulk-load commands.

For DB2, the command is something like:
db2 import from csvTypeFile of del "insert into table1 (c1, c2, c3,...) "

For Oracle, use SQL*Loader: http://www.orafaq.com/wiki/SQL*Loader_FAQ
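A minimal sketch of the splitting step in Python, assuming the Splunk results have already been exported to a file called results.csv; the file name, chunk size, and output naming are illustrative, not from the thread:

    import csv

    CHUNK_ROWS = 1_000_000  # rows per output file, per the suggestion above

    def split_csv(path="results.csv", chunk_rows=CHUNK_ROWS):
        """Split a large CSV export into smaller files the database can bulk-load."""
        with open(path, newline="") as src:
            reader = csv.reader(src)
            header = next(reader)            # repeat the header in every chunk
            out, writer, rows, chunk = None, None, 0, 0
            for row in reader:
                if rows % chunk_rows == 0:   # start a new chunk file
                    if out:
                        out.close()
                    chunk += 1
                    out = open(f"results_part{chunk:03d}.csv", "w", newline="")
                    writer = csv.writer(out)
                    writer.writerow(header)
                writer.writerow(row)
                rows += 1
            if out:
                out.close()

    if __name__ == "__main__":
        split_csv()

Each results_partNNN.csv file can then be fed to the bulk loader (DB2 import, SQL*Loader, etc.) independently.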


ShaneNewman
Motivator

The database does not have the data; I am trying to put it into the database.


linu1988
Champion

Why can't you do this at the database end? Splunk is not as fast as the database.


jcoates_splunk
Splunk Employee

Streaming has no limit, but if you're not in streaming mode you have a 50K row limit.

How much memory do you have? Please open a bug if it's at or over the recommended minimum (12 GB, IIRC).

ShaneNewman
Motivator

I have 24GB allocated to dbx, 48GB overall on the server.


ShaneNewman
Motivator

Just FYI, I tried streaming and it filled up all of the RAM on my server...
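If streaming exhausts memory, one workaround is to do the inserts in bounded batches outside of DBX. A minimal sketch using Python's DB-API executemany in 50K-row batches, reading from a CSV export; the driver, connection details, table name, and column list are placeholders, not from the thread:

    import csv
    import sqlite3  # stand-in driver; any DB-API module (ibm_db_dbi, cx_Oracle, ...) follows the same pattern

    BATCH = 50_000  # matches the per-transaction default mentioned in the question

    def load_csv(path="results.csv"):
        conn = sqlite3.connect("target.db")       # placeholder connection
        cur = conn.cursor()
        sql = "INSERT INTO table1 (c1, c2, c3) VALUES (?, ?, ?)"  # placeholder table/columns
        with open(path, newline="") as src:
            reader = csv.reader(src)
            next(reader)                           # skip the header row
            batch = []
            for row in reader:
                batch.append(row)
                if len(batch) == BATCH:
                    cur.executemany(sql, batch)
                    conn.commit()                  # one transaction per batch keeps memory bounded
                    batch = []
            if batch:                              # flush the final partial batch
                cur.executemany(sql, batch)
                conn.commit()
        conn.close()

    if __name__ == "__main__":
        load_csv()

Because each batch is committed and discarded before the next one is read, memory use stays flat regardless of the total row count.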
