Splunk Search

dboutput with 65 Million Rows

ShaneNewman
Motivator

I need to output 65 million rows to a database table with dboutput, and I see the default limit per transaction is 50K. Is there a good way to do this?

1 Solution

jcoates_splunk
Splunk Employee

Streaming has no limit, but if you're not in streaming mode you have a 50k row limit.

How much memory does the server have? Please open a bug if it's at or over the recommended minimum (12 GB, IIRC).

koshyk
Super Champion

Export the search results from Splunk to a CSV file. (You could split it into chunks of 1 million records for better performance.)

Then load the file using the database's native bulk-load commands.

A DB2 command would look something like:
db2 "IMPORT FROM csvTypeFile OF DEL INSERT INTO table1 (c1, c2, c3, ...)"

For Oracle, use SQL*Loader: http://www.orafaq.com/wiki/SQL*Loader_FAQ
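
As a rough sketch of that approach (the file names, table name, column list, and credentials below are placeholders, and it assumes the rows have already been exported from Splunk, e.g. with outputcsv), splitting the export and bulk-loading each chunk from the shell might look like:

# Split the exported CSV into ~1 million-row chunks ("results.csv" is a placeholder name)
split -l 1000000 results.csv chunk_

# Load each chunk with DB2's native IMPORT; COMMITCOUNT keeps each transaction small
for f in chunk_*; do
    db2 "IMPORT FROM $f OF DEL COMMITCOUNT 50000 INSERT INTO table1 (c1, c2, c3)"
done

# Oracle alternative: SQL*Loader, assuming a suitable control file (load.ctl is hypothetical)
# for f in chunk_*; do
#     sqlldr userid=scott/tiger control=load.ctl data="$f"
# done

Keeping each load to a bounded number of rows avoids the memory and transaction-log pressure that a single 65-million-row insert would cause.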

ShaneNewman
Motivator

The database does not have the data; I am trying to get the data into the database.

linu1988
Champion

Why can't you do it at the database end? Splunk is not as fast as the database.

ShaneNewman
Motivator

I have 24GB allocated to dbx, 48GB overall on the server.

ShaneNewman
Motivator

Just FYI, I tried streaming and it filled up all of the RAM on my server...
