Splunk Search

dboutput with 65 Million Rows

ShaneNewman
Motivator

I need to output 65 million rows to a database table, and I see the default per transaction is 50K. Is there a good way to do this?

1 Solution

jcoates_splunk
Splunk Employee

Streaming mode has no limit, but if you're not in streaming mode you have a 50k row limit.

How much memory does the host have? Please open a bug if it's at or over the recommended minimum (12 GB, IIRC).


koshyk
Super Champion

From Splunk, export the results to a CSV file. (You could split it into 1-million-record chunks for better performance.)

Then load it using the database's native bulk-load commands.

The DB2 command would be something like:
db2 "import from csvTypeFile of del insert into table1 (c1, c2, c3, ...)"

For Oracle, use SQL*Loader: http://www.orafaq.com/wiki/SQL*Loader_FAQ
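The export-then-split approach above can be sketched in shell. A tiny generated CSV stands in for the real Splunk export (65M rows in practice); the file names (export.csv, chunk_*) and the bulk-load invocations in the comments are illustrative assumptions, not tested commands:

```shell
#!/bin/sh
set -e

# Stand-in for the real Splunk CSV export.
printf 'c1,c2\n' > export.csv
seq 1 10 | awk '{print $1",x"}' >> export.csv

CHUNK_ROWS=4   # use ~1000000 for the real export

# Keep the header separate so every chunk gets its own copy.
head -n 1 export.csv > header.csv
tail -n +2 export.csv | split -l "$CHUNK_ROWS" - chunk_

for part in chunk_*; do
    cat header.csv "$part" > "$part.csv"
    rm "$part"
    # Load each chunk with the database's native bulk loader, e.g.
    #   db2 "import from $part.csv of del insert into table1 (c1, c2)"
    # or Oracle SQL*Loader:
    #   sqlldr user/pass control=load.ctl data=$part.csv
done
```

Loading chunk by chunk also means a failed load only has to be retried for that chunk, not the whole 65M-row file.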


ShaneNewman
Motivator

The database does not have the data, I am trying to put it in the database.


linu1988
Champion

Why can't you do this at the database end? Splunk is not as fast as the database for bulk loads.



ShaneNewman
Motivator

I have 24 GB allocated to dbx, and 48 GB overall on the server.


ShaneNewman
Motivator

Just FYI, I tried streaming and it filled up all of the RAM on my server...
