Splunk Search

dboutput with 65 Million Rows

ShaneNewman
Motivator

I need to output 65 million rows to a database table, and I see the default per transaction is 50K. Is there a good way to do this?
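Independent of the eventual DB Connect configuration, the general workaround for a per-transaction row limit is to commit in batches. A minimal sketch, using the stdlib `sqlite3` module as a stand-in for the real target database (the table `t` and the row generator are illustrative, not from the thread):

```python
import sqlite3

BATCH = 50_000  # stay at or below the per-transaction limit

def batched_insert(conn, rows):
    """Insert rows in fixed-size batches, committing after each batch."""
    cur = conn.cursor()
    buf = []
    for row in rows:
        buf.append(row)
        if len(buf) == BATCH:
            cur.executemany("INSERT INTO t VALUES (?, ?)", buf)
            conn.commit()
            buf.clear()
    if buf:  # flush the final partial batch
        cur.executemany("INSERT INTO t VALUES (?, ?)", buf)
        conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, val TEXT)")
# 120k demo rows -> two full 50k batches plus one 20k batch
batched_insert(conn, ((i, f"row{i}") for i in range(120_000)))
print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # 120000
```

Each commit bounds both the transaction size and how much the database has to roll back on failure.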


koshyk
Super Champion

Via Splunk, write the results out to a CSV file. (You could split it into files of 1 million records each for better performance.)

Then load it using native DB commands?

The DB2 command would be something like:
db2 import from csvTypeFile of del "insert into table1 (c1, c2, c3,...) "

For Oracle, use SQL*Loader: http://www.orafaq.com/wiki/SQL*Loader_FAQ
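The split-then-bulk-load idea above can be sketched generically. This is a hedged example (file names and the chunk size are illustrative; the real chunking would be driven by whatever the DB2 `import` or SQL*Loader job expects):

```python
import csv
import itertools
import os
import tempfile

CHUNK = 1_000_000  # rows per chunk file, per the suggestion above

def split_csv(src_path, out_dir, chunk=CHUNK):
    """Split a large CSV export into numbered chunk files for bulk loading."""
    paths = []
    with open(src_path, newline="") as src:
        reader = csv.reader(src)
        for n in itertools.count():
            rows = list(itertools.islice(reader, chunk))
            if not rows:
                break
            path = os.path.join(out_dir, f"chunk_{n:04d}.csv")
            with open(path, "w", newline="") as out:
                csv.writer(out).writerows(rows)
            paths.append(path)
    return paths

# Demo at small scale: 2500 rows split into chunks of 1000 -> 3 files.
tmp = tempfile.mkdtemp()
src = os.path.join(tmp, "export.csv")
with open(src, "w", newline="") as f:
    csv.writer(f).writerows([i, f"v{i}"] for i in range(2500))
files = split_csv(src, tmp, chunk=1000)
print(len(files))  # 3
```

Each chunk file can then be fed to the native loader in its own job, so one bad batch doesn't force a 65-million-row retry.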


ShaneNewman
Motivator

The database does not have the data yet; I am trying to put it into the database.


linu1988
Champion

Why can't you do this at the database end? Splunk is not as fast as the database.


jcoates_splunk
Splunk Employee

Streaming has no limit, but if you're not in streaming mode there's a 50k-row limit.

How much memory do you have? Please open a bug if it's at or over the recommended minimum (12GB, IIRC).

ShaneNewman
Motivator

I have 24GB allocated to dbx, 48GB overall on the server.


ShaneNewman
Motivator

Just FYI, I tried streaming and it filled up all of the RAM on my server...
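One generic illustration of why a streaming export can exhaust RAM (this is a sketch with stdlib `sqlite3` as a stand-in, not a description of DB Connect internals): if all rows are collected into a list before inserting, memory scales with the row count, but feeding the cursor from a generator keeps only one row in memory at a time.

```python
import csv
import os
import sqlite3
import tempfile

# Write a small sample export to disk (stand-in for the Splunk CSV output).
tmpdir = tempfile.mkdtemp()
export = os.path.join(tmpdir, "export.csv")
with open(export, "w", newline="") as f:
    csv.writer(f).writerows([i, f"v{i}"] for i in range(10_000))

def rows_from_csv(path):
    """Yield rows lazily so the whole file is never held in memory."""
    with open(path, newline="") as f:
        for rec in csv.reader(f):
            yield int(rec[0]), rec[1]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, val TEXT)")
# executemany consumes the generator row by row -- no full list in RAM.
conn.executemany("INSERT INTO t VALUES (?, ?)", rows_from_csv(export))
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # 10000
```

At 65 million rows, the difference between a list and a generator is the difference between tens of gigabytes and a few kilobytes of working memory.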
