Getting Data In

What is the best way to ingest a 300GB CSV into Splunk?

Path Finder

I tried ingesting it with the add oneshot command, but midway through, Splunk suddenly stops.

Aside from splitting the file, is there a way to ingest this file without Splunk stopping midway?
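For reference, the command I ran was along these lines (the path, index, and sourcetype here are placeholders, not my actual values):

```shell
# Run on the heavy forwarder; reads the whole file into the index in one shot
splunk add oneshot /data/exports/large_file.csv -index main -sourcetype csv
```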

Can Splunk monitor three 200GB+ files at the same time?

Thank you.

Regards,
Raj


Champion

Could you tell us about your indexer setup?
As a rough guideline, a single indexer is generally said to handle about 300GB of log data per day.


Champion

Why are you using a heavy forwarder?

With default settings, Splunk does not handle ingesting very large files in one go.

The problem is that while a file is being imported, a single pipeline monopolizes the processing.

You need to avoid concentrating the load on a single ingestion pipeline, and likewise avoid concentrating it on a single indexer.

It is important to investigate which part of the process is the bottleneck.
https://wiki.splunk.com/Community:HowIndexingWorks
https://docs.splunk.com/Documentation/Splunk/7.2.5/Indexer/Pipelinesets
https://wiki.splunk.com/Community:Troubleshooting_Monitor_Inputs
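If the single-pipeline bottleneck turns out to be the problem, one option on recent Splunk versions is to enable multiple pipeline sets on the heavy forwarder. A minimal sketch in server.conf (verify the setting and a sensible value for your version and hardware against the Pipelinesets documentation):

```ini
# server.conf on the heavy forwarder (sketch; each extra pipeline
# consumes additional CPU cores, so size this to your hardware)
[general]
parallelIngestionPipelines = 2
```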


Path Finder

Hi, thank you for the input.

It's a cluster with three indexers.
I'm using a heavy forwarder to process the CSVs before sending them to the cluster. So is it not possible to ingest a file that large?
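If it helps, the input on the heavy forwarder is a plain monitor stanza, roughly like this (the path, index, and sourcetype are placeholders):

```ini
# inputs.conf on the heavy forwarder (sketch)
[monitor:///data/exports/large_file.csv]
index = main
sourcetype = csv
```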

Regards,
Raj
