Splunk Enterprise

Large CSV ingest

willadams
Contributor

I am having trouble with some monitored CSVs that get refreshed daily. There are 5 CSVs in a common directory that I have a monitored input configured for. The widest of the 5 CSVs has 68 columns. File sizes are typically 1.5MB to 2MB, with one file being 22MB. The largest file has roughly 39,000 rows and the smallest about 1,500.

I have noticed that Splunk will ingest each CSV but stops at around the 150th record in every file. The 22MB file (the one with 39,000 rows) doesn't seem to be read at all; this may be because some of its columns look malformed, so the parse may be bombing out. I checked the column limit for ingest in limits.conf and it is set to around 200 (raised from the default because I thought I might need it later), but even under a default limits.conf the column count in these files would not exceed the default value.
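For reference, here is roughly what the setup looks like. The directory path, index, sourcetype name, and the exact limits.conf attribute shown are placeholders for illustration, not my actual values:

inputs.conf
[monitor:///opt/data/csv]
disabled = false
index = main
sourcetype = daily_csv

props.conf
[daily_csv]
INDEXED_EXTRACTIONS = csv
HEADER_FIELD_LINE_NUMBER = 1

limits.conf
[kv]
# column-related limit raised to ~200 (attribute shown is illustrative)
maxcols = 200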
