Getting Data In

How can I upload a very large csv file into Splunk?

nls7010
Path Finder

My customer has some very large csv files, but only about 200 events/rows from each file are making it into Splunk.

"As per checking MDMS-001_Meter Reads Requested file have 4,633, but in Splunk it only have 208 events. "

How can I get the entire csv file to upload? I am using universal forwarders for the ingest.
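
For reference, here is a minimal sketch of how a CSV monitor is typically set up on a universal forwarder; the path, index, and sourcetype names below are placeholders, not the customer's actual configuration:

inputs.conf (on the forwarder):
[monitor:///opt/data/your_file.csv]
sourcetype = your_csv_sourcetype
index = your_index
disabled = false

props.conf (also on the forwarder, since structured parsing for INDEXED_EXTRACTIONS happens there):
[your_csv_sourcetype]
INDEXED_EXTRACTIONS = csv
HEADER_FIELD_LINE_NUMBER = 1
SHOULD_LINEMERGE = false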


to4kawa
Ultra Champion
| makeresults
| eval _raw="your_csv_copy_and_paste
....."
| multikv forceheader=1
| table your_csv_header

Maybe this works.

Why not use collect OR outputcsv as needed?
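
For example, if the csv has already been uploaded as a lookup table file, something along these lines could write it into an index with collect; the lookup name and index are placeholders:

| inputlookup your_file.csv
| collect index=your_index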


DalJeanis
Legend

Those are pretty small files. I regularly upload csv files with several million records without any issues.

You probably have a parsing error of some sort.

Get a sample file and try a manual ingestion. If that doesn't resolve it, there is probably a formatting error in the csv file itself. Look for invalid data fields.
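
One way to run a manual one-shot ingestion of a sample file on a test instance is the Splunk CLI; the path, index, and sourcetype below are placeholders:

$SPLUNK_HOME/bin/splunk add oneshot /tmp/sample.csv -index test -sourcetype csv

Then compare the event count for that index and sourcetype against the row count of the file. A large gap usually points at line breaking or a malformed row.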


richgalloway
SplunkTrust

4,633 rows is not too large for a UF. Does the forwarder log any errors? Is the customer filtering any data from the file/sourcetype?
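
To check for forwarder-side errors, a search like this against the internal index can help (the host value is a placeholder):

index=_internal source=*splunkd.log* host=your_forwarder (log_level=ERROR OR log_level=WARN)
| stats count by component, log_level

If data is being filtered, it would typically show up as a nullQueue routing rule in props.conf/transforms.conf for that sourcetype.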

---
If this reply helps you, Karma would be appreciated.