
Is there a way to export results to multiple CSVs with a fixed number of events per file?

junshi
Explorer

I have a search that returns around 60,000 results in a straight table format:
field1, field2

I need to export this via CSV to another system that only accepts 1000 lines per CSV.
Is there a way to export these results to multiple CSVs, capped at 1000 events per CSV?

example:
results1.csv (1-1000)
results2.csv (1001-2000)
resultsn.csv (n...)

Note: I cannot split based on time, as we are running daily stats and then de-duping the results before export.
I am also trying to avoid running a separate search for each 1000 events!
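
For reference, the kind of base search being described might look roughly like this (the index and field names are placeholders, not the actual search):

     index=my_index earliest=-1d@d latest=@d | stats count by field1 field2 | dedup field1 | table field1 field2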

Thanks all!!!

1 Solution

somesoni2
Revered Legend

Try something like this

| gentimes start=-1 | eval sno=mvrange(0,60) | table sno | mvexpand sno | eval from=sno*1000+1 | eval to=(sno+1)*1000 | map search="search your search to export | eval sno=1 | accum sno | where sno>=$from$ AND sno<=$to$ | fields - sno | outputcsv Result_$from$-$to$" maxsearches=60
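
How this works: gentimes start=-1 seeds a single row, mvrange(0,60) plus mvexpand expands it into 60 rows numbered 0-59, and the two eval statements turn each row into a 1000-row window (1-1000, 1001-2000, ... 59001-60000). map then re-runs the export search once per window, with eval sno=1 | accum sno acting as a running row counter, so the where clause keeps only that window's rows before outputcsv writes them. Two practical notes: map only runs 10 subsearches by default, hence the maxsearches=60 above (raise both numbers if the result count grows past 60,000), and the CSV files land in $SPLUNK_HOME/var/run/splunk/csv/ on the search head. The trade-off is that the underlying search executes once per chunk.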


junshi
Explorer

Nice, I was playing with the eval command but taking a different approach. Very nice!


junshi
Explorer

Tweaked your search:

     | gentimes start=-1 | eval sno=mvrange(0,60) | table sno | mvexpand sno | eval from=sno*1000+1 | eval to=(sno+1)*1000 | map [search your search to export | eval sno=1 | accum sno | where sno>=$from$ AND sno<=$to$ | fields - sno | outputcsv Result_$from$-$to$] maxsearches=60

This seems to work, thanks again!
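
If it helps, one quick way to sanity-check a chunk afterwards (assuming the Result_<from>-<to> filename pattern used above) is to read a file back and count its rows:

     | inputcsv Result_1-1000 | stats count

Each file should come back with 1000 rows, except possibly the last one.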
