Getting Data In

Is there a way to export results to multiple CSVs with a fixed number of events per file?

New Member

I have a search that returns around 60,000 results in a straight table format:
field1, field2

I need to export this as CSV to another system, which only accepts 1000 lines per CSV.
Is there a way to export these results to multiple CSVs, capped at 1000 events per CSV?

example:
results1.csv (1-1000)
results2.csv (1001-2000)
resultsn.csv (n...)

Note: I cannot split based on time, as we are running daily stats and then de-duping the results before export.
I am also trying to avoid running multiple searches for each 1000 events!

Thanks all!!!

1 Solution

SplunkTrust

Try something like this:

| gentimes start=-1 | eval sno=mvrange(0,60) | table sno | mvexpand sno | eval from=sno*1000+1 | eval to=(sno+1)*1000 | map search="search your search to export | eval sno=1 | accum sno | where sno>=$from$ AND sno<=$to$ | fields - sno | outputcsv Result_$from$-$to$"
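The idea behind the search: accum sno gives every event a running 1-based number, and each map iteration keeps one 1000-event window. A quick Python check of the window arithmetic (the 60 and 1000 figures mirror the roughly 60,000-event example above):

```python
# Reproduce the batch windows the SPL builds: mvrange(0,60) yields sno = 0..59,
# and each map iteration keeps events from = sno*1000+1 through to = (sno+1)*1000.
windows = [(sno * 1000 + 1, (sno + 1) * 1000) for sno in range(60)]

print(windows[0], windows[-1])  # first and last window
```

The windows tile 1..60000 exactly: consecutive and non-overlapping, so no event is dropped or exported twice.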


New Member

Nice, I was playing with the eval command but taking a different approach. Very nice!


New Member

Tweaked your search:

     | gentimes start=-1 | eval sno=mvrange(0,60) | table sno | mvexpand sno | eval from=sno*1000+1 | eval to=(sno+1)*1000 | map [search your search to export | eval sno=1 | accum sno | where sno>=$from$ AND sno<=$to$ | fields - sno | outputcsv Result_$from$-$to$]

This seems to work, thanks again!
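If running map ever becomes impractical, another option is to export the full result set once (outputcsv, or the UI export) and split the file outside Splunk. A minimal Python sketch; the file-name prefix and the 1000-line cap are illustrative parameters, not anything Splunk-specific:

```python
import csv

def split_csv(src_path, rows_per_file=1000, prefix="results"):
    """Split one CSV (header + data rows) into numbered parts, each holding
    at most rows_per_file data rows, repeating the header in every part."""
    with open(src_path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)
        rows, parts = [], 0
        for row in reader:
            rows.append(row)
            if len(rows) == rows_per_file:
                parts += 1
                _write_part(prefix, parts, header, rows)
                rows = []
        if rows:  # final, possibly short, part
            parts += 1
            _write_part(prefix, parts, header, rows)
    return parts

def _write_part(prefix, part_no, header, rows):
    # e.g. results1.csv, results2.csv, ...
    with open(f"{prefix}{part_no}.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(header)
        writer.writerows(rows)
```

Each part repeats the header row so every CSV is self-describing; drop the header write in _write_part if the receiving system wants raw data rows only.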
