Getting Data In

Is there a way to export results to multiple CSVs with a fixed number of events per file?

junshi
Explorer

I have a search that returns around 60,000 results in a straight table format:
field1, field2

I need to export this as CSV to another system that only accepts 1000 lines per CSV.
Is there a way to export these results to multiple CSVs, capped at 1000 events per CSV?

example:
results1.csv (1-1000)
results2.csv (1001-2000)
resultsn.csv (n...)

Note: I cannot split based upon time, as we run a daily stats search and then de-dupe the results before export.
I am also trying to avoid running multiple searches for each 1000 events!

Thanks all!!!

1 Solution

somesoni2
Revered Legend

Try something like this

| gentimes start=-1 | eval sno=mvrange(0,60) | table sno | mvexpand sno | eval from=sno*1000+1 | eval to=(sno+1)*1000 | map maxsearches=60 search="search your search to export | eval sno=1 | accum sno | where sno>=$from$ AND sno<=$to$ | fields - sno | outputcsv Result_$from$-$to$"

(Note: map runs at most 10 subsearches by default, so maxsearches=60 is needed for 60 chunks, and the outputcsv filename must not contain unescaped double quotes inside the map search string.)
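To see what the pipeline does: gentimes seeds a single row, mvrange/mvexpand fan it out into 60 chunk indices, each index is turned into 1-based row boundaries, and map then runs the export search once per chunk. A minimal Python sketch of just the boundary arithmetic (the chunk size of 1000 and total of 60,000 are taken from the question; everything else is illustrative):

```python
# Mirror of the SPL chunking arithmetic:
# chunk i covers rows i*1000+1 through (i+1)*1000 (1-based, inclusive).

CHUNK = 1000
TOTAL = 60_000  # roughly the result count mentioned in the question

ranges = []
for i in range(TOTAL // CHUNK):   # sno = 0 .. 59, like mvrange(0, 60)
    start = i * CHUNK + 1         # "from" in the SPL
    end = (i + 1) * CHUNK         # "to" in the SPL
    ranges.append((start, end))

print(ranges[0])    # (1, 1000)
print(ranges[-1])   # (59001, 60000)
```

Each `(start, end)` pair corresponds to one subsearch that `map` launches, which is why `maxsearches` must be at least 60 here.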



junshi
Explorer

Nice, I was playing with the eval command but taking a different approach. Very nice!


junshi
Explorer

Tweaked your search:

     | gentimes start=-1 | eval sno=mvrange(0,60) | table sno | mvexpand sno | eval from=sno*1000+1 | eval to=(sno+1)*1000 | map maxsearches=60 [search your search to export | eval sno=1 | accum sno | where sno>=$from$ AND sno<=$to$ | fields - sno | outputcsv Result_$from$-$to$]

This seems to work, thanks again!
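If running 60 subsearches via map ever becomes too heavy, an alternative is to export everything with a single outputcsv and split the file afterwards, outside Splunk. A hedged sketch in Python (the split_csv helper, the results{n}.csv naming, and the 1000-row default are my assumptions, not part of this thread):

```python
import csv
import itertools
from pathlib import Path

def split_csv(src, out_dir, rows_per_file=1000):
    """Split one CSV into files of at most rows_per_file data rows,
    repeating the header row in each output file."""
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    paths = []
    with open(src, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)                 # keep the field1,field2 header
        for part in itertools.count(1):       # results1.csv, results2.csv, ...
            chunk = list(itertools.islice(reader, rows_per_file))
            if not chunk:
                break
            path = out_dir / f"results{part}.csv"
            with open(path, "w", newline="") as out:
                writer = csv.writer(out)
                writer.writerow(header)       # header in every part
                writer.writerows(chunk)
            paths.append(path)
    return paths
```

This streams the source file, so it handles 60,000 rows (or far more) without loading everything into memory, and the downstream system still sees at most 1000 data lines per file.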
