I'm having trouble finding a way to export two reports.
I have two reports, which I'll call search1 and search2. Both searches were run and then sent to run in the background; according to the Jobs tab, both completed. The customer wanted these searches run over "All time," so the results are quite large: search1 is 9.22 GB and search2 is 4.97 GB.
The issue is getting access to the logs.
I've tried using | loadjob sid, and it just hangs and fails.
I've tried exporting from the jobs tab, and it fails.
I can't use the API because, from what I can tell, you must put the password into the search string, which then makes the password searchable by anyone with access to that log.
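For reference, the credentials can go in the HTTP request rather than in the search string, so they never become searchable. Below is a minimal sketch, assuming the documented /services/search/jobs/&lt;sid&gt;/results REST endpoint and a Splunk authentication token supplied outside the code (the host, port, sid, and file name are placeholders, not values from this thread):

```python
# Sketch: export the results of an already-finished search job via the
# Splunk REST API, authenticating with a token in a header instead of
# embedding a password anywhere searchable.
import urllib.parse
import urllib.request


def results_url(base: str, sid: str, count: int = 0, offset: int = 0) -> str:
    """Build the results URL for an existing job. count=0 requests all
    rows; for very large jobs, page through with count/offset instead."""
    query = urllib.parse.urlencode(
        {"output_mode": "csv", "count": count, "offset": offset}
    )
    return f"{base}/services/search/jobs/{urllib.parse.quote(sid)}/results?{query}"


def export_job(base: str, sid: str, token: str, dest: str) -> None:
    """Stream the job's CSV results to a local file in 1 MiB chunks, so
    a multi-GB result set never has to fit in memory."""
    req = urllib.request.Request(
        results_url(base, sid),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp, open(dest, "wb") as out:
        while chunk := resp.read(1 << 20):
            out.write(chunk)


# Example usage (token read from the environment, e.g. os.environ["SPLUNK_TOKEN"]):
#   export_job("https://localhost:8089", "1628000000.12345", token, "search1.csv")
```

The token goes in the Authorization header of the request, so nothing sensitive ever appears in the search text or the dispatched job artifacts.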
I went to the $SPLUNK_HOME/var/run/splunk/dispatch folder and found both jobs. This page, https://docs.splunk.com/Documentation/Splunk/8.2.1/Troubleshooting/CommandlinetoolsforusewithSupport..., says to run "splunk cmd splunkd toCsv ./results.srs.gz". The .gz file now appears to be .zst, but I ran the command anyway.
For search1, after a while the command simply said "Killed".
For search2, as I'm writing this, the command appears to be working: comma-delimited text is scrolling on the console. I assume that once the conversion finishes, I will be able to export that one.
So how do I export search1, and other large result sets in the future? The toCsv command was the last thing I could find to try. Perhaps there is a setting in a .conf file I can modify before running something else? Any assistance is appreciated.
Consider running multiple searches over smaller time ranges and then combining the results.
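As a sketch of that approach, the smaller time ranges can be generated programmatically instead of by hand. The dates below are placeholders for the real data range, and the windows are half-open so adjacent searches never double-count events:

```python
# Sketch: split one huge "All time" export into per-day searches by
# generating daily (earliest, latest) epoch-time boundaries.
from datetime import datetime, timedelta, timezone


def daily_windows(start: datetime, end: datetime):
    """Yield (earliest, latest) epoch pairs, one per day, half-open:
    each window's latest equals the next window's earliest."""
    day = start
    while day < end:
        nxt = min(day + timedelta(days=1), end)
        yield int(day.timestamp()), int(nxt.timestamp())
        day = nxt


# Each pair can be dropped into a search as earliest=<e> latest=<l>.
windows = list(daily_windows(
    datetime(2021, 8, 1, tzinfo=timezone.utc),
    datetime(2021, 8, 4, tzinfo=timezone.utc),
))
```

Running one export per window and concatenating the CSV files afterward gives the same data as the single all-time search, in pieces small enough to finish.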
Hi @XOJ,
The dump command may help you export a large amount of data.
https://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Dump
The search below will create daily dump files:
index=yourindex | eval _dstpath=strftime(_time, "%Y%m%d") | dump basefilename=search1
Consider running multiple searches over smaller time ranges and then combining the results.
I hate that this is the answer. People run businesses much bigger than ours; do even they have to break everything into tiny searches?
That being said, you are the only one who gave an answer, so I will mark it as such.