Here's a gzipped dump of everything past a certain timestamp, run from the Linux command line:
sudo /opt/splunk/bin/splunk search "sourcetype=apache_access _time > 1335337200" -preview 0 -maxout 0 -output rawdata | gzip > access_custom.apr-may2012.gz
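Since the dump is just gzip on stdout, you can sanity-check an export with an ordinary gzip round trip. Here's a minimal sketch using a throwaway file (the /tmp path and sample events are placeholders, not Splunk output):

```shell
# Simulate a raw-event export: compress on the way out, read back with zcat
printf 'event one\nevent two\n' | gzip > /tmp/rawdump_demo.gz

# Peek at the events and count them to confirm the dump is not empty
zcat /tmp/rawdump_demo.gz | head -n 5
zcat /tmp/rawdump_demo.gz | wc -l
```

The same zcat/head/wc checks work on the real archive produced by the command above.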
This is a good scripting approach for exporting large search results.
Here's another scripting example, this time on Windows:
splunk search "index=_internal earliest=09/14/2014:23:59:00 latest=09/16/2014:01:00:00" -output rawdata -maxout 200000 > c:/test123.dmp
Yeah, agreed: the "table _raw" solution did not work for me at all in 6.2.0. It looked like it would work, populating the stats table, but when I hit "Export" and chose CSV it just gave me a file of timestamps.
Following the CLI export example, though, got it done.
Perform your search on the required sourcetype(s) and host(s).
Then navigate to Export -> Export Results.
Choose Format=Raw Events and click "Export" to save a txt file of the raw events.
There is a "Max # of results to export" option where you can select "unlimited".
You can do something like this to roughly achieve what you are trying to do via Splunk Web.
Replace sourcetype and host with your actual search values.
sourcetype=foo host=goo | table _raw | outputcsv rawdump.csv
The file will get written to $SPLUNK_HOME/var/run/splunk
But I still need access to that location on the Splunk server? Seems like it would be a simple thing for Splunk to be able to do. Oftentimes it's necessary to send logs to third-party app developers so that they can diagnose issues.
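For what it's worth, you can avoid touching the server's filesystem entirely by streaming results over Splunk's REST export endpoint, /services/search/jobs/export, on the management port (8089 by default). Here's a sketch using only the Python standard library; the endpoint and output_mode values are real, but the host, credentials, search string, and output path are placeholders for illustration:

```python
# Sketch: export raw events over Splunk's REST API instead of reading
# files off the server's disk. Host, credentials, search, and output
# path are hypothetical -- adjust for your environment.
import base64
import ssl
import urllib.parse
import urllib.request

def export_params(search, output_mode="raw"):
    """Build the POST body for the export endpoint; it expects full SPL."""
    spl = search.lstrip()
    if not (spl.startswith("search ") or spl.startswith("|")):
        spl = "search " + spl  # bare term searches need the leading command
    return {"search": spl, "output_mode": output_mode}

def stream_export(base_url, username, password, search, outfile):
    """Stream export results straight into a local file, chunk by chunk."""
    body = urllib.parse.urlencode(export_params(search)).encode()
    req = urllib.request.Request(
        base_url + "/services/search/jobs/export", data=body)
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    req.add_header("Authorization", "Basic " + token)
    # Splunk ships with a self-signed cert on the management port;
    # verification is skipped here only to keep the sketch short.
    ctx = ssl._create_unverified_context()
    with urllib.request.urlopen(req, context=ctx) as resp, \
            open(outfile, "wb") as fh:
        while True:
            chunk = resp.read(65536)
            if not chunk:
                break
            fh.write(chunk)
```

Calling something like stream_export("https://splunk.example.com:8089", "admin", "changeme", "sourcetype=foo host=goo", "rawdump.raw") would write the raw events to a local file you can hand straight to the app developers, with no shell access to the Splunk server needed.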