Hi all,
I'm looking for a way to copy all of the log data from an index to an S3 bucket on my company's account.
Ideally, I would like to:
1) Filter which data in the index gets copied over.
2) Copy all the data within a date range.
3) Store the data in a raw text or CSV format.
Do you have any tools or documentation around how we could do this?
Your question is a bit ambiguous. It's not obvious whether you want to copy existing, already-indexed data, duplicate incoming data as it is being ingested, or copy indexed data periodically rather than as a one-off operation.
You may want to check out the ExportEverything app at https://splunkbase.splunk.com/app/5738
This isn't necessarily a simple off-the-shelf task, and the right approach may depend on the scale of data involved. Do you know roughly how many GB/TB of data you would be looking to export?
One way you could achieve this is to create a search that selects the data you want to move to S3 and then export the results using the CLI (https://help.splunk.com/en/splunk-cloud-platform/search/search-manual/10.0.2503/export-search-result...), the Web UI (https://help.splunk.com/en/splunk-enterprise/search/search-manual/9.1/export-search-results/export-d...), or the REST API (https://help.splunk.com/en/splunk-enterprise/search/search-manual/9.3/export-search-results/export-d...)
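For example, here's a rough Python sketch of the REST API route: it streams a filtered, time-bounded search as CSV from the /services/search/jobs/export endpoint and uploads the file to S3 with boto3. The hostname, credentials, index, sourcetype, and bucket/key names are all placeholders you'd swap for your own, and for very large exports you'd probably want to break the date range into smaller chunks rather than pull everything in one request:

```python
# Sketch only: export a time-bounded Splunk search as CSV via the REST API,
# then upload it to S3. All hosts, credentials, and names below are
# placeholders -- adjust them to your environment.
import boto3
import requests

SPLUNK_HOST = "https://splunk.example.com:8089"  # assumed management port
AUTH = ("admin", "changeme")                      # assumed credentials

# A search that filters the index and restricts the date range
# (steps 1 and 2 from the question).
search = (
    "search index=my_index sourcetype=my_logs "
    "earliest=-7d@d latest=@d | table _time, host, _raw"
)

# The export endpoint streams results as they are produced;
# output_mode=csv covers step 3. verify=False is only for
# self-signed certs -- prefer a proper CA bundle in production.
resp = requests.post(
    f"{SPLUNK_HOST}/services/search/jobs/export",
    auth=AUTH,
    data={"search": search, "output_mode": "csv"},
    stream=True,
    verify=False,
)
resp.raise_for_status()

# Write the streamed CSV to disk in 1 MB chunks.
with open("export.csv", "wb") as f:
    for chunk in resp.iter_content(chunk_size=1 << 20):
        f.write(chunk)

# Upload the result to the company bucket (bucket and key are placeholders).
boto3.client("s3").upload_file("export.csv", "my-company-bucket", "splunk/export.csv")
```

Streaming via the export endpoint avoids holding the whole result set in memory on the Splunk side, which matters once you get into the multi-GB range.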