Hello, I have been trying to migrate ELK data to Splunk. We have ELK data dating back two years, and I attempted to use the Elastic Integrator app from Splunkbase. I was able to set it up with SSL, and it's bringing in logs from the past 30 days. The issue is that if I try to change the timeframe in inputs.conf, it doesn't work, and if I try to use a wildcard for the index, that doesn't work either. Has anyone found a way around this? I'm also open to hearing any other suggestions for getting old ELK data into Splunk. Thank you.
Hi @Rafaelled,
Both parameters should work. See my previous post at https://community.splunk.com/t5/Getting-Data-In/integrating-Splunk-with-Elasticsearch/m-p/696647/hig... for a few limitations.
Depending on the number and size of documents you need to migrate, the app may not be appropriate. A custom REST input would give you the most flexibility with respect to the Elasticsearch Search API.
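To make the custom REST input idea concrete, here is a minimal sketch of that approach: page through an Elasticsearch index with the scroll API and forward each page to Splunk HEC. The URLs, index name, token, and sourcetype below are all placeholders, not anything from the app; adjust the query and error handling for your environment.

```python
# Hedged sketch of a custom REST migration: scroll an Elasticsearch index
# and POST each page of hits to Splunk HEC. All URLs, the index pattern,
# and the token are placeholders for your environment.
import json
import urllib.request
from datetime import datetime

ES_URL = "https://elasticsearch.example.com:9200"                      # placeholder
HEC_URL = "https://splunk.example.com:8088/services/collector/event"   # placeholder
HEC_TOKEN = "REPLACE-WITH-YOUR-HEC-TOKEN"                              # placeholder

def hit_to_hec_event(hit, sourcetype="_json"):
    """Map one Elasticsearch hit to a Splunk HEC event payload."""
    source = hit.get("_source", {})
    event = {
        "event": source,
        "sourcetype": sourcetype,
        "source": hit.get("_index", "elasticsearch"),
    }
    # Preserve the original event time so two years of history is indexed
    # at the right timestamps instead of at ingest time.
    ts = source.get("@timestamp")
    if ts:
        try:
            event["time"] = datetime.fromisoformat(ts.replace("Z", "+00:00")).timestamp()
        except ValueError:
            pass  # unparseable timestamp; fall back to HEC receipt time
    return event

def _post_json(url, payload, headers=None):
    req = urllib.request.Request(url, data=json.dumps(payload).encode(),
                                 headers=headers or {})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def migrate(index="logs-*", page_size=1000):
    """Scroll through `index` and POST each page to HEC (untested sketch)."""
    es_headers = {"Content-Type": "application/json"}
    hec_headers = {"Authorization": f"Splunk {HEC_TOKEN}"}
    page = _post_json(f"{ES_URL}/{index}/_search?scroll=5m",
                      {"size": page_size, "query": {"match_all": {}}}, es_headers)
    while page["hits"]["hits"]:
        # HEC accepts multiple JSON events concatenated in one request body.
        body = "\n".join(json.dumps(hit_to_hec_event(h)) for h in page["hits"]["hits"])
        req = urllib.request.Request(HEC_URL, data=body.encode(), headers=hec_headers)
        urllib.request.urlopen(req).read()
        page = _post_json(f"{ES_URL}/_search/scroll",
                          {"scroll": "5m", "scroll_id": page["_scroll_id"]}, es_headers)
```

Since you control the query, you can slice the two years into month-sized range queries and resume cleanly if a run fails.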
There are pre-written tools like https://github.com/elasticsearch-dump/elasticsearch-dump that may help.
If you have a place to host it, an instance of Logstash and a configuration that combines an elasticsearch input with an http output (for Splunk HEC) would be relatively easy to manage.
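A pipeline along those lines might look like the sketch below. Hosts, the index pattern, and the token are placeholders, and option names can vary slightly between Logstash versions, so treat this as a starting point rather than a drop-in config:

```conf
# Logstash pipeline sketch: elasticsearch input -> Splunk HEC.
# All hosts, index names, and tokens are placeholders.
input {
  elasticsearch {
    hosts  => ["https://elasticsearch.example.com:9200"]
    index  => "logs-*"
    query  => '{ "query": { "range": { "@timestamp": { "gte": "now-2y" } } } }'
    size   => 1000          # scroll page size
    scroll => "5m"
    docinfo => true         # keep _index/_id as metadata if you need them
  }
}
output {
  http {
    # The raw HEC endpoint accepts the event JSON as-is; the /event
    # endpoint would require wrapping each document in an "event" key.
    url         => "https://splunk.example.com:8088/services/collector/raw"
    http_method => "post"
    format      => "json"
    headers     => { "Authorization" => "Splunk REPLACE-WITH-YOUR-HEC-TOKEN" }
  }
}
```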
If you don't have a large amount of data or if you're willing to limit yourself to 1 TB per day, a free Cribl Stream license could also do the job.
I'm happy to help brainstorm relatively simple solutions here.
Thank you @tscroggins,
We have two Logstash servers, so I took one of them and made a conf file that sends data from Elastic to Splunk via HEC. The only issue now is that Logstash is running out of heap memory due to the size of the transfers. Working on fixing the pipeline now. Thanks again for the suggestions!
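For the heap problem, the usual levers are a larger JVM heap, smaller in-flight batches, and smaller scroll pages from the input. A sketch of the relevant settings, with placeholder values you would tune for your host:

```conf
# jvm.options -- raise the heap ceiling (placeholder sizes; keep Xms == Xmx)
-Xms4g
-Xmx4g

# logstash.yml -- smaller in-flight batches mean less heap held per worker
pipeline.batch.size: 125
pipeline.workers: 2
# A persisted queue buffers events on disk instead of in memory:
queue.type: persisted

# elasticsearch input -- smaller scroll pages reduce the size of each fetch
# size => 500
```

Reducing the input's scroll `size` and `pipeline.batch.size` together usually has the biggest effect, since both bound how many documents are resident in memory at once.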