Getting Data In

Elastic Integrator App

Rafaelled
Engager

Hello, I have been trying to migrate ELK data to Splunk. We have ELK data dating back two years, and I have attempted to use the Elastic Integrator app from Splunkbase. I was able to set it up with SSL, and it's bringing in logs from the past 30 days. The issue is that if I try to change the timeframe in inputs.conf, it will not work, and if I try to use a wildcard for the index, that will not work either. Has anyone found a way around this? I am also open to hearing any other suggestions for getting old ELK data into Splunk. Thank you.



https://splunkbase.splunk.com/app/4175

1 Solution

tscroggins
Influencer

Hi @Rafaelled,

Both parameters should work. See my previous post at https://community.splunk.com/t5/Getting-Data-In/integrating-Splunk-with-Elasticsearch/m-p/696647/hig... for a few limitations.

Depending on the number and size of documents you need to migrate, the app may not be appropriate. A custom REST input would give you the most flexibility with respect to the Elasticsearch Search API.
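To make that concrete, here's a rough sketch of what a custom input could look like, using only the standard library. Everything here is a placeholder assumption, not a tested integration: the hostnames, HEC token, index pattern, the @timestamp field name, and the target index/sourcetype would all need to match your environment. It pages through the Search API with search_after (which avoids the 10,000-hit from/size limit) and posts batches to HEC:

```python
import json
import urllib.request

ES_URL = "https://elastic.example.com:9200"  # placeholder
HEC_URL = "https://splunk.example.com:8088/services/collector/event"  # placeholder
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"  # placeholder

def build_search_body(after=None, size=1000):
    """Sorted range query; search_after resumes from the last hit's sort values."""
    body = {
        "size": size,
        "query": {"range": {"@timestamp": {"gte": "now-2y"}}},
        "sort": [{"@timestamp": "asc"}, {"_id": "asc"}],
    }
    if after is not None:
        body["search_after"] = after
    return body

def hits_to_hec(hits, index="elk_migrated", sourcetype="elk:json"):
    """Reshape Elasticsearch hits into newline-delimited HEC event payloads."""
    events = []
    for hit in hits:
        events.append({
            "event": hit["_source"],
            "index": index,
            "sourcetype": sourcetype,
            "source": hit.get("_index", "elasticsearch"),
        })
    return "\n".join(json.dumps(e) for e in events)

def migrate(index_pattern="logstash-*"):
    """Page through all matching documents and forward each batch to HEC."""
    after = None
    while True:
        req = urllib.request.Request(
            f"{ES_URL}/{index_pattern}/_search",
            data=json.dumps(build_search_body(after)).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            hits = json.load(resp)["hits"]["hits"]
        if not hits:
            break
        hec_req = urllib.request.Request(
            HEC_URL,
            data=hits_to_hec(hits).encode(),
            headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        )
        urllib.request.urlopen(hec_req)
        after = hits[-1]["sort"]  # resume point for the next page
```

In practice you'd also want retry handling and a checkpoint of the last sort values so an interrupted run can resume, but the skeleton above is the whole idea.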

There are pre-written tools like https://github.com/elasticsearch-dump/elasticsearch-dump that may help.

If you have a place to host it, an instance of Logstash and a configuration that combines an elasticsearch input with an http output (for Splunk HEC) would be relatively easy to manage.
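As a sketch of that Logstash pipeline (hosts, index pattern, time range, and token below are placeholders you'd replace for your environment):

```conf
input {
  elasticsearch {
    hosts => ["https://elastic.example.com:9200"]
    index => "logstash-*"
    query => '{ "query": { "range": { "@timestamp": { "gte": "now-2y" } } } }'
    size => 1000
    scroll => "5m"
    docinfo => true
  }
}

output {
  http {
    url => "https://splunk.example.com:8088/services/collector/event"
    http_method => "post"
    format => "json"
    headers => { "Authorization" => "Splunk <hec-token>" }
  }
}
```

Depending on how your HEC token is configured, you may also need a filter to wrap each document in the { "event": ... } envelope HEC expects, or you could send to the raw endpoint instead.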

If you don't have a large amount of data or if you're willing to limit yourself to 1 TB per day, a free Cribl Stream license could also do the job.

I'm happy to help brainstorm relatively simple solutions here.


Rafaelled
Engager

Thank you @tscroggins,
We have 2 Logstash servers, so I took one of them and made a conf file that sends data from Elastic to Splunk via HEC. The only issue now is that Logstash is running out of heap memory due to the size of the transfers. Working on fixing the pipeline now. Thanks again for the suggestions!
