Getting Data In

Elastic Integrator App

Rafaelled
Engager

Hello, I have been trying to migrate ELK data to Splunk. We have ELK data dating back two years, and I attempted to use the Elastic Integrator app from Splunkbase. I was able to set it up with SSL, and it's bringing in logs from the past 30 days. The issue is that if I try to change the timeframe in inputs.conf, it doesn't work, and if I try to use a wildcard for the index, that doesn't work either. Has anyone found a way around this? I'm also open to hearing any other suggestions for getting old ELK data into Splunk. Thank you.



App: https://splunkbase.splunk.com/app/4175

1 Solution

tscroggins
Influencer

Hi @Rafaelled,

Both parameters should work. See my previous post at https://community.splunk.com/t5/Getting-Data-In/integrating-Splunk-with-Elasticsearch/m-p/696647/hig... for a few limitations.

Depending on the number and size of documents you need to migrate, the app may not be appropriate. A custom REST input would give you the most flexibility with respect to the Elasticsearch Search API.
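A custom REST input along these lines could page through the Elasticsearch Search API with `search_after` and forward each page to Splunk HEC. This is only a sketch: the hostnames, index pattern, and HEC token below are placeholders, and the time range and field names (`@timestamp`) are assumptions you would adapt to your cluster.

```python
# Sketch: pull documents from the Elasticsearch Search API with
# search_after pagination and forward them to Splunk HEC.
# All endpoints, the index pattern, and the token are placeholders.
import json
import urllib.request

ES_URL = "https://elasticsearch.example.com:9200"                      # placeholder
HEC_URL = "https://splunk.example.com:8088/services/collector/event"   # placeholder
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"                     # placeholder

def build_search_body(page_size, search_after=None):
    """Build a Search API body that pages deterministically by sort keys."""
    body = {
        "size": page_size,
        # Assumed time field and range; adjust to your mappings.
        "query": {"range": {"@timestamp": {"gte": "now-2y"}}},
        # Sort on a unique tiebreaker so search_after never skips documents.
        "sort": [{"@timestamp": "asc"}, {"_id": "asc"}],
    }
    if search_after is not None:
        body["search_after"] = search_after
    return body

def to_hec_event(hit):
    """Wrap one Elasticsearch hit as a Splunk HEC event envelope."""
    return {"event": hit["_source"], "source": hit["_index"]}

def migrate(index="logstash-*", page_size=1000):
    """Loop: fetch a page from ES, post it to HEC, advance the cursor."""
    search_after = None
    while True:
        req = urllib.request.Request(
            f"{ES_URL}/{index}/_search",
            data=json.dumps(build_search_body(page_size, search_after)).encode(),
            headers={"Content-Type": "application/json"},
        )
        hits = json.load(urllib.request.urlopen(req))["hits"]["hits"]
        if not hits:
            break
        # HEC accepts newline-delimited JSON events in one POST.
        payload = "\n".join(json.dumps(to_hec_event(h)) for h in hits)
        hec_req = urllib.request.Request(
            HEC_URL,
            data=payload.encode(),
            headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        )
        urllib.request.urlopen(hec_req)
        search_after = hits[-1]["sort"]
```

Because the state is just the last `sort` value, a run that dies partway can be resumed from the last page it confirmed, which matters for a two-year backfill.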

There are pre-written tools like https://github.com/elasticsearch-dump/elasticsearch-dump that may help.

If you have a place to host it, an instance of Logstash and a configuration that combines an elasticsearch input with an http output (for Splunk HEC) would be relatively easy to manage.
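A minimal pipeline along those lines might look like the sketch below, assuming the stock elasticsearch input and http output plugins; the hosts, index pattern, query, and HEC token are placeholders to replace with your own values.

```conf
# Sketch: Logstash pipeline reading from Elasticsearch and posting to
# Splunk HEC. Hosts, index pattern, and token are placeholders.
input {
  elasticsearch {
    hosts   => ["https://elasticsearch.example.com:9200"]
    index   => "logstash-*"
    # Assumed time field; limits the pull to the two-year backfill window.
    query   => '{ "query": { "range": { "@timestamp": { "gte": "now-2y" } } } }'
    docinfo => true   # keep source index metadata on each event
  }
}
output {
  http {
    url         => "https://splunk.example.com:8088/services/collector/event"
    http_method => "post"
    format      => "json"
    headers     => { "Authorization" => "Splunk 00000000-0000-0000-0000-000000000000" }
  }
}
```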

If you don't have a large amount of data or if you're willing to limit yourself to 1 TB per day, a free Cribl Stream license could also do the job.

I'm happy to help brainstorm relatively simple solutions here.


Rafaelled
Engager

Thank you @tscroggins,
We have two Logstash servers, so I took one of them and created a conf file that sends data from Elastic to Splunk via HEC. The only issue now is that Logstash is running out of heap memory due to the size of the transfers. I'm working on fixing the pipeline now. Thanks again for the suggestions!
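For a heap-bound bulk transfer like this, the usual levers are smaller scroll pages and in-flight batches, or a larger JVM heap. The values below are illustrative starting points, not tuned recommendations:

```conf
# In the pipeline's elasticsearch input block -- smaller scroll pages:
#   size => 500              # documents per scroll page (default 1000)

# In logstash.yml -- fewer events held in flight at once:
#   pipeline.batch.size: 125
#   pipeline.workers: 2

# In jvm.options -- more heap, if the host has memory to spare:
#   -Xms4g
#   -Xmx4g
```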
