Hello, I have been trying to export data via the REST API. I followed all the instructions from this thread:
But the jobs expire too soon when I export large amounts of data, and I never get everything I want because the export sticks at 14% or 21%. I really don't know what to do. Is there a way to extend the job expiration via curl or anything like that?
Hi, I did everything in that post and I couldn't extend the lifetime. I just went into the Indexer and extended the job lifetime manually under "Job settings". That is enough for me, but if you know a more automatic way I would appreciate it.
Hello @sbbadri. Could you be a bit more specific? I did not understand what you meant. I appreciate your help.
I'm reviving this topic because I now have more time to learn about this.
Thank you everybody 😄
However, if you want to export large amounts of data, why not use curl from the command line?
curl -k -u admin:changeme https://localhost:8089/services/search/jobs/export -d search="search index=_internal earliest=-2s" -d output_mode=csv > ....(or similar)
I find the command line much more efficient for large exports; the above will dump the data directly into a file (which is probably what you are trying to do).
Hi garethatiag, I exported the data via the command line, but I always had the problem of jobs expiring because the lifetime is too short. My solution was to edit the job settings manually and extend the lifetime. That was OK at the time, but if I want something more automated it's not the best choice.
I changed the TTL in those files but did not get what I wanted: when I used the command line, the job still expired too soon. I was trying to export 200 GB of data.
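For anyone landing here later: a more automatic option than editing "Job settings" in the UI is Splunk's job control endpoint, which accepts a setttl action to extend a specific job's lifetime. The sketch below only builds the control URL; the host, credentials, SID, and TTL values are placeholders you would substitute from your own environment, and the actual curl call (which needs a live Splunk instance) is left commented out.

```shell
#!/bin/sh
# Sketch: extend a search job's TTL via the REST control endpoint.
# SPLUNK_HOST, SID, and TTL are hypothetical placeholders.
SPLUNK_HOST="https://localhost:8089"
SID="1658347263.1234"   # hypothetical search ID returned when the job was created
TTL=86400               # keep the job for 24 hours (in seconds)

# The control endpoint for this job:
CONTROL_URL="$SPLUNK_HOST/services/search/jobs/$SID/control"
echo "$CONTROL_URL"

# Against a live instance you would run (uncomment and adjust credentials):
# curl -k -u admin:changeme "$CONTROL_URL" -d action=setttl -d ttl=$TTL
```

Note also that, as I understand it, the /services/search/jobs/export endpoint used earlier in this thread streams results back directly rather than persisting a job on the server, so setttl applies to jobs created via /services/search/jobs, not to a streaming export.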