Having had an email back from the engineer and looked through digital_shadows.py: the app first tries to delete all the logs in the index it created, i.e.

    def delete_existing_pipeline(self):
        delete_query = 'search source=digital_shadows sourcetype=pipeline | delete'

then pulls all the logs in again.
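For context, here is a minimal sketch of how a script like this would typically issue that delete via Splunk's REST search endpoint. The endpoint, auth scheme, and function signature are my assumptions for illustration, not the app's actual code:

```python
# Hypothetical reconstruction for illustration only -- the real app's
# connection handling may differ.
import urllib.parse
import urllib.request

DELETE_QUERY = 'search source=digital_shadows sourcetype=pipeline | delete'

def delete_existing_pipeline(base_url: str, session_key: str) -> None:
    """POST the delete search to Splunk's search jobs endpoint.

    Note: '| delete' only marks events as unsearchable; it does not
    free disk space, and it requires the can_delete capability on the
    search head/indexer -- which is why it fails when the app runs on a
    heavy forwarder that has no such access to the index.
    """
    data = urllib.parse.urlencode({
        "search": DELETE_QUERY,
        "exec_mode": "blocking",  # wait for the delete job to finish
    }).encode()
    req = urllib.request.Request(
        f"{base_url}/services/search/jobs",
        data=data,
        headers={"Authorization": f"Splunk {session_key}"},
    )
    urllib.request.urlopen(req)
```

This is exactly the pattern that breaks on a heavy forwarder: the search (and the delete it pipes into) has to run where the index lives.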
This delete can't be done when using a heavy forwarder to pull the logs, as the forwarder won't be able to access or delete anything in the index.
Also, am I right in saying that the delete command doesn't actually delete the data, it just makes it unsearchable? If so, we'd end up storing a huge amount of replicated data.
Any ideas what the next move is, other than rewriting the app?