I have a Python script that calls the API every day. Whenever the script runs, it calls the API and fetches all-time data. The issue I have is avoiding duplicate data when it gets ingested into Splunk. I have a unique field in the JSON data that could be used to avoid duplicate entries. How do I configure that in props.conf? Can you please help?
Hi @hashsplunk,
It is not possible to check new events against already indexed events at ingest time, so this cannot be configured in props.conf. You can do it in your script instead: keep the latest unique field value in a file and send only the events that come after it to Splunk.
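The checkpoint approach above could look roughly like this in Python. It is only a minimal sketch: the field name `event_id`, the checkpoint file `last_event_ids.txt`, and the sample events are assumptions, and the actual API call and the send to Splunk (e.g. via HEC) are left as placeholders for your own code.

```python
import json
from pathlib import Path

# Hypothetical checkpoint file that persists the IDs already sent to Splunk.
CHECKPOINT = Path("last_event_ids.txt")

def load_seen_ids(path: Path = CHECKPOINT) -> set:
    """Return the set of unique IDs already forwarded to Splunk."""
    if path.exists():
        return set(path.read_text().splitlines())
    return set()

def filter_new_events(events: list, seen_ids: set, key: str = "event_id") -> list:
    """Keep only events whose unique field has not been seen before."""
    return [e for e in events if str(e[key]) not in seen_ids]

def save_seen_ids(seen_ids: set, path: Path = CHECKPOINT) -> None:
    """Persist the updated ID set for the next daily run."""
    path.write_text("\n".join(sorted(seen_ids)))

# Example run: pretend today's API call returned these events.
events = [
    {"event_id": 1, "msg": "a"},
    {"event_id": 2, "msg": "b"},
]
seen = {"1"}  # ID 1 was already indexed on a previous run
new_events = filter_new_events(events, seen)
# ... send new_events to Splunk here (e.g. HTTP Event Collector) ...
seen.update(str(e["event_id"]) for e in new_events)
```

On each run the script loads the checkpoint, filters out anything already sent, forwards only the remainder, and then rewrites the checkpoint, so duplicates never reach the index in the first place.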