
Avoid duplicate entries of JSON data in Splunk

hashsplunk
Loves-to-Learn Lots

I have a Python script that calls an API every day. Whenever the script runs, it calls the API and fetches all-time data. The issue I have is avoiding duplicate data when it gets ingested into Splunk. I have a unique field in the JSON data that identifies duplicate entries. How do I configure that in props.conf? Can you please help?


scelikok
SplunkTrust

Hi @hashsplunk,

It is not possible to check new events against already indexed events at index time, so props.conf cannot do this deduplication for you. You can handle it in your script instead: keep the latest unique field value in a file and only send events newer than that value to Splunk.
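A minimal sketch of that approach in Python, assuming the unique field is a monotonically increasing numeric id and that events are forwarded to Splunk via the HTTP Event Collector; the state file name, HEC URL, token, and the field name "id" are placeholders for illustration, not values from your setup:

import json
import os
import urllib.request

# Placeholder values for illustration; adjust to your environment.
STATE_FILE = "last_seen_id.txt"   # watermark: newest unique id already sent to Splunk
HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "your-hec-token"


def load_last_seen():
    """Return the unique id saved by the previous run, or None on the first run."""
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            text = f.read().strip()
            return int(text) if text else None
    return None


def save_last_seen(value):
    """Persist the newest unique id so the next run can skip everything older."""
    with open(STATE_FILE, "w") as f:
        f.write(str(value))


def send_to_splunk(event):
    """Send a single JSON event to the HTTP Event Collector."""
    req = urllib.request.Request(
        HEC_URL,
        data=json.dumps({"event": event}).encode("utf-8"),
        headers={"Authorization": "Splunk " + HEC_TOKEN},
    )
    urllib.request.urlopen(req)  # raises on HTTP errors; add TLS/retry handling as needed


def ingest(api_results):
    """api_results: list of dicts from the API, each carrying a numeric unique field 'id'."""
    last_seen = load_last_seen()
    new_events = [e for e in api_results
                  if last_seen is None or int(e["id"]) > last_seen]
    for event in new_events:
        send_to_splunk(event)
    if new_events:
        save_last_seen(max(int(e["id"]) for e in new_events))

If the unique field is not ordered, store the full set of already-sent ids in the state file and skip any event whose id is already in that set, instead of keeping a single watermark.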

If this reply helps you, an upvote and "Accept as Solution" is appreciated.