
Avoid duplicate entries of JSON data in Splunk

hashsplunk
Loves-to-Learn Lots

I have a Python script that calls an API every day. Whenever the script runs, it fetches all-time data from the API. The issue is avoiding duplicate data when it gets ingested into Splunk. The JSON data has a unique field that could be used to detect duplicate entries. How do I configure that in props.conf? Can you please help?


scelikok
SplunkTrust

Hi @hashsplunk,

It is not possible at index time to compare new events against already indexed events, so props.conf cannot filter these duplicates for you. You can do this in your script instead: keep the latest unique field value in a state file and send only events newer than it to Splunk.
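
Something like the sketch below, assuming the API returns a JSON array of objects and events are pushed to Splunk over the HTTP Event Collector (HEC). The endpoint URL, HEC token, and the "id" field name are placeholders, and this version keeps the whole set of already-sent IDs in the state file rather than only the latest value, since the post does not say whether the unique field is ordered:

# Minimal sketch of the state-file approach. All URLs, the token, and
# the "id" field name are placeholders, not from the original post.
import json
import os

import requests

API_URL = "https://example.com/api/data"            # hypothetical API endpoint
HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"  # placeholder HEC token
STATE_FILE = "sent_ids.json"                        # IDs already sent to Splunk


def load_sent_ids():
    """Return the set of unique IDs that were already sent to Splunk."""
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            return set(json.load(f))
    return set()


def save_sent_ids(ids):
    with open(STATE_FILE, "w") as f:
        json.dump(sorted(ids), f)


def main():
    sent_ids = load_sent_ids()
    # The API returns all-time data on every run.
    records = requests.get(API_URL, timeout=30).json()

    # Only forward records whose unique field has not been sent before.
    new_records = [r for r in records if r["id"] not in sent_ids]
    for record in new_records:
        resp = requests.post(
            HEC_URL,
            headers={"Authorization": f"Splunk {HEC_TOKEN}"},
            json={"event": record, "sourcetype": "_json"},
            verify=False,  # adjust TLS verification for your environment
        )
        resp.raise_for_status()
        sent_ids.add(record["id"])

    save_sent_ids(sent_ids)


if __name__ == "__main__":
    main()

At search time you can also suppress duplicates with | dedup <your_unique_field>, but filtering in the script keeps the duplicates from being indexed (and counted against your license) in the first place.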

If this reply helps you, an upvote and "Accept as Solution" would be appreciated.