
DBConnect duplicates data

lodani
Engager

In DB Connect I configured a tailed database input and it is working fine. I use a true incremental field to track inserts and updates. But every update is added to the index as a duplicate event.

Is there any way to configure DB Connect to recognize updated rows by their primary key and avoid duplicates?

skender27
Contributor

Hi,

I resolved almost the same issue (I am using version 2.3 of the app) by configuring the rising column in the ADVANCED tab settings.
You should put a '?' in the query so that each scheduled run (depending on the cron schedule) starts from the stored checkpoint threshold (the last, highest or lowest, value seen). That way the query does not return the same results and create duplicates every time it is executed.
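
As an illustration, a rising-column query might look like the sketch below. The table and column names (orders, last_modified) are hypothetical; the '?' placeholder is the checkpoint value that DB Connect substitutes at run time with the last value it saved.

-- Rising column input query (hypothetical table and column names).
-- DB Connect replaces '?' with the checkpoint saved after the previous run,
-- so only rows newer than that checkpoint are returned.
SELECT id, status, last_modified
FROM orders
WHERE last_modified > ?
ORDER BY last_modified ASC

The column compared against '?' should be the same one you select as the rising column, and it should only ever increase (for example an auto-increment ID or an update timestamp).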

Try reading the DB Connect user guide:
http://docs.splunk.com/Documentation/DBX/2.3.0/DeployDBX/HowSplunkDBConnectworks

Have a nice day,
Skender


laiyongmao
Path Finder

I ran into the same problem. Someone suggested setting a timestamp, but I think it is a bug. If you have solved it, please tell me how. Thank you.

http://answers.splunk.com/answers/114009/dbconnect-update-data-error
