All Apps and Add-ons

DBConnect duplicates data

lodani
Engager

In DB Connect I configured a tailed database input and it is working fine. I use a true incremental field to track inserts and updates, but every update is added to the index as a duplicate event.

Is there any way to configure DB Connect to recognize updated rows by their primary key and avoid duplicates?

skender27
Contributor

Hi,

I resolved almost the same issue (I am using version 2.3 of the app) by configuring the rising column setting on the Advanced tab.
You should put a '?' placeholder in the query so that Splunk runs the query (on the cron schedule) starting from the last checkpoint value of the rising column. That way you will not get the same results, and duplicates, every time the query is executed.

Try reading the DB Connect user guide:
http://docs.splunk.com/Documentation/DBX/2.3.0/DeployDBX/HowSplunkDBConnectworks
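As a sketch of the idea above: a rising-column input query uses the '?' placeholder, which DB Connect replaces with the last checkpoint value of the rising column on each scheduled run (the table and column names here are hypothetical, not from the original post).

```sql
-- Hypothetical rising-column query for a DB Connect input.
-- '?' is replaced by the last recorded checkpoint value of last_updated,
-- so each run only returns rows newer than the previous run.
SELECT id, name, status, last_updated
FROM my_table
WHERE last_updated > ?
ORDER BY last_updated ASC
```

Note that if rows are updated in place (and the rising column advances), the updated row is still re-indexed as a new event; deduplication by primary key would then have to happen at search time.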

Have a nice day,
Skender


laiyongmao
Path Finder

I have met the same problem. I think someone suggested setting a timestamp, but I believe it is a bug. If you have solved it, please tell me how. Thank you.

http://answers.splunk.com/answers/114009/dbconnect-update-data-error
