All Apps and Add-ons

DB Connect inputs.conf related questions

yoyu777
Explorer

1) Where should I put the DB Connect inputs.conf? in the same inputs.conf that Splunk uses (in $SPLUNK_HOME/etc/system/local/)?

2) How can I configure the inputs.conf for a specific connection?

3) How can I find a more detailed specification of what each parameter means in the DB Connect inputs.conf?
For example,
what does "unique_key = " mean in the "mi_output" stanza?

The problem we are trying to solve: we have a database monitored by Splunk DB Connect using a rising column, "last_updated_time". Each row has a unique ID, but when "last_updated_time" changes, the old record with the same ID is not replaced in Splunk, so we end up with duplicate records for each ID. For our analysis, we only want the latest record for each ID. I imagine this use case is not uncommon; what is the best way to achieve it in Splunk?

0 Karma
1 Solution

richgalloway
SplunkTrust
SplunkTrust

If you use the DB Connect UI to configure inputs you don't have to worry about where the config file goes as the app will take care of that.

The full documentation for DB Connect's inputs.conf file is here.

DB Connect does not update data in Splunk. Once data has been indexed in Splunk it cannot be altered in any way. It is up to your search commands to filter out duplicate data (perhaps by using dedup uniqueID sortby -last_updated_time, which keeps only the event with the newest last_updated_time for each uniqueID).
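A minimal search sketch of that dedup approach, assuming the field names are uniqueID and last_updated_time and a placeholder index name (adjust these to your actual data):

```
index=your_db_index sourcetype=your_dbx_sourcetype
| dedup uniqueID sortby -last_updated_time
```

The sortby -last_updated_time clause sorts events in descending order of that field before dedup keeps the first occurrence per uniqueID, so each ID contributes only its most recent record. An alternative with the same effect is | stats latest(*) as * by uniqueID, which can be more convenient when you want aggregated results rather than raw events.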

---
If this reply helps you, Karma would be appreciated.

