Hi,
I have a requirement to index data into Splunk from a number of databases.
Is there an easy way to set up multiple connections (each on a different port, to a different database) using the same identity and the same query, using the Splunk DB Connect app?
Thank you,
Nav
You mean easier than cloning connections and inputs through the DB Connect GUI?
You can create one connection and one input in the GUI, then for the remaining items edit the db_connections.conf and db_inputs.conf configuration files in etc/apps/splunk_app_db_connect/local/
by copy-pasting the initial config generated there and modifying the key settings, like the stanza names and the database host/port.
You could even create a template based on the config generated from the GUI and use some script to generate a whole bunch of config stanzas that you then paste into those files.
Note: when you use DB Connect in rising column mode, you also need to create a checkpoint file in /opt/splunk/var/lib/splunk/modinputs/server/splunk_app_db_connect/
(named after the input) with similar content as the file generated there for the input created through the GUI.
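The template-and-script idea above could look something like this minimal sketch. The stanza fields mirror the Sybase example later in this thread, but the stanza names, hosts, ports, and the `database` key here are made-up placeholders for illustration, not verified DBX settings for your environment:

```python
# Sketch: generate DB Connect db_connections.conf stanzas from a template
# and a list of connection details. All names/hosts below are placeholders.

CONNECTION_TEMPLATE = """[{name}]
connection_type = sybase_ase
fetch_size = 1000
host = {host}
identity = Sybase_DB_TEST
jdbcUseSSL = 0
port = {port}
database = {database}
"""

# Hypothetical list of databases to onboard: (stanza name, host, port, db).
DATABASES = [
    ("Sybase_prod_1", "dbhost1.example.com", 27300, "sales"),
    ("Sybase_prod_2", "dbhost2.example.com", 27301, "hr"),
]

def render_stanzas(template, rows):
    """Fill the template once per (name, host, port, database) row."""
    return "\n".join(
        template.format(name=n, host=h, port=p, database=d)
        for n, h, p, d in rows
    )

if __name__ == "__main__":
    # Paste the output into etc/apps/splunk_app_db_connect/local/db_connections.conf
    print(render_stanzas(CONNECTION_TEMPLATE, DATABASES))
```

The same approach works for db_inputs.conf stanzas; with a longer list you could read the connection details from a CSV instead of hard-coding them.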
Hi Frank,
Hi Frank, yes, compared to cloning the connections and inputs through the GUI, editing the conf files directly is pretty easy. But can we do something like the following in db_connections.conf (or another relevant conf file)? Please bear with me:
[Sybase_testing]
connection_type = sybase_ase
fetch_size = 1000
host = ((host1|port1|database1)(host2|port2|database2)..)
identity = Sybase_DB_TEST
jdbcUrlFormat = jdbc:sybase:Tds:&lt;host&gt;:&lt;port&gt;/&lt;database&gt;
jdbcUseSSL = 0
port = 27300
and then use this single connection in db_inputs.conf?
No, I don't think it works that way unfortunately 🙂
I know of a case where more than 1000 Oracle servers were onboarded for audit log collection through DBX, and there they used a script to generate the conf stanzas from a template and a list of connection details. I think that would be your best bet if you have to onboard a lot of DBX sources.
I have a question regarding your case with over 1000 databases onboarded. How many servers with the DB Connect add-on were needed for that? I've heard that one DB Connect indexer/heavy forwarder can handle about 70 connections, depending on the environment and log volume.
Not sure about those details unfortunately.
I guess in any case it strongly depends on the amount of events coming from those database servers as well. Probably best to start small and scale out slowly, while closely monitoring for issues and somehow checking if you're not missing data.
Or steer away from DBX for this altogether and collect the databases' audit logs locally using a universal forwarder, or have the database export them over syslog (if supported).
Thank you for your response Frank.
It looks like this kind of bulk operation is available in version 3.1.0 of the DB Connect app, though I still need to test whether it meets my requirement.
I'll check that; if it doesn't, I might also need to work on scripting like you mentioned 🙂
Ah, right, never noticed that. But only for inputs by the looks of it, so you would still have to define all the connections first. But then you might be able to script the connection confs and then use that bulk feature to create the inputs. That should then perhaps also make the checkpoint/rising column bit a lot easier and less tricky.
yeah hoping so, will try..