All Apps and Add-ons

How can we get data from database tables into Splunk without using the DB Connect app?

Anantha123
Communicator

Hi,
I want to get tables from a database and ingest them into Splunk.
My company does not want to use DB Connect.
Is there any way to get the data into Splunk other than manually extracting the tables into lookup tables?
I am looking for an automatic option where the lookup files in Splunk get updated at regular time intervals.

Thanks
Anantha.


Anantha123
Communicator

Thank you @gcusello and @rich7177 for the input. I will check with my manager and act accordingly. These suggestions are very valuable.


gcusello
SplunkTrust
SplunkTrust

Hi Anantha123,
if you can install a Universal Forwarder on your DB server, you can schedule a query on your DB that writes its results to a text file (CSV or TXT), and then read this file with the Universal Forwarder.
Obviously you won't have real time, but (if the load on your DB servers permits) you can schedule a frequent extraction (e.g., a query every one or two minutes) and have a near-real-time situation.
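The scheduled-query-to-file idea above can be sketched as a small shell script. This is a hedged sketch only: it uses sqlite3 for illustration (swap in your RDBMS's CLI client such as mysql, psql, or sqlcmd), and the database path, output directory, and table/column names are placeholder assumptions.

```shell
#!/bin/sh
# Sketch: export a table to CSV for a Universal Forwarder to pick up.
# sqlite3 is used only for illustration; the paths and the "orders"
# table are hypothetical placeholders.

DB=/var/lib/app/app.db        # hypothetical database file
OUTDIR=/var/splunk_exports    # directory the forwarder monitors
STAMP=$(date +%Y%m%d%H%M%S)

mkdir -p "$OUTDIR"

# Write the query result as CSV with a header row; a unique filename
# per run avoids the forwarder re-reading a half-written file.
sqlite3 -header -csv "$DB" \
  "SELECT id, created_at, status FROM orders;" \
  > "$OUTDIR/orders_$STAMP.csv"
```

You would then schedule this with cron at whatever interval the DB load permits, e.g. `*/2 * * * * /opt/scripts/export_orders.sh` for an export every two minutes.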

Then you have to schedule a job that deletes older files (e.g., from the previous hour or day) from your file system.
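The cleanup job above can be a single `find` command scheduled via cron; the directory and filename pattern here are placeholder assumptions matching a hypothetical export layout.

```shell
#!/bin/sh
# Sketch: prune export files older than one day so the directory (and
# the forwarder's file tracking) stays small. Path is a placeholder.
find /var/splunk_exports -name 'orders_*.csv' -mtime +1 -delete
```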

The real problem is creating a query that extracts only new data, to avoid ingesting duplicates, which means more license consumption!
If license isn't a problem, you can manage duplicates in your searches.
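The "only new data" problem above is commonly handled with a high-water mark: remember the largest key already exported and query only beyond it. A hedged sketch, again using sqlite3 for illustration with hypothetical paths and a hypothetical append-only integer `id` column:

```shell
#!/bin/sh
# Sketch: export only rows added since the last run by keeping a
# high-water mark (largest id already exported) in a marker file.
# sqlite3, paths, and the "orders" table are illustrative assumptions.

DB=/var/lib/app/app.db
OUTDIR=/var/splunk_exports
MARK="$OUTDIR/.last_id"

LAST=$(cat "$MARK" 2>/dev/null || echo 0)
STAMP=$(date +%Y%m%d%H%M%S)
OUT="$OUTDIR/orders_$STAMP.csv"

mkdir -p "$OUTDIR"

# Export only rows beyond the previous high-water mark.
sqlite3 -header -csv "$DB" \
  "SELECT id, created_at, status FROM orders WHERE id > $LAST;" > "$OUT"

# Advance the mark to the highest id now in the table.
sqlite3 "$DB" "SELECT COALESCE(MAX(id), $LAST) FROM orders;" > "$MARK"

# Drop the file if the query returned only the header row (no new rows).
[ "$(wc -l < "$OUT")" -le 1 ] && rm -f "$OUT"
```

This only works cleanly when the key is append-only; for tables with updates you would need a reliable `last_modified` timestamp column instead.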

Bye.
Giuseppe


Richfez
SplunkTrust
SplunkTrust

Your best method is at the end of this post, but here is an interim way to get it done in the meantime:

It isn't absolutely required to have DB Connect to get data from a DB into Splunk. If you can write a table to a CSV file (or other formats, too), you can ingest that or use it as a lookup, as long as you can get the file from wherever it's written to your Splunk server.

The elephant in the room is "How can I write tables out to disk from my RDBMS?" Well, if we're talking Microsoft SQL Server, BCP and SSIS can both do this, and there are some newer products too. They can run a scheduled package that dumps the data out (with any transformations needed). For other products, you'll have to search the web for your product plus "bulk export" or "scheduled export" or some such.

Once you have a file, you can use a monitor stanza (i.e., "tail the file") in Splunk to read the data in (you might have to copy it to the Splunk system first). Or you could use it as a batch/sinkhole input and have Splunk clean it up each time. Or, if it's not huge, just use it as-is as a lookup.
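The monitor and batch/sinkhole options above look roughly like this in a forwarder's `inputs.conf`. This is a hedged sketch: the paths, index, and sourcetype are placeholder assumptions you'd replace with your own.

```ini
# Sketch of inputs.conf stanzas; index, sourcetype, and paths are
# hypothetical placeholders.

# "Tail the file" style: files stay on disk, Splunk tracks what it read.
[monitor:///var/splunk_exports/orders_*.csv]
index = db_exports
sourcetype = orders_csv

# Batch ("sinkhole") style: ingest each file once, then delete it.
[batch:///var/splunk_exports/orders_*.csv]
move_policy = sinkhole
index = db_exports
sourcetype = orders_csv
```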

BUT
I believe your best direction is to start a conversation with management about why you need this data and why DB Connect is the right way to get it. It's really a business decision: does the usefulness of the app, and the far easier, more timely, accurate, and versatile access you'll have to the data, outweigh whatever concerns they have with DB Connect, or not? And what exactly are those concerns?

Having seen this a lot, my guess is that no one actually evaluated the pros and cons, did a risk assessment, and decided "Nope, not doing that." What instead happened was someone asked Randy, and Randy being Randy, he said "What, just let some random Splunk server talk to my precious data? NO WAY!" But it's not Randy's data, and that's the point. It's the business's data, and if the value of having it accessible directly in Splunk outweighs the concerns, the business people will push to make it happen.
