Monitoring Postgres Table

autovhcdev
New Member

We have a "stats" table on a Postgres server. Does anyone know how to get Splunk to monitor this? I suspect it involves a script... someone must have already done something like this?


southeringtonp
Motivator

If you just want to dump the contents of that table every XXX minutes, it should be very easy to do.

Just write a shell script or batch file that runs the command-line postgres client and dumps the table(s) you want, and have Splunk index the output. Basically, any query you can run at the command line would do.

Take a look at the documentation on scripted inputs - that should help get you started.

http://www.splunk.com/base/Documentation/4.1.5/Admin/Setupcustom(scripted)inputs
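
A minimal sketch of such a scripted input, here using Python to shell out to the psql client (a plain shell script that prints the query results to stdout would work just as well). The database name, user, and table ("statsdb", "splunk", "stats") are placeholders - substitute your own connection details:

    #!/usr/bin/env python
    # Scripted-input sketch: run a query via the psql command-line client
    # and print the rows to stdout so Splunk can index them.
    import subprocess
    import sys

    QUERY = "SELECT * FROM stats;"

    # -A/-F give unaligned, pipe-delimited output; -t suppresses headers
    result = subprocess.run(
        ["psql", "-d", "statsdb", "-U", "splunk",
         "-A", "-F", "|", "-t", "-c", QUERY],
        capture_output=True, text=True
    )
    if result.returncode != 0:
        sys.stderr.write(result.stderr)
        sys.exit(1)
    sys.stdout.write(result.stdout)
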


If the table you want to monitor is continually growing (i.e., you're continually logging stats over time), then your problem is the same as for any other application that logs to a database.

You may wish to consider having whatever populates the stats table log directly to Splunk, if that's feasible. Assuming it isn't, then you need to do a little more scripting work, and you should consider using Python instead of dealing with shell scripts and psql:

It depends on how your table is structured, but here's a common approach if your table has an increasing primary key value or timestamp (see the sketch after the list):

  • Keep track of the last-seen record ID in a file
  • Build your SQL query to retrieve all records with ID values higher than the last one you saw
  • Dump the query results to stdout
  • Update the file with the highest ID value retrieved by your query
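
Here is a rough sketch of that approach, assuming a "stats" table with an increasing integer primary key named "id", the psycopg2 driver installed, and a writable state file path - all of which you would adjust to your own schema and environment:

    #!/usr/bin/env python
    # Incremental scripted-input sketch: only emit rows added since the last run.
    import os
    import psycopg2

    STATE_FILE = "/opt/splunk/var/lib/stats_lastid.txt"  # placeholder path

    # Read the last-seen ID from the state file (default to 0 on first run)
    last_id = 0
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            last_id = int(f.read().strip() or 0)

    conn = psycopg2.connect("dbname=statsdb user=splunk")
    cur = conn.cursor()
    cur.execute("SELECT * FROM stats WHERE id > %s ORDER BY id", (last_id,))
    cols = [d[0] for d in cur.description]

    max_id = last_id
    for row in cur.fetchall():
        rec = dict(zip(cols, row))
        # Print key=value pairs so Splunk can extract fields automatically
        print(" ".join("%s=%s" % (k, v) for k, v in rec.items()))
        max_id = rec["id"]

    # Remember the highest ID emitted so the next run starts after it
    with open(STATE_FILE, "w") as f:
        f.write(str(max_id))

    cur.close()
    conn.close()
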

southeringtonp
Motivator

To be clear, the original suggestion was not advocating dumping the entire database, just the results of a single query. It depends on how your data is structured though - if you're continually adding new records, then it's more like a traditional log table than just a list of stats. See edits above for more information.


Lowell
Super Champion

What do you mean by "directly"? Splunk only indexes textual data, so at some point the records have to be converted into a text-based format that Splunk can index. There is really no concept of "adapters" or other such product-specific things, if that's what you're thinking.


autovhcdev
New Member

The database is way too large to dump out. Is there a way to index it directly?
