AWS Database logs to Splunk

Explorer

Hi All,

What is the recommended way to get AWS database logs into Splunk? Is this already covered by the AWS Add-on, which captures CloudTrail and CloudWatch logs, or do we need a DB Connect integration to get AWS database logs into Splunk?

The logs we are looking for are statements such as CREATE TABLE, DROP TABLE, etc.

1 Solution

Ultra Champion

Hi @samadmemon

Have you seen this:
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_LogAccess.Concepts.MariaDB.html#USER_Log...
(You can do the same for MySQL, Oracle, and PostgreSQL.)

Once the logs are in CloudWatch, you can use the AWS Add-on (and optionally install the AWS App) to index the CloudWatch logs.
https://splunkbase.splunk.com/app/1876/
https://splunkbase.splunk.com/app/1274/

The Add-on will allow you to configure inputs to collect CloudWatch logs.
Documentation:
https://docs.splunk.com/Documentation/AddOns/released/AWS/CloudWatchLogs
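
For reference, a CloudWatch Logs input configured through the add-on ends up as a stanza in inputs.conf. The sketch below is illustrative only: the stanza type and parameter names are assumptions from memory and should be verified against the add-on's inputs.conf.spec and the documentation linked above.

```ini
# Hypothetical example: collect one RDS PostgreSQL log group.
# Verify stanza/parameter names against the Splunk Add-on for AWS docs.
[aws_cloudwatch_logs://rds_psql_prod]
account = my_aws_account          ; AWS account name as configured in the add-on (assumed)
region = us-east-1
groups = /aws/rds/instance/service-psql-prod/postgresql
stream_matcher = .*               ; regex selecting which log streams to collect
interval = 600
sourcetype = aws:cloudwatchlogs
index = aws
```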


New Member

In the guide https://docs.splunk.com/Documentation/AddOns/released/AWS/CloudWatchLogs, it says Splunk strongly recommends against using the CloudWatch Logs inputs to collect VPC Flow Logs data (source type: aws:cloudwatchlogs:vpcflow), since the input type will be deprecated in upcoming releases. Does this apply to RDS as well, or is RDS safe?

  1. Log group: a comma-separated list of log group names. Is there a size limit to this field? Also, is there an API call that updates this value when a new instance is created?

  2. Configuration file: in the Log group section I have to specify the name of the instance, e.g. /aws/rds/instance/service-psql-prod/postgresql. Is there a limit on the length of this field? I have hundreds of servers and would like to update this list frequently when a new server goes online. Is there an API that can do this?
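
Whatever the field limit turns out to be, the list itself can be generated programmatically: the CloudWatch Logs DescribeLogGroups API accepts a name prefix, so a small script can enumerate every /aws/rds/ log group and emit the comma-separated value for the input. A minimal sketch, assuming boto3 and working AWS credentials; the region is a placeholder:

```python
# Sketch: enumerate RDS log groups and build the comma-separated list
# that the add-on's "Log group" field expects.

def join_log_groups(names):
    """Join log group names into the comma-separated form the field expects."""
    return ",".join(sorted(names))

def rds_log_groups(region="us-east-1"):
    """Return all log group names under the /aws/rds/ prefix."""
    import boto3  # imported here so join_log_groups stays dependency-free
    logs = boto3.client("logs", region_name=region)
    groups = []
    for page in logs.get_paginator("describe_log_groups").paginate(
        logGroupNamePrefix="/aws/rds/"
    ):
        groups.extend(g["logGroupName"] for g in page["logGroups"])
    return groups

if __name__ == "__main__":
    print(join_log_groups(rds_log_groups()))
```

Run on a schedule, this could regenerate the list whenever new instances come online; pushing the value back into the input would then be a matter of editing the configuration (for example via Splunk's configuration REST endpoints).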


Explorer

Hi,

For multiple instances, you can configure the CloudWatch logs to be delivered to an S3 bucket in a folder-based structure and integrate the S3 bucket with Splunk.

That way you can use a single Splunk input for multiple instances sending logs to a single S3 bucket.
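
One way to sketch that folder-based layout (assuming boto3, and a bucket whose policy already allows CloudWatch Logs to write to it): the bucket name and prefix scheme below are hypothetical. Note that create_export_task is a one-shot export, not continuous delivery; continuous delivery would instead use a subscription filter, e.g. to Kinesis Data Firehose.

```python
# Sketch: export each RDS log group into a per-instance prefix of one S3
# bucket, so a single Splunk S3 input can pick up all instances.

def export_prefix(log_group):
    """Map /aws/rds/instance/<name>/<log> to an S3 prefix like rds/<name>/<log>."""
    parts = log_group.strip("/").split("/")
    return "rds/" + "/".join(parts[3:])

def export_group(log_group, bucket, from_ms, to_ms, region="us-east-1"):
    """Start a one-shot CloudWatch Logs export task into the bucket."""
    import boto3  # imported here so export_prefix stays dependency-free
    logs = boto3.client("logs", region_name=region)
    return logs.create_export_task(
        taskName="splunk-" + log_group.strip("/").replace("/", "-"),
        logGroupName=log_group,
        fromTime=from_ms,   # epoch milliseconds
        to=to_ms,
        destination=bucket,
        destinationPrefix=export_prefix(log_group),
    )
```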


Explorer

Thanks @nickhillscpl. I was able to integrate the DB logs by publishing them to CloudWatch.

The only issue is that the logs are not being parsed by the AWS Add-on. Could you please advise, or is the only option now to use field extractions to parse the logs?


Engager

Did you mean that the logs are not parsing as Oracle events? That's what I am running into. I have all the metrics and logs pulling from CloudWatch / CloudWatch Logs, but I am having trouble figuring out how to run them through the Oracle Add-on, which is what would parse the logs. The AWS App just looks at the perf metrics and descriptions but doesn't have anything native for the DB logs.

How did you solve that part?


Engager

Actually, yours is SQL, but I would think the same idea applies to Oracle...


New Member

Did you ever have any luck with this?


Explorer

Yes, please let me know how I can help.

