
How can I analyze Splunk Phantom's PostgreSQL logs using pgBadger?

awilcox_splunk
Splunk Employee

What are the best configuration settings for using pgBadger to analyze Splunk Phantom's PostgreSQL logs?

Solution

awilcox_splunk
Splunk Employee

PostgreSQL provides configuration parameters to control error reporting and logging. Increasing the logging level and analyzing the resulting logs with pgBadger, a PostgreSQL log analysis tool, can provide a better understanding of query behavior. Logging can be run at an elevated level for a short period of time; once the log data has been collected, it can be analyzed using pgBadger.

Here are a few suggested parameters for increased logging, as well as a brief explanation of the pgBadger tool:

Increasing PostgreSQL Logging
PostgreSQL uses only the last value that is set for a parameter. For more information, see https://www.postgresql.org/docs/9.4/config-setting.html. From an administration standpoint, the simplest way to increase logging is to create a separate logging configuration file and then include that configuration file at the end of the main PostgreSQL configuration file. To do that, follow these steps:

  1. Create the file postgresql.logging.for.pgbadger.conf and include the following parameters in that file:
    log_min_duration_statement = 200   # log queries running longer than this many milliseconds
    log_checkpoints = on
    log_connections = on
    log_disconnections = on
    log_lock_waits = on
    log_temp_files = 0
    log_error_verbosity = default
    log_line_prefix = '%m [%p]: '
    log_rotation_size = 1GB   # or whatever size is manageable in the environment
    log_filename = 'postgresql-%a-%H%M%S.log'
    log_autovacuum_min_duration = 0
    
  2. Place the new configuration file alongside PostgreSQL's main configuration file, postgresql.conf. In a default Splunk Phantom installation, this is the PHANTOM_HOME/data/db directory.
  3. Ensure the file ownership and file permissions are the same as those of the postgresql.conf file.

  4. Edit the postgresql.conf file to include the new parameters by adding include_if_exists 'postgresql.logging.for.pgbadger.conf' as the last line of the file.

  5. Reload the PostgreSQL server to pick up the changes; running queries will not be interrupted or otherwise impacted. Use the systemctl reload postgresql-9.4 command to reload the server. A consolidated shell sketch of these steps follows this list.
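
As a consolidated sketch of steps 1 through 5, assuming a default installation where PHANTOM_HOME is /opt/phantom (adjust the path and service name for your environment):

    # Assumption: default Splunk Phantom install with PHANTOM_HOME=/opt/phantom
    cd /opt/phantom/data/db

    # Steps 1 and 2: create the logging configuration file, with the
    # parameters shown above, in the same directory as postgresql.conf
    vi postgresql.logging.for.pgbadger.conf

    # Step 3: match the ownership and permissions of postgresql.conf
    chown --reference=postgresql.conf postgresql.logging.for.pgbadger.conf
    chmod --reference=postgresql.conf postgresql.logging.for.pgbadger.conf

    # Step 4: include the new file as the last line of postgresql.conf
    echo "include_if_exists 'postgresql.logging.for.pgbadger.conf'" >> postgresql.conf

    # Step 5: reload the server; running queries are not interrupted
    systemctl reload postgresql-9.4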

The amount of additional log data that is generated will depend on the workload on the Splunk Phantom installation. Ideally, the additional logging should be enabled for a period of several hours or more during the system's typical busy period, or while an issue under investigation is occurring.

Because of the potential for PostgreSQL to generate more log data than is usual for your environment, the file system holding the logs should be actively monitored to ensure it doesn't run out of disk space. As a result of the log_filename and log_rotation_size parameters you will likely notice additional log files in the PostgreSQL log directory. In a default Splunk Phantom installation, the PostgreSQL log directory is the PHANTOM_HOME/data/db/pg_log directory.
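
For example, free space and log growth can be checked periodically with standard tools (the path below assumes a default installation):

    # Free space on the file system holding the PostgreSQL logs
    df -h /opt/phantom/data/db/pg_log

    # Total size of the log directory while elevated logging is enabled
    du -sh /opt/phantom/data/db/pg_log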

To undo the changes and revert logging to its previous level, edit the postgresql.conf file again to comment out the previously added include_if_exists line, then reload the server with the systemctl reload postgresql-9.4 command.
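
A minimal sketch of the revert, again assuming a default installation:

    cd /opt/phantom/data/db

    # Comment out the include_if_exists line added earlier
    sed -i "s/^include_if_exists 'postgresql.logging.for.pgbadger.conf'/# &/" postgresql.conf

    # Reload so logging returns to its previous level
    systemctl reload postgresql-9.4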

Use pgBadger
pgBadger is a PostgreSQL log analyzer and report generator. Here are a few useful links:

• Homepage: http://pgbadger.darold.net/
• Download pgBadger: https://github.com/darold/pgbadger/releases
• Documentation: https://pgbadger.darold.net/documentation.html

To use pgBadger, follow these steps:

  1. Download pgBadger from the releases page: https://github.com/darold/pgbadger/releases
  2. Transfer the downloaded file to the system on which the logs will be analyzed and unpack the archive; the Perl executable is named pgbadger (see the sketch after this list).
  3. Transfer the PostgreSQL log files to the same system.
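
For example, steps 1 and 2 might look like the following; the version number v12.4 is a placeholder, so check the releases page for the current release:

    # Download a release tarball (v12.4 is a placeholder version)
    wget https://github.com/darold/pgbadger/archive/refs/tags/v12.4.tar.gz

    # Unpack the archive; the pgbadger Perl script sits at the top level
    tar xzf v12.4.tar.gz
    cd pgbadger-12.4

    # Confirm the script runs
    ./pgbadger --version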

There are many command-line options, which are explained in the output of pgbadger --help.

A command like pgbadger /postgresql/logs/*.log can be used to generate an HTML report named out.html. If there are large volumes of log files, one useful option is -j, which enables parallel processing on multiple cores.
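
For example, the following generates a report from all logs in an assumed /postgresql/logs directory, using four parallel jobs and a custom report name:

    # -j 4: parse the logs with four parallel processes
    # -o:   write the HTML report to phantom-report.html instead of out.html
    pgbadger -j 4 -o phantom-report.html /postgresql/logs/*.log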

Note that pgBadger is a third-party tool; Splunk Phantom support cannot answer questions regarding pgBadger itself.
