
Uploading SQLAudit files into Splunk

Ismail_BSA
Path Finder

Hello,

In our unique environment, we face some limitations. We cannot directly install Splunk forwarders on the database servers, nor can we create a Splunk user account within the databases.

Here’s the situation:

  1. Server A (DB server): Our databases generate SQLAudit files.
  2. Server B (Relay): These SQLAudit files are transmitted from Server A to a different 'relay' server (let's call it Server B). Unfortunately, Server B also cannot accommodate Splunk forwarders.
  3. Server C (Universal Forwarder): From Server B, the audit files are further transmitted to another server (Server C). On Server C, we have a Splunk Universal forwarder that should upload the SQLAudit files to our Splunk Cloud instance.

The challenge lies in the fact that SQLAudit files are in a native format that Splunk cannot directly interpret.

While the ideal solution would be to install forwarders directly on the original DB servers (which is not feasible for us), we also recognize that using DB Connect and creating a Splunk account on the DB is not an option.

Given these constraints, are there any other viable options we can explore?

Best regards,


tscroggins
Influencer

Hi @Ismail_BSA,

We can use the SqlServer PowerShell module to read SQL Server audit files.

As an administrator, install the SqlServer PowerShell module under PowerShell 5.1, which should be installed by default on all modern Windows releases:

PS> Install-Module SqlServer
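
You can confirm the cmdlet used below is available before wiring anything into Splunk:

PS> Get-Command Read-SqlXEvent -Module SqlServer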

With the module installed, we can read .sqlaudit files created by SQL Server using Read-SqlXEvent. Column/field information is available at https://learn.microsoft.com/en-us/sql/relational-databases/security/auditing/sql-server-audit-record.... Columns with type bigint or varbinary will be read as byte arrays and must be converted to strings using a .NET object of the appropriate type.
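
For example, a minimal sketch of both conversions, where $field is a hypothetical stand-in for one entry of an event's Fields collection:

# bigint fields arrive as 8-byte little-endian arrays
$bitmask = [System.BitConverter]::ToInt64($field.Value, 0)

# SID fields are varbinary payloads that map to Windows security identifiers
$sid = New-Object System.Security.Principal.SecurityIdentifier($field.Value, 0)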

We can write a small PowerShell script to act as a stream reader for .sqlaudit files read by Splunk's archive processor (see below). Note that Read-SqlXEvent uses System.IO.Stream internally and calls Stream.Length, which throws "Stream does not support seeking" for forward-only streams. We'll work around this issue by copying the stream to a temporary file, reading the temporary file, and finally deleting the temporary file.

C:\Temp\Stream-SqlAudit.ps1

# System.Web isn't loaded by default under Windows PowerShell 5.1
Add-Type -AssemblyName System.Web

# Copy stdin (the .sqlaudit stream handed to us by Splunk's archive
# processor) to a temporary file; Read-SqlXEvent needs a seekable stream
$file = New-TemporaryFile
$output = $file.Open([System.IO.FileMode]::Append, [System.IO.FileAccess]::Write)
$stdin = [System.Console]::OpenStandardInput()
$stdout = [System.Console]::Out
$buffer = New-Object byte[] 16384
[int]$bytes = 0

while (($bytes = $stdin.Read($buffer, 0, $buffer.Length)) -gt 0) {
    $output.Write($buffer, 0, $bytes)
}

$output.Flush()
$output.Close()

# Emit one line per audit event: an ISO 8601 timestamp followed by
# key="value" pairs
Read-SqlXEvent -FileName $file.FullName | %{
    $event = $_.Timestamp.UtcDateTime.ToString("o")
    $_.Fields | %{
        # bigint values arrive as byte arrays; render the bitmask as hex
        if ($_.Key -eq "permission_bitmask") {
            $event += " permission_bitmask=`"0x$([System.BitConverter]::ToInt64($_.Value, 0).ToString("x16"))`""
        }
        # *_sid values are varbinary; convert to S-1-... form, or emit an
        # empty value if conversion fails
        elseif ($_.Key -like "*_sid") {
            $sid = $null
            $event += " $($_.Key)=`""

            try {
                $sid = New-Object System.Security.Principal.SecurityIdentifier($_.Value, 0)
                $event +=  "$($sid.ToString())`""
            }
            catch {
                $event += "`""
            }
        }
        else {
            # Escape backslashes and quotes for KV_MODE = auto_escaped
            $event +=  " $($_.Key)=`"$([System.Web.HttpUtility]::JavaScriptStringEncode($_.Value.ToString()))`""
        }
    }

    $stdout.WriteLine($event)
}

# Clean up the temporary copy
$file.Delete()
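
To sanity-check the script outside Splunk, you can feed it a sample file from cmd.exe, which passes raw bytes through stdin redirection (C:\Temp\sample.sqlaudit is a placeholder):

C:\> powershell.exe -ExecutionPolicy RemoteSigned -File C:\Temp\Stream-SqlAudit.ps1 < C:\Temp\sample.sqlaudit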

We can use the invalid_cause and unarchive_cmd props.conf settings to call the PowerShell script. Note that unarchive_cmd strips or escapes quotes depending on the value of unarchive_cmd_start_mode, so we've stored the PowerShell script in a path without spaces to avoid the use of quotes. If PowerShell can't find the path specified in the -File argument, it will exit with code -196608.

Sample props.conf on forwarders, receivers (heavy forwarders or indexers), and search heads:

[source::....sqlaudit]
unarchive_cmd = powershell.exe -ExecutionPolicy RemoteSigned -File C:\Temp\Stream-SqlAudit.ps1
unarchive_cmd_start_mode = direct
sourcetype = preprocess-sqlaudit
NO_BINARY_CHECK = true

[preprocess-sqlaudit]
invalid_cause = archive
is_valid = False
LEARN_MODEL = false

[sqlaudit]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
EVENT_BREAKER_ENABLE = true
EVENT_BREAKER = ([\r\n]+)
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%N%Z
MAX_TIMESTAMP_LOOKAHEAD = 30
KV_MODE = auto_escaped

We can use a batch or monitor stanza to monitor the directory containing .sqlaudit files. Use a batch stanza if the files are moved into the monitored directory atomically and you want Splunk Universal Forwarder to delete the files after they're indexed (see the batch example after the monitor stanza below).

Sample inputs.conf:

[monitor://C:\Temp\*.sqlaudit]
index = main
sourcetype = sqlaudit
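
If you opt for a batch stanza instead, note that batch inputs require move_policy = sinkhole, which deletes each file once it has been indexed:

[batch://C:\Temp\*.sqlaudit]
move_policy = sinkhole
index = main
sourcetype = sqlaudit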

The script can be refactored as a scripted input; however, using the archive processor allows Splunk to perform file and change tracking on our behalf.

Ismail_BSA
Path Finder

Hello @tscroggins

Thank you for your follow-up. However, I am not sure we can apply this solution to our environment, since our Server C (the one with the universal forwarder) is not a Windows server. The other servers (types A and B) are not connected to any external networks.

After some internal discussions, a new idea was proposed, and I am not sure if it could work. @tscroggins, @isoutamo, @scelikok, I would appreciate your feedback on it.

The idea is to deploy a heavy forwarder on Server B and a universal forwarder on Server C (the one connected to Splunk Cloud):

  1. Servers A (DB servers): Our databases generate SQLAudit files (and probably some Oracle DB audit files from similar servers). No external connections are allowed to this category of servers.
  2. Server B (Relay): This is the only server that can establish communications with the DB servers (category A). On this server, we can install a heavy forwarder + DB Connect to collect MSSQL audit logs (and Oracle audit logs from the Oracle servers). Please note that there are no external connections to this server and it cannot forward directly to Splunk Cloud.
  3. Server C (Universal Forwarder): The only one with allowed external connections. The heavy forwarder on Server B sends the collected logs to the universal forwarder on Server C. The universal forwarder then uploads the SQLAudit files and Oracle audit files to the Splunk Cloud instance.

Do you think this is a feasible setup?

Best regards,

tscroggins
Influencer

Also, yes, your proposal to install Splunk Enterprise on Server B and Splunk Universal Forwarder on Server C will allow you to run queries against Server A, assuming you have connectivity and a database account with appropriate access, and forward the events to Server C and downstream to Splunk Cloud.

Note, however, that sys.fn_get_audit_file does not scale. If you query .sqlaudit files through this function, your SQL Server administrator should store only the .sqlaudit files necessary to allow Splunk to execute queries and index events in a timely manner. That is, rotation and retention of live .sqlaudit files should be configured with Splunk and fn_get_audit_file performance in mind. You'll need to test performance in your environment to understand its constraints.
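
For reference, the kind of query DB Connect would run looks like the following sketch; the instance name and file path are placeholders, and Invoke-Sqlcmd (also from the SqlServer module) stands in for DB Connect here:

PS> Invoke-Sqlcmd -ServerInstance "ServerA" -Query @"
SELECT event_time, action_id, server_principal_name, statement
FROM sys.fn_get_audit_file('D:\SqlAudits\*.sqlaudit', DEFAULT, DEFAULT);
"@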

Ismail_BSA
Path Finder

Thank you @tscroggins 


tscroggins
Influencer

This should work on Linux:

Install PowerShell Core.

As the Splunk Universal Forwarder user (splunk or splunkfwd), install the SqlServer module as before:

$ pwsh
PS> Install-Module SqlServer

If Splunk Universal Forwarder runs as root, install the SqlServer module as root.

Copy Stream-SqlAudit.ps1 to an appropriate directory, e.g. $SPLUNK_HOME/bin/scripts. Note the addition of the interpreter directive on the first line.

#!/usr/bin/env pwsh
# Copy stdin (the .sqlaudit stream handed to us by Splunk's archive
# processor) to a temporary file; Read-SqlXEvent needs a seekable stream
$file = New-TemporaryFile
$output = $file.Open([System.IO.FileMode]::Append, [System.IO.FileAccess]::Write)
$stdin = [System.Console]::OpenStandardInput()
$stdout = [System.Console]::Out
$buffer = New-Object byte[] 16384
[int]$bytes = 0

while (($bytes = $stdin.Read($buffer, 0, $buffer.Length)) -gt 0) {
    $output.Write($buffer, 0, $bytes)
}

$output.Flush()
$output.Close()

# Emit one line per audit event; note $file.FullName rather than a
# backslash-joined path, which would break on Linux
Read-SqlXEvent -FileName $file.FullName | %{
    $event = $_.Timestamp.UtcDateTime.ToString("o")
    $_.Fields | %{
        if ($_.Key -eq "permission_bitmask") {
            $event += " permission_bitmask=`"0x$([System.BitConverter]::ToInt64($_.Value, 0).ToString("x16"))`""
        }
        elseif ($_.Key -like "*_sid") {
            $sid = $null
            $event += " $($_.Key)=`""

            try {
                $sid = New-Object System.Security.Principal.SecurityIdentifier($_.Value, 0)
                $event +=  "$($sid.ToString())`""
            }
            catch {
                $event += "`""
            }
        }
        else {
            $event +=  " $($_.Key)=`"$([System.Web.HttpUtility]::JavaScriptStringEncode($_.Value.ToString()))`""
        }
    }

    $stdout.WriteLine($event)
}

$file.Delete()

Make sure the file is executable, e.g.:

$ chmod 0750 $SPLUNK_HOME/bin/scripts/Stream-SqlAudit.ps1
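
As on Windows, you can sanity-check the script outside Splunk by redirecting a sample file into it (the sample path is a placeholder):

$ $SPLUNK_HOME/bin/scripts/Stream-SqlAudit.ps1 < /tmp/sample.sqlaudit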

Update props.conf:

[source::....sqlaudit]
unarchive_cmd = $SPLUNK_HOME/bin/scripts/Stream-SqlAudit.ps1
unarchive_cmd_start_mode = direct
sourcetype = preprocess-sqlaudit
NO_BINARY_CHECK = true

[preprocess-sqlaudit]
invalid_cause = archive
is_valid = False
LEARN_MODEL = false

[sqlaudit]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
EVENT_BREAKER_ENABLE = true
EVENT_BREAKER = ([\r\n]+)
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%N%Z
MAX_TIMESTAMP_LOOKAHEAD = 30
KV_MODE = auto_escaped

Update inputs.conf:

[monitor:///tmp/*.sqlaudit]
index = main
sourcetype = sqlaudit

/tmp is just an example. Use whatever file system and path makes the most sense for your deployment. The Splunk Universal Forwarder user must have read and execute access to all directories in the path and read access to the .sqlaudit files.

As before, your temporary directory should have enough free space to accommodate your largest .sqlaudit file. Depending on your Splunk configuration, Splunk Universal Forwarder may process multiple files concurrently. If that's the case, ensure you have enough free space for all temporary files.
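
New-TemporaryFile writes to the account's temporary directory, which honors the TMPDIR environment variable on Linux, so you can point it at a larger file system if needed. You can confirm where temporary copies will land with:

PS> [System.IO.Path]::GetTempPath()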

Finally, let us know how it goes!


tscroggins
Influencer

The SqlServer module is available for PowerShell Core on various platforms. I can test a solution on Linux x86-64, but I don't have access to a macOS or ARM host. What platform is Server C?


Ismail_BSA
Path Finder

Server C is SUSE 15


isoutamo
SplunkTrust
Nice! Have you planned to create a TA for this on Splunkbase? I think there are quite a few of us who would be interested in it!
r. Ismo

tscroggins
Influencer

Maybe? The prototype is here for anyone to grab. I'd need to find the time and resources for long-term maintenance of an app: development, build and integration, support, etc.


tscroggins
Influencer

Note that I used System.Web.HttpUtility.JavaScriptStringEncode as a shortcut for encoding/escaping strings. KV_MODE = auto_escaped only handles a few escape sequences. If you prefer, you can simply replace \ and " with \\ and \", respectively, in strings before writing them.
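
A minimal version of that replacement, escaping backslashes first so the quote escapes aren't themselves re-escaped:

$escaped = $_.Value.ToString().Replace('\', '\\').Replace('"', '\"')
$event += " $($_.Key)=`"$escaped`""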


scelikok
SplunkTrust

Hi @Ismail_BSA,

Splunk cannot convert/read these binary files. Maybe you can install SQL Server on Server C, import these audit files into that SQL Server instance, and query them with DB Connect.

If this reply helps you, an upvote and "Accept as Solution" is appreciated.

Ismail_BSA
Path Finder

Hi @scelikok

Thank you for your reply.

We will test this option and see if it works.

Best regards.
