Hello,
In our unique environment, we face some limitations. We cannot directly install Splunk forwarders on the database servers, nor can we create a Splunk user account within the databases.
Here’s the situation:
The challenge is that SQLAudit files are in a native binary format that Splunk cannot directly interpret.
The ideal solution would be to install forwarders directly on the original DB servers, but that is not feasible for us, and using DB Connect with a dedicated Splunk account on the database is likewise not an option.
Given these constraints, are there any other viable options we can explore?
Best regards,
Hi @Ismail_BSA,
We can use the SqlServer PowerShell module to read SQL Server audit files.
As an administrator, install the SqlServer PowerShell module under Windows PowerShell 5.1 (installed by default on all modern Windows releases):
PS> Install-Module SqlServer
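To confirm the module is available, check for the cmdlet we'll use below:
PS> Get-Command Read-SqlXEvent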
With the module installed, we can read .sqlaudit files created by SQL Server using Read-SqlXEvent. Column/field information is available at https://learn.microsoft.com/en-us/sql/relational-databases/security/auditing/sql-server-audit-record.... Columns with type bigint or varbinary will be read as byte arrays and must be converted to strings using a .NET object of the appropriate type.
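As a quick sanity check, we can inspect a single event interactively before wiring anything into Splunk. This is a minimal sketch: the file path is hypothetical, and only 8-byte values (bigint columns such as permission_bitmask) are converted here.
Read-SqlXEvent -FileName "C:\Temp\example.sqlaudit" | Select-Object -First 1 | ForEach-Object {
    $_.Fields | ForEach-Object {
        if ($_.Value -is [byte[]] -and $_.Value.Length -eq 8) {
            # bigint columns arrive as 8-byte arrays; render as a hex string
            "{0} = 0x{1:x16}" -f $_.Key, [System.BitConverter]::ToInt64($_.Value, 0)
        }
        else {
            "{0} = {1}" -f $_.Key, $_.Value
        }
    }
}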
We can write a small PowerShell script to act as a stream reader for .sqlaudit files read by Splunk's archive processor (see below). Note that Read-SqlXEvent uses System.IO.Stream internally and calls Stream.Length, which throws "Stream does not support seeking" for forward-only streams. We'll work around this issue by copying the stream to a temporary file, reading the temporary file, and finally deleting the temporary file.
C:\Temp\Stream-SqlAudit.ps1
# Copy stdin (the .sqlaudit stream handed to us by Splunk's archive
# processor) to a temporary file; Read-SqlXEvent needs a seekable stream.
$file = New-TemporaryFile
$output = $file.Open([System.IO.FileMode]::Append, [System.IO.FileAccess]::Write)
$stdin = [System.Console]::OpenStandardInput()
$stdout = [System.Console]::Out
$buffer = New-Object byte[] 16384
[int]$bytes = 0
while (($bytes = $stdin.Read($buffer, 0, $buffer.Length)) -gt 0) {
    $output.Write($buffer, 0, $bytes)
}
$output.Flush()
$output.Close()

# Emit one line per event: an ISO 8601 UTC timestamp followed by
# key="value" pairs.
Read-SqlXEvent -FileName $file.FullName | %{
    $event = $_.Timestamp.UtcDateTime.ToString("o")
    $_.Fields | %{
        if ($_.Key -eq "permission_bitmask") {
            # bigint columns are read as byte arrays; render as a hex string
            $event += " permission_bitmask=`"0x$([System.BitConverter]::ToInt64($_.Value, 0).ToString("x16"))`""
        }
        elseif ($_.Key -like "*_sid") {
            # varbinary SID columns are read as byte arrays; render as S-1-... strings
            $sid = $null
            $event += " $($_.Key)=`""
            try {
                $sid = New-Object System.Security.Principal.SecurityIdentifier($_.Value, 0)
                $event += "$($sid.ToString())`""
            }
            catch {
                # SID bytes missing or malformed; emit an empty value
                $event += "`""
            }
        }
        else {
            # escape backslashes and quotes for KV_MODE = auto_escaped
            $event += " $($_.Key)=`"$([System.Web.HttpUtility]::JavaScriptStringEncode($_.Value.ToString()))`""
        }
    }
    $stdout.WriteLine($event)
}

# Remove the temporary copy.
$file.Delete()
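The output is one line per audit record, along these lines (illustrative values only; the actual fields depend on your audit specification):
2024-01-15T09:30:00.1234567Z action_id="LGIS" succeeded="True" session_server_principal_name="CONTOSO\\svc_app" server_principal_sid="S-1-5-21-3623811015-3361044348-30300820-1013" database_name="master" statement=""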
We can use the invalid_cause and unarchive_cmd props.conf settings to call the PowerShell script. Note that unarchive_cmd strips or escapes quotes depending on the value of unarchive_cmd_start_mode, so we've stored the PowerShell script in a path without spaces to avoid the need for quotes. If PowerShell can't find the path specified in the -File argument, it will exit with code -196608.
Sample props.conf on forwarders, receivers (heavy forwarders or indexers), and search heads:
[source::....sqlaudit]
unarchive_cmd = powershell.exe -ExecutionPolicy RemoteSigned -File C:\Temp\Stream-SqlAudit.ps1
unarchive_cmd_start_mode = direct
sourcetype = preprocess-sqlaudit
NO_BINARY_CHECK = true

[preprocess-sqlaudit]
invalid_cause = archive
is_valid = False
LEARN_MODEL = false

[sqlaudit]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
EVENT_BREAKER_ENABLE = true
EVENT_BREAKER = ([\r\n]+)
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%N%Z
MAX_TIMESTAMP_LOOKAHEAD = 30
KV_MODE = auto_escaped
We can use a batch or monitor stanza to monitor the directory containing .sqlaudit files. Use a batch stanza if the files are moved into the monitored directory atomically; the Splunk Universal Forwarder will then delete the files after they're indexed (a batch example follows the monitor sample below).
Sample inputs.conf:
[monitor://C:\Temp\*.sqlaudit]
index = main
sourcetype = sqlaudit
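If your files do arrive atomically, a batch stanza using the standard Splunk batch input syntax (same example path as above) deletes each file after indexing:
[batch://C:\Temp\*.sqlaudit]
move_policy = sinkhole
index = main
sourcetype = sqlaudit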
The script can be refactored as a scripted input; however, using the archive processor allows Splunk to perform file and change tracking on our behalf.
Hello @tscroggins
Thank you for your follow-up. However, I am not sure we can apply this solution to our environment, since our Server C (the one with the universal forwarder) is not a Windows server. The other servers (types A and B) are not connected to any external networks.
After some internal discussions, a new idea was proposed, and I am not sure whether it could work. @tscroggins, @isoutamo, @scelikok, I would appreciate it if you could share your feedback on it.
The idea is to deploy a heavy forwarder on Server B and a universal forwarder on Server C (the one connected to Splunk Cloud).
Do you think this is a feasible setup?
Best regards,
Yes, your proposal to install Splunk Enterprise on Server B and Splunk Universal Forwarder on Server C will allow you to run queries against Server A, assuming you have connectivity and a database account with appropriate access, and forward the events to Server C and downstream to Splunk Cloud.
Note, however, that sys.fn_get_audit_file does not scale. If you query .sqlaudit files through this function, your SQL Server administrator should retain only the .sqlaudit files necessary for Splunk to execute queries and index events in a timely manner; that is, rotation and retention of live .sqlaudit files should be configured with Splunk and fn_get_audit_file performance in mind. You'll need to test performance in your environment to understand its constraints.
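For illustration, the kind of query such a setup would run against the function looks like this. This is a sketch only: the instance name, audit file path, and column list are placeholders, and Invoke-Sqlcmd ships with the same SqlServer module used above.
Invoke-Sqlcmd -ServerInstance "ServerA" -Query @"
SELECT event_time, action_id, succeeded, server_principal_name, statement
FROM sys.fn_get_audit_file('D:\SqlAudit\*.sqlaudit', DEFAULT, DEFAULT);
"@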
Thank you @tscroggins
This should work on Linux:
Install PowerShell Core.
As the Splunk Universal Forwarder user (splunk or splunkfwd), install the SqlServer module as before:
$ pwsh
PS> Install-Module SqlServer
If Splunk Universal Forwarder runs as root, install the SqlServer module as root.
Copy Stream-SqlAudit.ps1 to an appropriate directory, e.g. $SPLUNK_HOME/bin/scripts. Note the addition of the interpreter directive on the first line.
#!/usr/bin/env pwsh
# Copy stdin (the .sqlaudit stream handed to us by Splunk's archive
# processor) to a temporary file; Read-SqlXEvent needs a seekable stream.
$file = New-TemporaryFile
$output = $file.Open([System.IO.FileMode]::Append, [System.IO.FileAccess]::Write)
$stdin = [System.Console]::OpenStandardInput()
$stdout = [System.Console]::Out
$buffer = New-Object byte[] 16384
[int]$bytes = 0
while (($bytes = $stdin.Read($buffer, 0, $buffer.Length)) -gt 0) {
    $output.Write($buffer, 0, $bytes)
}
$output.Flush()
$output.Close()

# Emit one line per event: an ISO 8601 UTC timestamp followed by
# key="value" pairs. Note $file.FullName rather than a hand-built
# backslash-separated path, which would break on Linux.
Read-SqlXEvent -FileName $file.FullName | %{
    $event = $_.Timestamp.UtcDateTime.ToString("o")
    $_.Fields | %{
        if ($_.Key -eq "permission_bitmask") {
            # bigint columns are read as byte arrays; render as a hex string
            $event += " permission_bitmask=`"0x$([System.BitConverter]::ToInt64($_.Value, 0).ToString("x16"))`""
        }
        elseif ($_.Key -like "*_sid") {
            # varbinary SID columns are read as byte arrays; render as S-1-... strings
            $sid = $null
            $event += " $($_.Key)=`""
            try {
                $sid = New-Object System.Security.Principal.SecurityIdentifier($_.Value, 0)
                $event += "$($sid.ToString())`""
            }
            catch {
                # SID bytes missing or malformed; emit an empty value
                $event += "`""
            }
        }
        else {
            # escape backslashes and quotes for KV_MODE = auto_escaped
            $event += " $($_.Key)=`"$([System.Web.HttpUtility]::JavaScriptStringEncode($_.Value.ToString()))`""
        }
    }
    $stdout.WriteLine($event)
}

# Remove the temporary copy.
$file.Delete()
Make sure the file is executable, e.g.:
$ chmod 0750 $SPLUNK_HOME/bin/scripts/Stream-SqlAudit.ps1
Update props.conf:
[source::....sqlaudit]
unarchive_cmd = $SPLUNK_HOME/bin/scripts/Stream-SqlAudit.ps1
unarchive_cmd_start_mode = direct
sourcetype = preprocess-sqlaudit
NO_BINARY_CHECK = true

[preprocess-sqlaudit]
invalid_cause = archive
is_valid = False
LEARN_MODEL = false

[sqlaudit]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
EVENT_BREAKER_ENABLE = true
EVENT_BREAKER = ([\r\n]+)
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%N%Z
MAX_TIMESTAMP_LOOKAHEAD = 30
KV_MODE = auto_escaped
Update inputs.conf:
[monitor:///tmp/*.sqlaudit]
index = main
sourcetype = sqlaudit
/tmp is just an example. Use whatever file system and path makes the most sense for your deployment. The Splunk Universal Forwarder user must have read and execute access to all directories in the path and read access to the .sqlaudit files.
As before, your temporary directory should have enough free space to accommodate your largest .sqlaudit file. Depending on your Splunk configuration, Splunk Universal Forwarder may process multiple files concurrently. If that's the case, ensure you have enough free space for all temporary files.
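If the default temporary location is too small, note that New-TemporaryFile creates its file under the path returned by [System.IO.Path]::GetTempPath(), which honors the TMPDIR environment variable on Linux. One option is to point TMPDIR at a larger volume in the forwarder's service environment (the path below is only an example):
$ export TMPDIR=/var/tmp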
Finally, let us know how it goes!
The SqlServer module is available for PowerShell Core on various platforms. I can test a solution on Linux x86-64, but I don't have access to a macOS or ARM host. What platform is Server C?
Server C is SUSE 15
Maybe? The prototype is here for anyone to grab. I'd need to find the time and resources for long-term maintenance of an app: development, build and integration, support, etc.
Note that I used System.Web.HttpUtility.JavaScriptStringEncode as a shortcut for encoding/escaping strings. KV_MODE = auto_escaped only handles a few escape sequences. If you prefer, you can simply replace \ and " with \\ and \", respectively, in strings before writing them.
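For example, a drop-in replacement for the JavaScriptStringEncode line, escaping backslashes before quotes so the quote escape isn't double-escaped, would be:
$value = $_.Value.ToString().Replace('\', '\\').Replace('"', '\"')
$event += " $($_.Key)=`"$value`""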
Hi @Ismail_BSA,
Splunk cannot convert or read these binary files directly. Maybe you can install SQL Server on Server C, import these audit files into that SQL Server instance, and query them with DB Connect.