How do I collect SharePoint audit data using DBConnect

dperre_splunk
Splunk Employee

Hi There,

I am looking for a way to get SharePoint audit data into Splunk via DBConnect. Does anyone have a working script that I can use?

1 Solution

dperre_splunk
Splunk Employee

Here is a script that may work for you :).

Notes:
Change the host and index values to match your environment. Depending on how busy your SharePoint server is, max_rows may not be enough, so keep an eye on it.

[OuterSharepointDB]
connection = SharepointDB
host = SharepointDBHostname
index = sharepoint
interval = 58
max_rows = 1000
mode = batch
query = SELECT
SiteId,
ItemId,
ItemType,
UserId,
AppPrincipalId,
MachineName,
MachineIp,
DocLocation,
LocationType,
Occurred,
Event,
EventName,
EventSource,
SourceName,
tp_SiteID,
tp_ID,
tp_DomainGroup,
tp_Deleted,
tp_SiteAdmin,
tp_IsActive,
tp_Login,
tp_Title,
tp_Email,
tp_Notes,
tp_Token,
tp_ExternalTokenLastUpdated,
PortalURL,
PortalName
FROM "WSS_Content_SCP"."dbo"."AuditData"
WITH (NOLOCK)
LEFT JOIN "WSS_Content_SCP"."dbo"."UserInfo"
ON "WSS_Content_SCP"."dbo"."AuditData"."UserId" = "WSS_Content_SCP"."dbo"."UserInfo"."tp_ID"
LEFT JOIN "WSS_Content_SCP"."dbo"."AllSites"
ON "WSS_Content_SCP"."dbo"."AuditData"."SiteId" = "WSS_Content_SCP"."dbo"."AllSites"."Id"
WHERE "WSS_Content_SCP"."dbo"."AuditData"."Occurred" > DATEADD(minute, -60, GETDATE() )
AND "WSS_Content_SCP"."dbo"."AuditData"."UserId" = "WSS_Content_SCP"."dbo"."UserInfo"."tp_ID"
AND "WSS_Content_SCP"."dbo"."AuditData"."SiteId" = "WSS_Content_SCP"."dbo"."UserInfo"."tp_SiteID"
AND UserId!=1
AND tp_Title NOT LIKE 'System Account'
AND tp_Title NOT LIKE 'NT AUTHORITY\LOCAL SERVICE'
ORDER BY "WSS_Content_SCP"."dbo"."AuditData"."Occurred" DESC


noelflorendo
Observer

@dperre_splunk Hi mate, can you elaborate on where I should put your script to connect my SharePoint data to Splunk? Thanks!


dperre_splunk
Splunk Employee

@noelflorendo This will need to run on a heavy forwarder with DB Connect installed 🙂
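
In case it helps, assuming DB Connect 3.x, the pieces typically end up under the DB Connect app on that heavy forwarder (treat the paths as a sketch and check your version's docs; you can also create both through the DB Connect UI instead of editing .conf files directly):

$SPLUNK_HOME/etc/apps/splunk_app_db_connect/local/db_connections.conf   <- defines the SharepointDB connection referenced by the input
$SPLUNK_HOME/etc/apps/splunk_app_db_connect/local/db_inputs.conf        <- holds the [OuterSharepointDB] stanza from my answer above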
