Getting Data In

SharePoint ULS logs

Bulluk
Path Finder

Has anyone indexed SharePoint ULS logs? I've edited my inputs.conf to monitor the log directory, but I end up with multiple sourcetypes in Splunk that look like Server-1, Server-2, Server-3, etc. What sourcetype should I define in inputs.conf? Below is what I have right now:

[default]
host = Server

[script://$SPLUNK_HOME\bin\scripts\splunk-perfmon.path]
disabled = 0

[monitor://C:\inetpub\logs\logfiles]
sourcetype = iis

[monitor://C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\LOGS]
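Would it be as simple as pinning an explicit sourcetype on that last stanza, something like the below? (The name uls is just a guess on my part, not something I've seen documented.)

[monitor://C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\LOGS]
sourcetype = uls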

Thanks in advance

0 Karma
1 Solution

Bulluk
Path Finder

It took a lot of reading but I got there in the end. If anyone else has this requirement, this is what I did:

On the forwarder

**inputs.conf**

[default]
host = UKMLWSPW102

[monitor://c:\PathToULSLogs]
sourcetype = uls

**props.conf**

[uls]
CHECK_FOR_HEADER = False

On the indexer

**props.conf**

[uls]
TIME_PREFIX = :\s
MAX_TIMESTAMP_LOOKAHEAD = 128
TIME_FORMAT = %m-%d-%Y %H:%M:%S.%3N
REPORT-uls = uls
TZ = GMT

**transforms.conf**
[uls]
FIELDS = "Timestamp", "Process", "TID", "Area", "Category", "EventID", "Level", "Message", "Correlation"
DELIMS = "\t"

So what this does is override the default Splunk behaviour of checking the log file header on the forwarder, meaning that the indexer receives the data "untouched". On the indexer, props.conf sets the timestamp and timezone, while transforms.conf tells Splunk that the file is tab-delimited and gives it the names of the columns.
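As a rough illustration (the event below is invented, not copied from my logs), each ULS entry is a single tab-separated line whose columns line up with the FIELDS list above:

03-13-2012 14:22:54.78	OWSTIMER.EXE (0x0FA8)	0x1C88	SharePoint Foundation	Timer	6398	Medium	Example message text	8f2be9b1-0000-0000-0000-000000000000

Once the extractions are in place you can search the named fields directly, e.g.

sourcetype=uls Level=Medium | stats count by Category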


0 Karma

Bulluk
Path Finder

The best I can offer is to use the transaction command with the TID or Correlation field and then search within that. Not really what you're after, but I'm a novice myself.
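Something along these lines, as an untested sketch (Correlation is one of the fields from the extractions above, and the quoted text is just a placeholder):

sourcetype=uls | transaction Correlation | search "text you are looking for"

The transaction command stitches all events sharing the same Correlation value into one combined event, so the trailing search runs against the whole group.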

0 Karma

neilamoran
Explorer

Did you manage to get around the issues with multiline events in ULS logs? I'm kind of halfway to resolving that one, but it breaks the field extractions.

See my post at http://splunk-base.splunk.com/answers/28974/multiline-event-query-sharepoint-logs for more details.
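One approach that might be worth trying is forcing the event boundaries in props.conf on the indexer, something like this untested sketch (the regex just assumes every new event starts with a date):

[uls]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)\d{2}[-/]\d{2}[-/]\d{4}\s

That should keep a multiline message attached to its event, though, as mentioned above, it may still clash with the delimiter-based field extraction.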

0 Karma