
Splunk and Shibboleth log analysis

jimennis
Explorer

Has anyone configured Splunk to read the audit logs from Shibboleth in order to summarize the sources of incoming authentication requests? The log format seems unique to Shibboleth.


jsnyderlmco
Engager

link text
The Shibboleth wiki page has information that will be useful for parsing the logs.

Fields are extracted at various points through the life of a transaction so that adequate detail can be exposed about the request, the processing of the request, and the response. These extraction points are associated with collections of field extraction beans that do the actual work to pull data out of the state of the transaction and store it for output.

Built-In Fields
The fields that are supported out of the box are as follows (note that not every field is always populated; it depends on the timing of errors and the specific transaction being audited). A sketch of how these labels might map onto a Splunk field extraction follows the list.

Generic Fields
ST - Timestamp for start of flow
T - Timestamp event is recorded
e - WebFlow Event
URL - URL
URI - URI
DEST - Destination URL of outgoing msg
s - IdP session ID
AF - Authentication flow ID
SSO - SSO flag
a - Client address
UA - User agent string
P - Profile ID
u - Username
HASHEDu - Hashed username
uu - Impersonating username
attr - Attribute(s)
ROP - Requested authentication operator
RPRIN - Requested authentication principals

SAML Fields
SP - Service provider name
IDP - Identity provider name
p - Protocol
b - Inbound binding
bb - Outbound binding
n - NameID value
f - NameID format
SPQ - NameID SPNameQualifier
pf - NameIDPolicy required format
PSPQ - NameIDPolicy required SPNameQualifier
i - Assertion ID
d - Assertion timestamp
I - Inbound message ID
D - Inbound message timestamp
II - InResponseTo
III - Outbound message ID
DD - Outbound message timestamp
t - AuthenticationInstant
x - SessionIndex
ac - AuthenticationContext
S - Status code
SS - Sub-status code
SM - Status message
pasv - IsPassive
fauth - ForceAuthn
XX - Signed inbound messages
X - Encrypted assertions
XA - Encryption algorithm
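
Those labels only tell you what can appear in the audit log; the actual column order comes from the audit format string configured on the IdP side (conf/audit.xml in v3+), so any Splunk extraction has to mirror it. A minimal sketch, assuming a hypothetical pipe-delimited format of something like %T|%SP|%P|%u|%a|%b|%bb|%S (the stanza names, field names, and ordering here are all illustrative; adjust FIELDS to whatever your deployment actually emits):

transforms.conf:

[parse_shib_audit]
DELIMS = "|"
# Field names are illustrative; order must match the IdP audit format string
FIELDS = auditTime, serviceProvider, profile, username, clientAddress, inboundBinding, outboundBinding, statusCode

props.conf:

[shib_audit]
SHOULD_LINEMERGE = false
REPORT-parse_shib_audit = parse_shib_audit

With something like that in place, summarizing where the authentication requests come from is roughly: sourcetype=shib_audit | stats count by serviceProvider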


wryanthomas
Contributor

The link didn't come through. What wiki page? Maybe try presenting the URL in a way that the site's algorithm won't block it.

I'm not seeing any link to a wiki from the app on Splunkbase, and I'm not seeing anything on GitHub except a Shibboleth TA that is 4+ years old (i.e., it appears not to be the one that is on Splunkbase by SplunkWorks).

splunkbase.splunk.com/app/4389/



jimennis
Explorer

Here is a sanitized record (all one line despite the line wrapping from the cut/paste):

20131108T045952Z|urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect|_f2a07ca502e5e77114ee1e34e802145c|https://my.xxx.xxx/shibboleth|urn:mace:shibboleth:2.0:profiles:saml2:sso|https://idp-xxxx.xx.xxxx.xx...
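
If Splunk doesn't pick up that leading timestamp on its own, a hedged props.conf addition for it (assuming every record starts with the compact UTC stamp shown above, e.g. 20131108T045952Z):

[idp_audit]
SHOULD_LINEMERGE = false
# First pipe-delimited field is the event time in compact UTC form
TIME_PREFIX = ^
TIME_FORMAT = %Y%m%dT%H%M%SZ
MAX_TIMESTAMP_LOOKAHEAD = 16
TZ = UTC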


jimennis
Explorer

Hello

Thanks for the tip. I am working on getting a sanitized record for posting. Shibboleth uses the '|' character as its field delimiter, and the first field is a timestamp. As soon as a colleague checks my sanitizing, I will post the record for reference.


lguinn2
Legend

I have not, but it looks like this would be pretty simple. However, it does appear that the log files (especially idp-process.log) can be configured, so you would probably have to tune this a bit. Also, the docs I read did not indicate that the log events would have a timestamp, which seems a curious omission. If you could edit your question to supply a few lines of the log files (sanitized, of course), the community could be of more help.

In props.conf, I would define some new sourcetypes as follows:

[idp_access]
REPORT-parse_idp_access = parse_idp_access
SHOULD_LINEMERGE = false

[idp_audit]
REPORT-parse_idp_audit = parse_idp_audit
SHOULD_LINEMERGE = false

[idp_process]
# Capture the level, the bracketed code, and the pipe-delimited timestamp, client IP, component, and message
EXTRACT-idp_process1 = (?<loggingLevel>TRACE|DEBUG|INFO|WARN|ERROR)\s\[(?<errorCode>\S+?)\]\s-\s*(?<timestamp>\S+?)\|(?<ip>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})\|(?<component>\S+?)\|(?<message>.*)
SHOULD_LINEMERGE = false

In transforms.conf:

[parse_idp_access]
DELIMS = "|"
FIELDS = requestTime, remoteHost, serverHost, serverPort, requestPath

[parse_idp_audit]
DELIMS = "|"
FIELDS = auditEventTime, requestBinding, requestId, relyingPartyId, messageProfileId, assertingPartyId, responseBinding, responseId, principalName, authNMethod, releasedAttributeID, nameIdentifier, assertionID

If you use the sourcetypes in your inputs.conf, you should have a start at this. Definitely needs some testing though.
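
For completeness, an inputs.conf sketch tying those sourcetypes to the log files (the /opt/shibboleth-idp/logs path is just an assumption; point the monitors at wherever your IdP actually writes its logs):

[monitor:///opt/shibboleth-idp/logs/idp-access*.log]
sourcetype = idp_access
disabled = false

[monitor:///opt/shibboleth-idp/logs/idp-audit*.log]
sourcetype = idp_audit
disabled = false

[monitor:///opt/shibboleth-idp/logs/idp-process*.log]
sourcetype = idp_process
disabled = false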


jwalzerpitt
Influencer

Lisa,

I'm testing the Shibboleth log files on one of our Splunk DEV servers, and when I go to add data (uploading files from my computer) I'm not seeing the sourcetype listed, even though I added the following to these two conf files:

• transforms.conf (/opt/splunk/etc/system/local/transforms.conf)

[parse_idp_access]
DELIMS = "|"
FIELDS = requestTime, remoteHost, serverHost, serverPort, requestPath

[parse_idp_audit]
DELIMS = "|"
FIELDS = auditEventTime, requestBinding, requestId, relyingPartyId, messageProfileId, assertingPartyId, responseBinding, responseId, principalName, authNMethod, releasedAttributeID, nameIdentifier, assertionID

• props.conf (/opt/splunk/etc/system/local/props.conf)

[idp_access]
REPORT-parse_idp_access = parse_idp_access
SHOULD_LINEMERGE = false

[idp_audit]
REPORT-parse_idp_audit = parse_idp_audit
SHOULD_LINEMERGE = false

[idp_process]
EXTRACT-idp_process1 = (?<loggingLevel>TRACE|DEBUG|INFO|WARN|ERROR)\s\[(?<errorCode>\S+?)\]\s-\s*(?<timestamp>\S+?)\|(?<ip>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})\|(?<component>\S+?)\|(?<message>.*)
SHOULD_LINEMERGE = false

Am I not using the right transforms/props conf files?

Thx
