Splunk Search

Parsing ldap access and error logs

croomes
Engager

Hi all, just curious if anyone can give me a head-start. I'd like to use Splunk to parse Sun's Directory Server access and error logs. I was hoping there would be a pretrained sourcetype or app but haven't found anything.

I can get some very useful info out by default, but I'm also interested in tracking the heaviest clients by connection count, ops per connection, and total elapsed time for ops (etime). When a result includes notes="U", I'd like to know the search's base, filter, and client IP.

The access log entries for a single connection look like this:

[29/Jan/2011:06:35:03 +0000] conn=13624327 op=-1 msgId=-1 - fd=49 slot=49 LDAP connection from 10.0.0.2 to 10.0.0.1
[29/Jan/2011:06:35:03 +0000] conn=13624327 op=0 msgId=1 - SRCH base="" scope=0 filter="(objectClass=*)" attrs=ALL
[29/Jan/2011:06:35:03 +0000] conn=13624327 op=0 msgId=1 - RESULT err=0 tag=101 nentries=1 etime=0
[29/Jan/2011:06:35:03 +0000] conn=13624327 op=1 msgId=2 - BIND dn="" method=128 version=2
[29/Jan/2011:06:35:03 +0000] conn=13624327 op=1 msgId=2 - RESULT err=0 tag=97 nentries=0 etime=0 dn=""
[29/Jan/2011:06:35:03 +0000] conn=13624327 op=2 msgId=3 - SRCH base="ou=applications,dc=company,dc=com" scope=2 filter="(&(objectClass=application)(systemName=app1))" attrs=ALL
[29/Jan/2011:06:35:03 +0000] conn=13624327 op=2 msgId=3 - RESULT err=0 tag=101 nentries=1 etime=0
<more ops snipped>
[29/Jan/2011:06:35:30 +0000] conn=13624327 op=21 msgId=22 - SRCH base="ou=applications,dc=company,dc=com" scope=0 filter="(objectClass=*)" attrs="nodeName description host instanceName"
[29/Jan/2011:06:35:30 +0000] conn=13624327 op=21 msgId=22 - RESULT err=0 tag=101 nentries=1889 etime=23 notes=U
[29/Jan/2011:06:35:30 +0000] conn=13624327 op=-1 msgId=-1 - closing - B1
[29/Jan/2011:06:35:30 +0000] conn=13624327 op=-1 msgId=-1 - closed.
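
The furthest I've got on my own is a rough set of rex extractions along these lines (the field names are just my own choices, and I haven't tested this thoroughly):

YourSearch
| rex "conn=(?<conn>\d+) op=(?<op>-?\d+) msgId=(?<msgId>-?\d+)"
| rex "SRCH base=\"(?<base>[^\"]*)\" scope=(?<scope>\d+) filter=\"(?<filter>[^\"]*)\""
| rex "RESULT err=(?<err>\d+) tag=(?<tag>\d+) nentries=(?<nentries>\d+) etime=(?<etime>\d+)(?: notes=(?<notes>\w+))?"
| rex "LDAP connection from (?<clientip>[\d.]+) to (?<serverip>[\d.]+)"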

Can anyone give me any pointers?

Thanks in advance! Simon

tskinnerivsec
Contributor

If you are looking to parse the logs, a good reference for the Directory Server access log format is:

https://www.centos.org/docs/5/html/CDS/cli/8.0/Configuration_Command_File_Reference-Access_Log_and_C...

I'm currently using the information on these pages to help me put together a Technology Add-on for OpenLDAP, since I didn't see one out there that works with the log data on a project I'm currently on.


JensT
Communicator

@tskinnerivsec: Did you continue on this?


David
Splunk Employee
Splunk Employee

I think the search operator that will let you get all those excellent details is transaction. Transaction essentially groups together all the events that share a value of a particular field. So you can probably do interesting searches with a base of:

YourSearch | transaction conn maxpause=5m

which will bundle all the events for a particular conn and give you duration and eventcount fields.
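
For example, ranking the heaviest clients might look like the following (just a sketch; I'm assuming clientip isn't already extracted, so I pull it from the connection-open line with rex):

YourSearch | transaction conn maxpause=5m | rex "LDAP connection from (?<clientip>[\d.]+)" | stats count as connections, avg(eventcount) as avg_ops, avg(duration) as avg_secs by clientip | sort - connections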

I'm guessing that if you have a large number of overlapping events, most short and some very long, transaction is probably going to slow down your search (and consume a lot of memory). You can control some of that with parameters such as maxpause (if you're not going to have an hour of silence before the same connection continues being used), maxspan, etc. But some events you'll be able to get useful information from without having to use transaction. For example, to find out how many ops per connection you could run:

YourSearch | stats max(op) by conn

One other potentially useful thing you can do with transaction is to run a transaction on manufactured variables, such as the following:

YourSearch | eval OpID = conn."-".op | transaction OpID | search NOT err=0

I think that might be a good way to get your notes="U" search as well.
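
For instance, something like this (a sketch; base, filter and notes assume you've set up field extractions for the SRCH and RESULT lines, and the dot is SPL's string-concatenation operator):

YourSearch | eval OpID = conn."-".op | transaction OpID | search notes=U | table _time, conn, base, filter, nentries, etime

That pairs each SRCH with its RESULT. Note that the client IP lives on the connection-open event (op=-1), so to attach it you'd have to correlate on conn separately, e.g. with a second transaction on conn.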

Note that transaction has a default max of 500 lines per transaction, but you can specify a higher limit with maxevents=1000 or what have you. I haven't tried it myself, but you might get into the territory of having to play with limits.conf if you have some very long-lasting connections.
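
If you do need to raise those limits, I believe the relevant stanza is [transactions] in limits.conf, along these lines (setting names from memory; double-check them against the docs for your version):

[transactions]
maxopentxn = 10000
maxopenevents = 200000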

jamezpolley
Engager

Fantastic. This answer has made me suddenly decide that Splunk was worth the cost.
