Getting Data In

Why is our Splunk 6.3.2 forwarder locking itself out of a catalina.out archive file every morning?

New Member

Every morning the Splunk forwarder on our servers locks itself out of a rotated log archive and then consumes a significant amount of CPU retrying the read over and over. Moving the file out of Splunk's reach resolves the problem.

Splunk Forwarder Version
6.3.2

lsof output showing splunkd holding the file open

splunkd 19690 root 53r REG 202,1 7718 59481 /var/log/tomcat/catalina.out-20160217.gz
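
(For context, output like the line above can be gathered with ordinary lsof calls against the forwarder's splunkd process; the commands below are a sketch of that check, using the PID 19690 from the output above rather than anything specific to one setup.)

# Sketch of the diagnostic commands; 19690 is the splunkd PID seen in the lsof line above.
pgrep -f splunkd                                   # find the forwarder's splunkd PID(s)
lsof -p 19690 | grep catalina                      # list files that PID holds open, filtered to the archive
lsof /var/log/tomcat/catalina.out-20160217.gz      # or ask which processes hold the file itself open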

Repeated warnings from splunkd.log

02-18-2016 13:14:57.344 -0800 INFO  ArchiveProcessor - Handling file=/var/log/tomcat/catalina.out-20160217.gz
02-18-2016 13:14:57.344 -0800 INFO  ArchiveProcessor - reading path=/var/log/tomcat/catalina.out-20160217.gz (seek=0 len=7718)
02-18-2016 13:14:57.444 -0800 WARN  ArchiveProcessor -   Could not acesses archieve=/var/log/tomcat/catalina.out-20160217.gz, because:retrieve: key=0x418de5d67f9dc883 is already locked with state=0x7fbc6259f800
02-18-2016 13:14:58.445 -0800 INFO  ArchiveProcessor - Handling file=/var/log/tomcat/catalina.out-20160217.gz
02-18-2016 13:14:58.445 -0800 INFO  ArchiveProcessor - reading path=/var/log/tomcat/catalina.out-20160217.gz (seek=0 len=7718)
02-18-2016 13:14:58.543 -0800 WARN  ArchiveProcessor -   Could not acesses archieve=/var/log/tomcat/catalina.out-20160217.gz, because:retrieve: key=0x418de5d67f9dc883 is already locked with state=0x7fbc6259f800
02-18-2016 13:14:59.544 -0800 INFO  ArchiveProcessor - Handling file=/var/log/tomcat/catalina.out-20160217.gz
02-18-2016 13:14:59.544 -0800 INFO  ArchiveProcessor - reading path=/var/log/tomcat/catalina.out-20160217.gz (seek=0 len=7718)
02-18-2016 13:14:59.645 -0800 WARN  ArchiveProcessor -   Could not acesses archieve=/var/log/tomcat/catalina.out-20160217.gz, because:retrieve: key=0x418de5d67f9dc883 is already locked with state=0x7fbc6259f800
02-18-2016 13:15:00.646 -0800 INFO  ArchiveProcessor - Handling file=/var/log/tomcat/catalina.out-20160217.gz
02-18-2016 13:15:00.646 -0800 INFO  ArchiveProcessor - reading path=/var/log/tomcat/catalina.out-20160217.gz (seek=0 len=7718)
02-18-2016 13:15:00.748 -0800 WARN  ArchiveProcessor -   Could not acesses archieve=/var/log/tomcat/catalina.out-20160217.gz, because:retrieve: key=0x418de5d67f9dc883 is already locked with state=0x7fbc6259f800
02-18-2016 13:15:01.748 -0800 INFO  ArchiveProcessor - Handling file=/var/log/tomcat/catalina.out-20160217.gz
02-18-2016 13:15:01.749 -0800 INFO  ArchiveProcessor - reading path=/var/log/tomcat/catalina.out-20160217.gz (seek=0 len=7718)
02-18-2016 13:15:01.847 -0800 WARN  ArchiveProcessor -   Could not acesses archieve=/var/log/tomcat/catalina.out-20160217.gz, because:retrieve: key=0x418de5d67f9dc883 is already locked with state=0x7fbc6259f800
02-18-2016 13:15:02.848 -0800 INFO  ArchiveProcessor - Handling file=/var/log/tomcat/catalina.out-20160217.gz
02-18-2016 13:15:02.848 -0800 INFO  ArchiveProcessor - reading path=/var/log/tomcat/catalina.out-20160217.gz (seek=0 len=7718)
02-18-2016 13:15:02.945 -0800 WARN  ArchiveProcessor -   Could not acesses archieve=/var/log/tomcat/catalina.out-20160217.gz, because:retrieve: key=0x418de5d67f9dc883 is already locked with state=0x7fbc6259f800
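
The volume of these retries is easy to quantify straight from the forwarder's own log; a minimal check, assuming the default universal forwarder install path (adjust for your layout):

# Count occurrences of the lock warning; the path assumes a default /opt/splunkforwarder install.
grep -c "is already locked with state=" /opt/splunkforwarder/var/log/splunk/splunkd.log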

Re: Why is our Splunk 6.3.2 forwarder locking itself out of a catalina.out archive file every morning?

Splunk Employee

This is a known issue that is fixed in 6.3.3.

SPL-108219: File deadlock. Can cause repeated entries in splunkd.log that contain "is already locked with state=". In certain conditions, the deadlock can result in elevated CPU usage.

6.3.3 release notes: http://docs.splunk.com/Documentation/Splunk/6.3.3/ReleaseNotes/6.3.3#Unsorted_issues. Please upgrade and see if that resolves the issue.
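
If an immediate upgrade isn't possible, a common stopgap (not part of the official fix) is to keep the forwarder away from the rotated archives entirely, since the live catalina.out is usually what you want indexed anyway. A minimal inputs.conf sketch, assuming your monitor stanza covers /var/log/tomcat; adjust the path and regex to match your actual input:

# inputs.conf on the forwarder -- the stanza path below is an assumption; match it to your existing monitor input
[monitor:///var/log/tomcat]
# Skip rotated, compressed copies so the ArchiveProcessor never touches them
blacklist = \.gz$

Restart the forwarder after the change ($SPLUNK_HOME/bin/splunk restart) for it to take effect.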

Jacob
Sr. Technical Support Engineer