Installation

Upgrade from 4.1.4 to 4.1.7: new error message "Failed to rm dir"

Explorer

After upgrading from Splunk 4.1.4 to Splunk 4.1.7 today, I was checking the health of the upgrade and noticed the following error being logged by the search 'index="_internal" source="*splunkd.log" log_level="error"':

02-11-2011 17:14:47.799 ERROR Timeliner - Failed to rm dir /splunk/splunk/var/run/splunk/dispatch/searchparsetmp_2122360156/buckets: No such file or directory

Looking back over the last 30 days, "Timeliner" has only started appearing in those logs since the upgrade and it has logged this error (for a variety of different directories) over 100 times in the 30 minutes since the upgrade.

Should I be concerned?

1 Solution

Communicator

Hi,

This is related to SPL-35722. It is a harmless message caused by a failure to check whether the directory actually existed before trying to remove it.
It has been fixed in version 4.1.8; in fact, you can find an entry on the release notes page: ReleaseNotes 4.1.8

  • Annoying but harmless "ERROR Timeliner - Failed to rm dir" (SPL-35722)

-- Jens


Contributor

Has there ever been any official insight from Splunk on this? I'm still running 4.1.7 and get these errors pretty consistently throughout the day. Permissions on the directory tree are not an issue.

0 Karma

Path Finder

Same problem on a clean install of 4.1.7. Splunk is running under a non-root account called splunk, but I ran chown -R splunk /opt/splunk...blah blah blah.

0 Karma

Engager

I am also getting a lot of these errors in my splunkd.log since upgrading from 4.1.6 to 4.1.7. Our Splunk instance is running as root.

0 Karma

New Member

I am having the same problem. I am not sure, but the errors seem to have begun after I experienced a power outage.

0 Karma

Splunk Employee

Are you running as the same user, or have some files been created with a different owner from the user you are running as? If so, you can try running (as root) "chown -R splunkowner /opt/splunk" (or whatever matches your setup) over the directory tree(s) where the files are located to fix this. Stop Splunk, change the owner, then start Splunk back up.
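A minimal sketch of that sequence, assuming a default install path of /opt/splunk and a Splunk user named splunkowner (both are placeholders; substitute your own values):

```shell
#!/bin/sh
# Sketch only: stop Splunk, fix ownership, restart (run as root).
# SPLUNK_HOME and SPLUNK_USER are assumptions -- adjust for your install.
SPLUNK_HOME="${SPLUNK_HOME:-/opt/splunk}"
SPLUNK_USER="${SPLUNK_USER:-splunkowner}"

if [ -x "$SPLUNK_HOME/bin/splunk" ]; then
  "$SPLUNK_HOME/bin/splunk" stop
  chown -R "$SPLUNK_USER" "$SPLUNK_HOME"
  "$SPLUNK_HOME/bin/splunk" start
fi
```

Stopping Splunk first matters: a running splunkd keeps writing new files under the old owner, so a live chown can leave a mixed-ownership tree.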

0 Karma

Explorer

As far as I can tell, the directories don't exist on the system.

Are you suggesting that I should file a bug, or is that something you can do? (I'm happy to do it, but don't know how).

0 Karma

Splunk Employee

I guess I would simply file this as a bug. Alternatively, if you're comfortable deleting some of the affected folders, you can just go ahead and do that. These folders (in var/run/splunk/dispatch) hold saved and temporary Splunk search results, so if you don't need the job results, you can delete them.
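For example, a cautious way to see which dispatch directories are candidates for deletion (the /opt/splunk path is an assumption; review the list before removing anything):

```shell
#!/bin/sh
# List saved/temporary search job directories under the dispatch area.
# /opt/splunk is an assumed install path -- adjust for your system.
SPLUNK_HOME="${SPLUNK_HOME:-/opt/splunk}"
DISPATCH="$SPLUNK_HOME/var/run/splunk/dispatch"

if [ -d "$DISPATCH" ]; then
  # One line per job directory; remove individual ones with rm -rf
  # only after reviewing the list.
  find "$DISPATCH" -mindepth 1 -maxdepth 1 -type d -print
fi
```

Listing first and deleting by hand avoids accidentally removing a saved job whose results someone still needs.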

0 Karma

Explorer

We're running Splunk as root, but I did have a look and noticed that a number of files were owned by uid 508 (though none of the directories), so I figured it was probably good practice to correct those anyway. I did as you suggested: stopped Splunk, chowned the directory tree recursively, and started Splunk back up.

It doesn't appear to have helped; the same errors are still being logged (though I suspect you didn't expect it to help when running as root anyway).

Thanks for the suggestion.

0 Karma

Path Finder

Me too... recently upgraded to 4.1.7

Drat, the splunk-launch.conf had SPLUNK_HOME set wrong 😞 That would toss a wrench in the works. I'm surprised anything worked. It magically seems to work WAY better now in general 😉

The errors remain with the paths fixed up. The folders it's complaining about are not there to delete. How do I reset things so Splunk forgets about these old temp files? The paths seem to be buried in the actual database.

0 Karma

Splunk Employee

Generally you should not need to set SPLUNK_HOME at all. Splunk can figure out the correct home from the location it is launched from. I never set it in the environment, and recommend you do not either.
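For reference, the setting being discussed lives in $SPLUNK_HOME/etc/splunk-launch.conf; leaving it commented out (a sketch of the default, not a verbatim copy of any shipped file) lets Splunk derive the value from where it was launched:

```
# $SPLUNK_HOME/etc/splunk-launch.conf
# Leave SPLUNK_HOME unset (commented) so Splunk derives it from its own
# install location; only set it if you have a specific reason to override.
# SPLUNK_HOME=/opt/splunk
```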

Contributor

I'm seeing the same behavior. Also on 4.1.7 now.

0 Karma