
How do I configure a parameter for TailingProcessor to automatically retry reading a file after it fails?

Path Finder

It took a while to copy the large logfile over the slow network, and it looks like the TailingProcessor gave up after 2 tries:


09-06-2014 15:17:03.607 -0700 WARN FileClassifierManager - Unable to open 'D:\Logs\syslog2014-09-05'.
09-06-2014 15:17:03.607 -0700 WARN FileClassifierManager - The file 'D:\Logs\syslog2014-09-05' is invalid. Reason: cannot_read
09-06-2014 15:17:03.607 -0700 INFO TailingProcessor - Ignoring file 'D:\Logs\syslog2014-09-05' due to: cannot_read
09-06-2014 15:22:47.707 -0700 WARN FileClassifierManager - Unable to open 'D:\Logs\syslog2014-09-05'.
09-06-2014 15:22:47.707 -0700 WARN FileClassifierManager - The file 'D:\Logs\syslog2014-09-05' is invalid. Reason: cannot_read
09-06-2014 15:22:47.707 -0700 INFO TailingProcessor - Ignoring file 'D:\Logs\syslog2014-09-05' due to: cannot_read

Any way to make it automatically retry a couple more times (via some configuration parameter somewhere)?
Using Splunk v6.1.3 for Windows.


Legend

Before asking Splunk to retry more times, have you checked the points below? (A quick way to test them is shown after the list.)

  1. Does the file exist?
  2. Splunk runs under a particular user account. Is the file readable by that user?
  3. Does another process have the file locked so that Splunk cannot read it?
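One way to verify all three from outside Splunk is a small script along these lines (a sketch, assumed to be run as the same account the splunkd service uses; the path is the one from your log). It simply reports whether the file exists and can be opened for reading at all:

# Quick sanity check for a file Splunk reports as cannot_read.
# Run as the same user the splunkd service runs as.
import os
import sys

path = r"D:\Logs\syslog2014-09-05"  # the file from the question

# 1. Does the file exist?
if not os.path.isfile(path):
    sys.exit(path + " does not exist")

# 2 and 3. Can this user open it, or does another process (for example
# the copy tool) hold an exclusive lock on it?
try:
    with open(path, "rb") as f:
        f.read(1)
except PermissionError as err:
    sys.exit("Not readable by this user, or locked exclusively: " + str(err))
except OSError as err:
    sys.exit("Could not open the file: " + str(err))

print("File exists and is readable by this user.")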

Legend

Is there any way to set up the copy process so that it doesn't lock the file?

Is it possible to forward the file from its original location?
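For example (a sketch only; the source path here is a placeholder, not from this thread), the copy job could write to a temporary name and only rename the file into the monitored folder once the copy has finished, so Splunk never sees a half-copied, locked file:

# Hypothetical copy job: copy under a temporary name, then rename.
import os
import shutil

src = r"\\remotehost\logs\syslog2014-09-05"   # placeholder source share
dest_dir = r"D:\Logs"                         # the monitored directory
tmp_path = os.path.join(dest_dir, "syslog2014-09-05.tmp")
final_path = os.path.join(dest_dir, "syslog2014-09-05")

# The slow copy happens under the .tmp name.
shutil.copyfile(src, tmp_path)

# The rename is a single metadata operation on the same volume, so the
# file only appears under its final name once it is complete and unlocked.
os.replace(tmp_path, final_path)

If the input monitors the whole directory, the temporary name would also need to be excluded (for example with a blacklist on the monitor input), otherwise Splunk may still try to read the partial file.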


Path Finder
  1. Yes.
  2. Yes.
  3. Yes, during the slow copy process.

If the file is small and/or the copy process is fast, Splunk indexes the file fine.
