Getting Data In

Forwarding and Indexing backlog

rparagas
Explorer

We experienced communication issues between the forwarders and the Splunk server starting on, say, the 2nd of November. Everything was back online after 3 days (say, 5th Nov), and this resulted in the loss of data on two of our indexes.

The main reasons are:

1) In one stanza we have set ignoreOlderThan = 1d, and the whitelist only watches sample.log files; over those 3 days the files were aged to sample.log.1, sample.log.2, and so on (due to the log4j.xml config).

2) The other stanza monitors a particular directory, but during those 3 days of no forwarding the log files were aged and MOVED to a different directory, and they are now .gz files! This stanza also has the ignoreOlderThan = 1d parameter set.
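For illustration, the two inputs.conf stanzas described above might look roughly like this sketch (the paths, index names, and whitelist pattern are assumptions, not the actual config):

```ini
# Stanza 1: watches only sample.log; rotated sample.log.N files fall
# outside the whitelist, and anything older than 1 day is ignored.
[monitor:///var/log/app]          # hypothetical path
whitelist = sample\.log$
ignoreOlderThan = 1d
index = app_index                 # hypothetical index

# Stanza 2: watches a directory whose files were rotated, moved
# elsewhere, and gzipped during the outage.
[monitor:///var/log/other]        # hypothetical path
ignoreOlderThan = 1d
index = other_index               # hypothetical index
```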

Part of the 2nd of Nov (from midnight until 2am) was indexed, and nothing more after the issue happened.

QUESTIONS for solution:

1) Can I solve this simply by setting the ignoreOlderThan attribute far enough back to cover the days that have passed since the 2nd of Nov? Also, since the log files have aged and now have different file names (sample.log.3 and sample.log.gz, for example), will Splunk still automagically pick up where it left off, ingesting the rest of the 2nd-Nov logs and the days that follow?

2) For number two above, can I solve this by changing the monitor stanza to point to the directory where the aged files were moved? And again, will it pick up where it left off and ingest the rest of the 2nd-Nov logs and the days that follow?

Thank you and looking forward to your responses!

0 Karma
1 Solution

kristian_kolb
Ultra Champion

1 & 2) Setting ignoreOlderThan to a relevant value to include the missing *.log.n files should work fine, but Splunk will not necessarily recognize the files that have been zipped. For those it may be better to unzip them to a temporary directory and set up a [monitor] there, or perhaps do a one-time upload - either through the GUI or the CLI. Which approach to choose depends on the number of files you need to index.
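A minimal sketch of the unzip step (the directory names are made up for the example, and the demo creates its own rotated .gz file so it runs standalone):

```shell
# Stand-in for the real archive dir and the temp dir Splunk will monitor
SRC=$(mktemp -d)
TMP=$(mktemp -d)

# Demo setup: fake a rotated, compressed log file
printf 'event one\nevent two\n' > "$SRC/sample.log.3"
gzip "$SRC/sample.log.3"          # produces sample.log.3.gz

# Unzip each aged file into the temp dir, dropping the .gz suffix
for f in "$SRC"/sample.log.*.gz; do
  gunzip -c "$f" > "$TMP/$(basename "${f%.gz}")"
done

ls "$TMP"                         # -> sample.log.3

# Alternatively, a one-time upload per file via the CLI (index name is
# hypothetical):
#   $SPLUNK_HOME/bin/splunk add oneshot "$TMP/sample.log.3" -index app_index
```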

So unzip first to a temp dir, make a copy of the original [monitor] stanza pointing to the temp dir, set ignoreOlderThan to a higher value, and it should pick up where it left off.
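Assuming the files were unzipped into /tmp/splunk_reindex (a made-up path), the temporary stanza might look like this sketch:

```ini
# Temporary copy of the original [monitor], pointed at the unzip dir.
# ignoreOlderThan is raised so files from 2nd Nov onward are not skipped.
[monitor:///tmp/splunk_reindex]   # hypothetical temp directory
whitelist = sample\.log
ignoreOlderThan = 30d
index = app_index                 # hypothetical index
```

Once the backlog has been indexed, the temporary stanza can be removed.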

You might want to do that one file at a time if you want to preserve the source attribute. Well, the path would be different, but at least the filenames would be *.log and not *.log.n.

/K

0 Karma

wrangler2x
Motivator

Splunk will recognize files that have been zipped. I had ignoreOlderThan = 1d set on one of my inputs. The logrotate program on the system the forwarder was on simply rotated the log out but did not compress it. One day the sysadmin for that system compressed all the old log files, and because they had a new name and a current modtime, the forwarder just sucked them all up!

0 Karma

rparagas
Explorer

Thanks for your help! We are now considering the one-time upload as you've recommended. Let's see how we go!

cheers,
rex

0 Karma