I have installed the Universal Forwarder to send log files to my Splunk Storm project.
My question is: how frequently does the forwarder check whether the log file has been updated? Does the Universal Forwarder monitor the files continuously, or does it check at some interval? Can I configure the interval?
Let's say the original filename is /var/log/mylog.log and the rotated files are /var/log/mylog.log.1, /var/log/mylog.log.2, /var/log/mylog.log.3.gz, /var/log/mylog.log.4.gz, ...
And let's imagine that Splunk has reached line 99 of the original file, then a new line is added and the file rotates to mylog.log.1.
The solution is to monitor all the files:
[monitor:///var/log/mylog*]
Splunk checks the CRC of the first 256 bytes and detects that a file is a rotated version of a file that was already indexed.
It will then continue where it left off (at line 99) while indexing the new file. Splunk can also read compressed files.
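As an illustration, a minimal inputs.conf sketch for this (the sourcetype name is just a placeholder; initCrcLength is shown commented out at its default, which is the number of bytes used for the CRC check described above):

[monitor:///var/log/mylog*]
sourcetype = mylog
# initCrcLength = 256   <- default; bytes hashed to recognize a rotated file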
In your case, if the files are backed up off the server and rotate very often, you may want to keep the first few rotated versions on disk.
See the details at http://docs.splunk.com/Documentation/Splunk/5.0.1/Data/HowLogFileRotationIsHandled
Please click the checkmark on the left to accept this answer; it will make it easier for other people to find relevant answers.
That's exactly what I wanted, and your answer solved it.
Thanks!
I want to monitor an application log file which is continuously being updated. Once the size of that file reaches its maximum limit, it gets copied to a backup file and is replaced by a new blank file. This happens almost instantly. I just want to make sure that all the content stored in the backup file is read by the forwarder before it is backed up.
What's the best way to ensure this?
The Splunk forwarders actively monitor the files within the scope of the monitored paths; there is no interval to configure.
If the number of files is very large, the forwarder will have to cycle between them.
What is your goal exactly: to monitor faster, slower, or one time only?
If the forwarding speed seems slow, you may want to increase the throughput limit (see http://docs.splunk.com/Documentation/Storm/latest/User/Setupauniversalforwarderonnix#Remove_the_defa... )
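For example, a limits.conf sketch on the forwarder (assuming you want to remove the default limit entirely, as the linked page describes; set maxKBps to a non-zero value instead if you want to keep a cap):

[thruput]
maxKBps = 0

Restart the forwarder for the change to take effect.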