Dear all,
I have a use case where my Splunk universal forwarder does not continuously monitor my logs.
Because of this, I am using batch mode so that files are deleted after ingestion.
Now, I occasionally receive log files which I have already received at an earlier point in time.
The problem is that settings such as crcSalt and initCrcLength are only available for monitor inputs. This means I cannot benefit from Splunk's built-in features for preventing duplicate ingestion of the same data.
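For context, the input currently looks roughly like this (the path and sourcetype are placeholders, not my real config):

```ini
# inputs.conf on the universal forwarder
[batch:///var/log/incoming]
# sinkhole is required for batch inputs: files are deleted after indexing
move_policy = sinkhole
sourcetype = my_sourcetype
```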
Any help on a solution for this is greatly appreciated.
I'd try writing some external "helper" script which keeps track of files.
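A minimal sketch of such a helper, assuming the files land in a single incoming directory and that duplicates are byte-identical at the start of the file. The state-file location and the 256-byte fingerprint length are assumptions, loosely mimicking Splunk's head-of-file CRC check:

```python
import hashlib
import json
from pathlib import Path

STATE_FILE = Path("seen_files.json")  # hypothetical location for the fingerprint state
HEAD_BYTES = 256                      # fingerprint the first 256 bytes of each file


def file_fingerprint(path: Path) -> str:
    """Hash the first HEAD_BYTES of the file to identify its content."""
    with path.open("rb") as f:
        return hashlib.sha256(f.read(HEAD_BYTES)).hexdigest()


def drop_duplicates(incoming_dir: Path) -> list[Path]:
    """Delete files whose fingerprint was already seen under a different
    name; return the files that were kept for ingestion."""
    seen = json.loads(STATE_FILE.read_text()) if STATE_FILE.exists() else {}
    kept = []
    for path in sorted(incoming_dir.iterdir()):
        if not path.is_file():
            continue
        fp = file_fingerprint(path)
        if fp in seen and seen[fp] != path.name:
            # Duplicate content arriving under a new name: remove it
            # before the forwarder's batch input picks it up.
            path.unlink()
        else:
            seen[fp] = path.name
            kept.append(path)
    STATE_FILE.write_text(json.dumps(seen))
    return kept
```

You could run this from cron (or from whatever drops the files) just before the batch input scans the directory. Note the state file grows forever in this sketch; a real version would expire old fingerprints.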
But the question is: why don't you use a monitor input? Unless you absolutely need the sinkholing functionality and can't work around it another way (e.g. with logrotate or similar).
Hi @PickleRick
Thanks for your reply.
I also think that keeping track of files is something I will have to implement myself.
I was just hoping to be able to use what Splunk offers out of the box.
On why not using monitor inputs:
I have an HTTP endpoint that receives log files from another system and extracts them to disk, where the forwarder then picks them up. I could use monitor mode, but because there is no log rotation or anything similar, it would ultimately fill up the disk.
What is appealing about batch mode is that it ensures the disk space is freed again once a file has been completely ingested.
Yeah, but you end up with duplicates 🙂
If you can guarantee a maximum period during which duplicates can appear, you could get away with a monitor input plus an external script that cleans the directory of files older than that period - that could be an alternative approach (and probably easier to implement).
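Such a cleanup could be as small as this sketch (the 24-hour window is an assumed value; it should match the real maximum duplicate period):

```python
import time
from pathlib import Path

MAX_AGE_SECONDS = 24 * 3600  # assumed maximum window in which duplicates can arrive


def purge_old_files(watch_dir: Path, max_age: float = MAX_AGE_SECONDS) -> list[str]:
    """Delete files whose modification time is older than max_age seconds;
    return the names of the deleted files."""
    cutoff = time.time() - max_age
    deleted = []
    for path in watch_dir.iterdir():
        if path.is_file() and path.stat().st_mtime < cutoff:
            path.unlink()
            deleted.append(path.name)
    return sorted(deleted)
```

Equivalently, a cron job running `find /path/to/dir -type f -mmin +1440 -delete` would do the same without any script.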
Hehe 🙂
Alright, thanks for your input!