I have a folder being monitored by a UF. Three (3) xml files are generated by an application and placed at the same time into the folder the UF is monitoring. These 3 xml files have different timestamps/creation times. How do I configure the forwarder to forward only the file with the latest timestamp to my Splunk instance?
The files I am monitoring are on a share and they have this structure:
The 3 xml files are placed in folder5 at the same time.
Thanks in advance! 🙂
What I do in situations like this is to schedule a cron job on the box that calls a script that contains your desired logic for selecting the correct file. Then create a soft link to the appropriate file in a DIFFERENT directory that only contains the soft links. Have your UF monitor that soft link directory instead of the main directory. To conserve inodes, be sure to delete dead links so that when the original file is moved/deleted, you delete the dead soft link. Problem solved.
Is there a date in the name of the file?
The * in the path structure contains dates in the format 2018_07_04.
I think what you are requesting is not impossible, just too much of a burden. In similar cases, I have ingested all 3 (assuming the size is low) and gotten rid of 2 of them in search. I can provide you a search to filter out the other 2.
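A search along these lines could do the filtering at search time. The index and sourcetype names are placeholders, and this assumes the timestamp you care about ends up in `_time`; it keeps only the events from the source file containing the newest event.

```
index=myindex sourcetype=my_xml
| eventstats max(_time) as file_time by source
| eventstats max(file_time) as newest
| where file_time = newest
| fields - file_time newest
```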
The way you are requesting it, I don't see it happening with Splunk's built-in mechanisms. However, if you are on Windows, you can enable file system change monitoring on the folder.
From the same documentation:
You must enable security auditing for the file(s) or director(ies) you want Splunk Enterprise to monitor changes to
When this is set, you will be able to capture the file change events via a Splunk search, and you can pass the path of the file to a script that does one of the following:
If it is a deployment server, it can update the inputs.conf file that resides in \deployment-apps\your_application_for_forwarder.
If it is a heavy forwarder that has access to the share, you can also set the same input using the Splunk REST API.
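For the deployment-server route above, the updated inputs file could look something like this. This is only a sketch: the stanza path, sourcetype, and index names are placeholders I've invented, not values from the original post.

```
# local/inputs.conf inside the app under deployment-apps,
# rewritten by your script to point at the single newest file
[monitor://\\share\path\to\folder5\newest_file.xml]
sourcetype = my_xml
index = myindex
disabled = false
```

After the script rewrites this file, the deployment server pushes the app, and the UF restarts its input with only the newest file monitored.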
I hope it helps
There's no way for the UF to do this. Your options are either to write a script that copies the latest file for you, or simply to index all three files and pick the latest one during search.