Hello Splunk Community,
I am having difficulty monitoring a local directory on my machine. The files are not being updated; instead, new files are being added to the directory. Is it possible to set up a schedule that indexes all files in a directory? All files are .csv files formatted the same way (i.e., the same columns in each file). I'd like Splunk to "monitor" the directory for new files and immediately index them as a certain sourcetype under a certain host. Can this be done?
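For reference, the kind of inputs.conf monitor stanza I have in mind would look something like this (the path, sourcetype, and host values are placeholders, not my actual setup):

```
[monitor:///data/incoming_csvs]
sourcetype = my_csv_type
host = my_static_host
disabled = false
```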
This is how I did a similar thing:
1) Created a scripted input that runs on a cron schedule and copies all the files from that folder to a new directory (accessible to Splunk).
2) Created a 'batch' type monitoring input that monitors all the files in the new directory and deletes them after indexing:
[batch://Path/to/new/directory]
...other settings...
crcSalt = <SOURCE>
move_policy = sinkhole
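The copy script from step 1 can be sketched roughly like this (the function name, paths, and cron schedule are placeholders, not anything Splunk provides):

```shell
#!/bin/sh
# Hypothetical sketch of the step-1 copy script; all paths are placeholders.
# A cron entry would invoke it on a schedule, e.g.:
#   */5 * * * * /opt/scripts/copy_csvs.sh

copy_csvs() {
    src="$1"   # folder where the new .csv files appear
    dst="$2"   # staging directory that the Splunk batch input watches
    mkdir -p "$dst"
    # copy only .csv files; skip silently when none are present yet
    for f in "$src"/*.csv; do
        [ -e "$f" ] && cp "$f" "$dst"/
    done
}

# Example invocation (placeholder paths):
# copy_csvs /path/to/source/folder /path/to/new/directory
```

Because the batch input uses move_policy = sinkhole, the staged copies are deleted after indexing, so only the originals remain in the source folder.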
This will leave a duplicate copy of every file on the server, so make sure you have ample space in the filesystem you'll be copying the files to.
What do you mean by "a new directory (accessible to Splunk)"? Is this referring to a source where Splunk is monitoring a local directory? My main issue is that files added to a directory being monitored are not being indexed, so my dashboards are not updating.