I am looking to monitor an audit folder that contains a list of files which get generated automatically every day. Below is how the audit directory looks (these are the file names):
Windows Sample:
I am looking to monitor only the latest file and index the logs inside it, but I am not sure how to achieve this. Any help would be appreciated.
If you tell Splunk to monitor a directory then it will read whatever files it finds in that directory (subject to allow/deny lists, etc.). Once a file has been read, Splunk will not read it again unless it has been modified. As new files are added, Splunk will read them and disregard those it has already read.
@richgalloway What should my monitoring stanza be? Currently I am using the one below, yet I am not able to see logs in my index. (I am using a Linux UF to forward the logs; all the ports are open and I am able to receive logs from other file destinations.)
[monitor:///opt/splunkforwarder/etc/apps/error_log/audit/*.log]
index = test
sourcetype = test
disabled = 0
Hi @ashvinpandey, at first I thought you wanted to do this task on a Windows host, my bad.
On a Linux host, you can write a simple shell script which will do this task for you.
The question is: do you want to read only the latest file, or all files but only their latest content (not the old content)? Are there any file rotations happening?
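The "simple shell script" mentioned above could be sketched like this. It is only a sketch: the default directory and the *.log pattern are assumptions taken from the stanza posted elsewhere in this thread.

```shell
#!/bin/sh
# Sketch: print the newest *.log file in a directory.
# Directory is taken as the first argument, defaulting to the audit
# path used in this thread (an assumption, adjust to your setup).
DIR="${1:-/opt/splunkforwarder/etc/apps/error_log/audit}"
# ls -t sorts by modification time, newest first; head -n 1 keeps the latest.
ls -t "$DIR"/*.log 2>/dev/null | head -n 1
```

Note that `ls -t` relies on modification time, so if an old file is rewritten it will be reported as the "latest" one.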
I am looking to index the latest data in all the files, but not the data that has already been indexed.
Some new files will also be created on a daily basis; those must get indexed as well.
[monitor:///opt/splunkforwarder/etc/apps/error_log/audit/*.log]
index = test
sourcetype = test
disabled = 0
This should actually be enough.
If you are concerned about re-reading old logs then you could add "ignoreOlderThan = 1d", which tells the UF not to read files whose modification time is older than one day.
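Putting that together with the stanza above (the index and sourcetype names are just the ones used in this thread):

```
[monitor:///opt/splunkforwarder/etc/apps/error_log/audit/*.log]
index = test
sourcetype = test
disabled = 0
# Skip files whose modification time is older than one day
ignoreOlderThan = 1d
```

One caveat: once a file has been skipped by ignoreOlderThan, the UF will not pick it up again even if it is modified later, so only use this when old files truly never change.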
As mentioned by the other two, the use case you describe is the normal way the Splunk UF functions. The UF reads only new data; it will not re-read old data. With *.log, when a new file gets created, the UF will start reading it and keep a pointer to remember the position up to which it has read.
Troubleshooting -
1. After adding the inputs.conf file, did you restart the Splunk service on the UF? If not, please do a restart.
2. If you still do not see the required logs after the restart, do you see any errors in the internal logs?
3. By using "btool", you can verify whether the Splunk UF understood your inputs.conf.
Let us know how your troubleshooting goes. Thanks.
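The three steps above could look like this on the UF host (assuming the default $SPLUNK_HOME of /opt/splunkforwarder used in this thread):

```
# 1. Restart the UF after editing inputs.conf
/opt/splunkforwarder/bin/splunk restart

# 2. Check the UF's internal log for monitor/tailing errors
grep -i error /opt/splunkforwarder/var/log/splunk/splunkd.log | tail

# 3. Verify how the UF merged your inputs.conf
/opt/splunkforwarder/bin/splunk btool inputs list monitor --debug
```

The --debug flag makes btool show which file each setting comes from, which helps spot a stanza being overridden by another app.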
Hi
As @richgalloway already said, this is the normal behaviour of Splunk. It reads files only once (unless you remove/reset the fishbucket, which keeps track of what has already been read). You should create your own TA for every app from which you are collecting logs on the source host. Then just deploy those with the DS (Deployment Server) or your favourite configuration deployment tool.
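A minimal TA for this input might be laid out like this (the TA name is just an example):

```
TA-error_log_audit/
    default/
        inputs.conf      <- the [monitor://...] stanza from this thread
    metadata/
        default.meta
```

Keeping each input in its own TA makes it easy to push the same config to many forwarders from the Deployment Server.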
Here are examples of how to create that stanza in inputs.conf: https://docs.splunk.com/Documentation/Splunk/latest/Admin/Inputsconf#inputs.conf.example
There are also many ready-made Splunk TAs for collecting and analysing logs from different sources at https://splunkbase.splunk.com; just search there for your app name.
r. Ismo