Getting Data In

Monitoring Latest File Only

ashvinpandey
Contributor

I am looking to monitor a folder named audit that contains a list of files generated automatically every day. Below is how the audit directory looks (these are the file names):

  • Activity_Engine_2021-12-18T14.51.04Z
  • Activity_Engine_2021-12-19T02.53.38Z
  • Activity_Engine_2021-12-19T15.00.28Z
  • Activity_Engine_2021-12-20T03.00.30Z

Windows sample: (screenshot attached: ashvinpandey_0-1640756616387.png)
I want to monitor only the latest file and index the logs inside it, but I am not sure how to achieve this. Any help would be appreciated.


richgalloway
SplunkTrust
SplunkTrust

If you tell Splunk to monitor a directory then it will read whatever files it finds in that directory (subject to allow/deny lists, etc.). Once a file has been read, Splunk will not read it again unless it has been modified. As new files are added, Splunk will read them and disregard those it has already read.
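A directory monitor stanza along these lines would do that; the path, index/sourcetype names, and the whitelist regex here are illustrative, not taken from the thread:

```
[monitor:///opt/splunkforwarder/etc/apps/error_log/audit]
index = test
sourcetype = test
# only pick up files whose names match the daily pattern (example regex)
whitelist = Activity_Engine_.*
disabled = 0
```

Monitoring the directory itself (rather than a wildcard filename) lets the whitelist control which files are read, and new daily files are picked up automatically.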

---
If this reply helps you, Karma would be appreciated.

ashvinpandey
Contributor

@richgalloway What should my monitor stanza be? Currently I am using the one below, yet I am not able to see logs in my index. (I am using a Linux UF to forward the logs; all the ports are open and I am able to receive logs from other file destinations.)

[monitor:///opt/splunkforwarder/etc/apps/error_log/audit/*.log]
index = test
sourcetype = test
disabled = 0


inventsekar
SplunkTrust
SplunkTrust

Hi @ashvinpandey, at first I thought you wanted to do this task on a Windows host, my bad.

On a Linux host, you can write a simple shell script which will do this task for you.
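A minimal sketch of such a script, assuming the audit path from later in this thread and a hypothetical separate directory that Splunk monitors (all paths are examples):

```shell
# Copy only the most recently modified Activity_Engine file into a
# separate directory that Splunk monitors, so only the latest file
# is indexed. Run from cron, e.g. once per hour.
AUDIT_DIR=/opt/splunkforwarder/etc/apps/error_log/audit
LATEST_DIR=/opt/splunkforwarder/etc/apps/error_log/latest   # hypothetical target

# ls -t sorts by modification time, newest first; head -n1 keeps the newest
latest=$(ls -t "$AUDIT_DIR"/Activity_Engine_* 2>/dev/null | head -n1)

if [ -n "$latest" ]; then
    mkdir -p "$LATEST_DIR"
    cp "$latest" "$LATEST_DIR/"
fi
```

Splunk would then monitor only $LATEST_DIR instead of the full audit directory.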

 
The question is: do you want to read only the latest file, or all files but only their latest content (not the old content)? Are there any file rotations happening?


ashvinpandey
Contributor

I am looking to index the content of all the files with the latest data, but not the data that has already been indexed.

Also, some new files will be created on a daily basis; those must also get indexed.


inventsekar
SplunkTrust
SplunkTrust

Hi @ashvinpandey 

[monitor:///opt/splunkforwarder/etc/apps/error_log/audit/*.log]
index = test
sourcetype = test
disabled = 0

This should be enough, actually.

If you are concerned about re-reading old logs then you could add "ignoreOlderThan = 1d", which tells the UF not to read files whose modification time is more than one day old.
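For example, adapting the stanza above (the 1d value is illustrative; pick a threshold that fits your rotation schedule):

```
[monitor:///opt/splunkforwarder/etc/apps/error_log/audit/*.log]
index = test
sourcetype = test
# skip files whose modification time is older than one day
ignoreOlderThan = 1d
disabled = 0
```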

As mentioned by the other two, the use case you describe is the normal way the Splunk UF functions. The UF reads only new data; it will not re-read old data. When you monitor *.log and a new file gets created, the UF starts reading it and keeps a pointer to remember the position it is currently reading from.

Troubleshooting:
1. After adding the inputs.conf file, did you restart the Splunk service on the UF? If not, please do a restart.
2. If you still do not see the required logs after the restart, do you see any errors in the internal logs?
3. Using btool, you can verify that the Splunk UF understood your inputs.conf, e.g. $SPLUNK_HOME/bin/splunk btool inputs list --debug.

Let us know how your troubleshooting goes, thanks.


isoutamo
SplunkTrust
SplunkTrust

Hi

As @richgalloway already said, this is the normal behaviour of Splunk. It reads files only once (unless you remove/reset the fishbucket, which keeps the status of what has been read). You should create your own TA for every app from which you are collecting logs on the source host. Then just deploy those with a DS (Deployment Server) or your favourite configuration deployment tool.

Here are examples of how to create that stanza in inputs.conf: https://docs.splunk.com/Documentation/Splunk/latest/Admin/Inputsconf#inputs.conf.example

There are also many ready-made Splunk TAs for collecting and analysing logs from different sources at https://splunkbase.splunk.com; just search there for your app name.

r. Ismo
