
File monitoring in inputs.conf

Boopalan
New Member

I want to monitor a file in a directory that rolls over to a new file every 2 minutes.
I tried the basic inputs.conf below. It works, but it misses data from files that have already rolled over. For example, test.log is the file I want to monitor continuously; within 2 minutes it is renamed to test-1.log and new data is written to a fresh test.log. My config reads test.log once and then only reads it again after 6 minutes, so the data rotated into test-2.log (created at the 4th minute) and test-3.log (created at the 6th minute) is ignored. I want to monitor only test.log without losing any of its data.
Note: the logs are on *nix systems.

inputs.conf used:

[monitor:///opt/sample/logs/test*.log]
index = test
disabled = false
sourcetype = test_logs
blacklist = (test*-\d{1,2}\.log$)
ignoreOlderThan = 30d
crcSalt = <SOURCE>
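
A quick way to sanity-check which of the rotated names this blacklist actually excludes is to run the pattern against the filenames from the rotation scheme. A minimal Python sketch (the pattern and filenames are taken from this post; note that Splunk matches blacklist regexes against the full source path, not just the bare filename):

import re

# Blacklist pattern from the stanza above, tested against bare filenames
# purely for illustration.
blacklist = re.compile(r"(test*-\d{1,2}\.log$)")

# Filenames from the rotation scheme described in the question.
for name in ["test.log", "test-1.log", "test-2.log", "test-3.log"]:
    status = "excluded by blacklist" if blacklist.search(name) else "monitored"
    print(f"{name}: {status}")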

woodcock
Esteemed Legend

Like this:

[monitor:///opt/sample/logs/test*.log]
index = test
disabled = false
sourcetype = test_logs
blacklist = (test*-\d{2,}\.log$)

DEFINITELY DO NOT USE THESE:

ignoreOlderThan = 30d
crcSalt = <SOURCE>
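
For context on the warning above: Splunk identifies a monitored file by a CRC of its leading bytes, and crcSalt = <SOURCE> mixes the full source path into that fingerprint, so a rotated file that has merely been renamed looks brand new and can be re-indexed as duplicate data; ignoreOlderThan permanently drops files whose modification time falls past the threshold. A rough Python sketch of the path-salting effect (an illustration only, not Splunk's actual fishbucket algorithm):

import zlib

def fingerprint(source_path, first_bytes, salt_with_source=False):
    # Illustrative only: a CRC over a file's leading bytes, optionally salted
    # with the source path, roughly mimicking the effect of crcSalt = <SOURCE>.
    data = first_bytes + (source_path.encode() if salt_with_source else b"")
    return zlib.crc32(data)

head = b"2019-01-01 00:00:01 first event written to the file"

# Same leading content before and after a rotation rename.
salted_before = fingerprint("/opt/sample/logs/test.log", head, salt_with_source=True)
salted_after  = fingerprint("/opt/sample/logs/test-1.log", head, salt_with_source=True)
print(salted_before == salted_after)   # False: the renamed file looks new and may be re-read

plain_before = fingerprint("/opt/sample/logs/test.log", head)
plain_after  = fingerprint("/opt/sample/logs/test-1.log", head)
print(plain_before == plain_after)     # True: without the salt, the rename is recognized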

somesoni2
Revered Legend

Try this:

[monitor:///opt/sample/logs/test*.log]
index = test
sourcetype = test_logs
ignoreOlderThan = 30d
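
If ignoreOlderThan = 30d is kept, it may help to confirm which files in the directory already have a modification time past that cutoff, since the monitor will skip those. A minimal sketch (the directory and threshold mirror the stanza above; everything else is illustrative):

import os
import time

LOG_DIR = "/opt/sample/logs"    # directory from the stanza above
CUTOFF = 30 * 24 * 3600         # ignoreOlderThan = 30d, expressed in seconds

now = time.time()
for name in sorted(os.listdir(LOG_DIR)):
    path = os.path.join(LOG_DIR, name)
    if not os.path.isfile(path):
        continue
    age_seconds = now - os.path.getmtime(path)
    status = "skipped (older than 30d)" if age_seconds > CUTOFF else "within the window"
    print(f"{name}: {status}")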