Getting Data In

Duplicate files being indexed

eelisio2
Path Finder

Using the Unix app to monitor RADIUS log files in /var/log/radius/radius.log. The current log file periodically gets renamed and gzipped. Splunk is indexing radius.log, but it is also indexing radius.log-20101105 and radius.log-20101105.gz.

Suggestions? Thanks.


Jason_1
New Member

That worked, but only for the radius.log file. I modified the blacklist value to a more generic form that also covers other log files under /var/log with the same naming convention (filename.log-&lt;somedate&gt;[.gz|.bz2]):

_blacklist = (lastlog|(\.log-\d{8}.*)$)

I used the command 'splunk list monitor | grep filename' to confirm the dated files were no longer being monitored; that seems to be the best way to verify the syntax is correct. Thanks for the help!
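A quick way to sanity-check a blacklist regex before restarting Splunk is to run it against sample paths in Python (Python's `re` is close enough to Splunk's PCRE for a pattern like this). The file names below are illustrative, not taken from a real monitor list:

```python
import re

# The generic blacklist above: skip "lastlog" and any file whose name
# ends in ".log-<8 digits>", optionally followed by ".gz"/".bz2".
blacklist = re.compile(r"(lastlog|(\.log-\d{8}.*)$)")

paths = [
    "/var/log/radius/radius.log",              # keep: live log
    "/var/log/radius/radius.log-20101105",     # skip: rotated copy
    "/var/log/radius/radius.log-20101105.gz",  # skip: rotated + gzipped
    "/var/log/lastlog",                        # skip: lastlog
    "/var/log/messages",                       # keep
]

monitored = [p for p in paths if not blacklist.search(p)]
print(monitored)  # ['/var/log/radius/radius.log', '/var/log/messages']
```

Only the live logs survive the filter, matching what 'splunk list monitor' reported.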

0 Karma

ziegfried
Influencer

Try blacklisting the rolled/gzipped logs:

$SPLUNK_HOME/etc/apps/unix/local/inputs.conf:

[monitor:///var/log]
disabled = 0
_blacklist = (lastlog|radius\.log.+)
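To see why this blacklist keeps the live file while dropping the rolled copies, note that `radius\.log.+` requires at least one character after "radius.log", so only the suffixed (rotated or gzipped) variants match. A small Python check of the same pattern (sample paths are illustrative):

```python
import re

# The blacklist from the inputs.conf stanza above.
blacklist = re.compile(r"(lastlog|radius\.log.+)")

# The live log has nothing after ".log", so ".+" fails to match: still indexed.
assert not blacklist.search("/var/log/radius/radius.log")

# Rotated and gzipped copies carry a suffix, so they are blacklisted.
assert blacklist.search("/var/log/radius/radius.log-20101105")
assert blacklist.search("/var/log/radius/radius.log-20101105.gz")
assert blacklist.search("/var/log/lastlog")

print("blacklist behaves as expected")
```

Splunk applies the blacklist to the full path of each file under the monitored directory, so a substring match anywhere in the path is enough to exclude it.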