Getting Data In

Duplicate files being indexed

eelisio2
Path Finder

Using the Unix App to monitor RADIUS log files at /var/log/radius/radius.log. The current log file gets renamed and gzipped on rotation. Splunk is indexing radius.log, but it is also indexing radius.log-20101105 and radius.log-20101105.gz.

Suggestions? Thanks.


Jason_1
New Member

That worked, but only for the radius.log file. I modified the blacklist value to a more generic form that also covers other log files under /var/log/ with the same naming convention (filename.log-<somedate>[.gz|.bz2]):

_blacklist = (lastlog|(\.log-\d{8}.*)$)

I used the command 'splunk list monitor | grep filename' to confirm that the dated files were no longer being monitored, which seems to be the best way to test that the syntax is correct. Thanks for the help!
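For reference, the resulting stanza would look roughly like this (a sketch only, assuming the same $SPLUNK_HOME/etc/apps/unix/local/inputs.conf location from ziegfried's answer):

[monitor:///var/log]
disabled = 0
# skip lastlog, plus any rotated file named <name>.log-YYYYMMDD, optionally .gz/.bz2
_blacklist = (lastlog|(\.log-\d{8}.*)$)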


ziegfried
Influencer

Try blacklisting the rolled/gzipped logs:

$SPLUNK_HOME/etc/apps/unix/local/inputs.conf:

[monitor:///var/log]
disabled = 0
_blacklist = (lastlog|radius\.log.+)
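After editing inputs.conf, the simplest way to pick up the change is usually a restart, and you can then confirm the rotated files are excluded with the list command mentioned above. A rough example (assuming a default install under /opt/splunk):

# restart so the new blacklist takes effect
/opt/splunk/bin/splunk restart

# verify: rotated radius.log-YYYYMMDD[.gz] files should no longer appear
/opt/splunk/bin/splunk list monitor | grep radius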