Getting Data In

Duplicate files being indexed

eelisio2
Path Finder

Using the Unix app, monitoring RADIUS log files at /var/log/radius/radius.log. The current log file gets renamed and gzipped. Splunk is indexing radius.log, but it is also indexing radius.log-20101105 and radius.log-20101105.gz.

Suggestions? Thanks.


Jason_1
New Member

That worked, but only for the radius.log file. I modified the blacklist value to a more generic form that also covers other log files in /var/log with the same naming convention (filename.log-<somedate>[.gz|.bz2]):

_blacklist = (lastlog|(\.log-\d{8}.*)$)

I used the command 'splunk list monitor | grep filename' to confirm the dated files were no longer being monitored, which seems to be the best way to verify that the syntax is correct. Thanks for the help!
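As a quick offline sanity check before touching a live Splunk instance, the generic pattern can be run against sample filenames with grep. This is a sketch: the filenames are hypothetical, and grep's ERE has no \d, so [0-9] stands in for it here (Splunk itself uses PCRE, where \d{8} works as written).

```shell
# Sanity-check the generic blacklist regex against sample filenames.
pattern='(lastlog|(\.log-[0-9]{8}.*)$)'
for f in radius.log radius.log-20101105 messages.log-20101112.bz2 lastlog; do
  if printf '%s\n' "$f" | grep -Eq "$pattern"; then
    echo "$f: blacklisted"
  else
    echo "$f: monitored"
  fi
done
```

Only the live radius.log stays monitored; the dated copies and lastlog are excluded.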


ziegfried
Influencer

Try blacklisting the rolled/gzipped logs:

$SPLUNK_HOME/etc/apps/unix/local/inputs.conf:

[monitor:///var/log]
disabled = 0
_blacklist = (lastlog|radius\.log.+)
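The same kind of offline check works for this pattern too (filenames here are examples from the question). Note that radius\.log.+ requires at least one character after ".log", so the live radius.log itself is not blacklisted:

```shell
# Check which sample filenames the suggested blacklist would exclude.
pattern='(lastlog|radius\.log.+)'
for f in radius.log radius.log-20101105 radius.log-20101105.gz; do
  if printf '%s\n' "$f" | grep -Eq "$pattern"; then
    echo "$f: blacklisted"
  else
    echo "$f: monitored"
  fi
done
```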