Getting Data In

Is there a way to configure the Universal Forwarder to prevent duplicate events due to a log file that regenerates?

donaldlcho
New Member

I am using the Universal Forwarder to index a log file that is regenerated every time a new row is added. In other words, the logging mechanism periodically rewrites the entire file rather than appending rows to the existing file. The issue is that whenever new rows are added, the entire file is re-indexed, which results in duplicate events. Is there a way to configure this input (in inputs.conf and/or props.conf) to prevent this from happening? Thanks.


woodcock
Esteemed Legend

I would write a shell script that takes a delta of the "real file" against a "copy file", appends only the new lines with something like echo $new_lines >> /other/directory/copy_file; rm -f /real/directory/real_file, and then have Splunk monitor /other/directory/copy_file. A rough sketch follows.
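
Here is a minimal sketch of that idea, assuming new rows are only ever added to the end of the regenerated file; the paths and the line-count state file are illustrative, not from your environment:

#!/bin/bash
# Illustrative sketch of the delta-copy approach; all paths are assumptions.
REAL=/real/directory/real_file          # file the logger rewrites in full
COPY=/other/directory/copy_file         # append-only file Splunk monitors
STATE=/other/directory/copy_file.count  # number of lines already copied

old_count=$(cat "$STATE" 2>/dev/null || echo 0)
new_count=$(wc -l < "$REAL")

# Append only the rows added since the last run, then remember the new total.
if [ "$new_count" -gt "$old_count" ]; then
    tail -n +"$((old_count + 1))" "$REAL" >> "$COPY"
    echo "$new_count" > "$STATE"
fi

Run it from cron at an interval that matches how often the log is rewritten, and point an ordinary [monitor://...] stanza in inputs.conf at /other/directory/copy_file instead of the original file.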
