Getting Data In

Is there a way to configure the Universal Forwarder to prevent duplicate events due to a log file that regenerates?

donaldlcho
New Member

I am using the Universal Forwarder to index a log file that regenerates every time a new row is added. In other words, the logging mechanism rewrites the entire file periodically; it doesn't append rows to the previous file. The issue I am having is that when new rows are added, the entire file is re-indexed, which results in duplicate events. Is there a way to configure this input (in inputs.conf and/or props.conf) to prevent this from happening? Thanks.
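For reference, the file is picked up with an ordinary monitor stanza along these lines (the path and sourcetype here are just placeholders, not the real ones):

[monitor:///var/log/myapp/status.log]
sourcetype = myapp_status
disabled = false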


woodcock
Esteemed Legend

I would write a shell script to do a delta of the "real file" and a "copy file", then do echo $new_lines >> /other/directory/copy_file; rm -f /real/directory/real_file, and have Splunk monitor /other/directory/copy_file. A rough sketch of that approach is below.
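Here is a minimal sketch of one way to do that delta, assuming the regenerated file only ever gains rows at the end. The two paths are the ones from the reply above; the line-counting method, the variable names, and running it from cron are my assumptions, not part of the original answer.

#!/bin/sh
# Sketch of the delta approach described above.
# Paths come from the reply; everything else is an assumption.
REAL=/real/directory/real_file
COPY=/other/directory/copy_file

# How many rows have already been appended to the copy that Splunk monitors.
old=0
if [ -f "$COPY" ]; then
    old=$(wc -l < "$COPY")
fi

# Append only the rows added since the last run, so the copy file only ever grows.
tail -n +"$((old + 1))" "$REAL" >> "$COPY"

Run it from cron on a short interval and point the monitor input at /other/directory/copy_file; because that file only grows, the forwarder tails it normally and the duplicates go away.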
