Getting Data In

Forwarder keeps indexing duplicate events

slasyang
Explorer

Hi,

I have a log server running a universal forwarder, plus several Linux servers. A cron job makes each Linux server upload its /var/log/secure and /var/log/messages to the log server every 10 minutes, and the universal forwarder monitors the uploaded files.

 

But every time a Linux server uploads its log files to the log server, the universal forwarder indexes the entire file rather than just the new lines, and that wastes a lot of license volume.

Here is my inputs.conf:

--

[monitor://D:\Log\Linux\*\messages]
sourcetype = message
index = os
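# derive the host field from a segment of the file path (here, the per-server folder matched by *)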
host_segment = 3

--

How can I fix it?

 


jwalthour
Communicator

How are you sending the Linux logs to your Windows log server? What exactly is the content of your cron job on your Linux servers?


slasyang
Explorer

First, it copies the secure and messages logs to a temp folder, then uses an FTP PUT to upload the files to the Windows log server.


jwalthour
Communicator

And each time, do you FTP the entire log file to the Windows log server, or just the additions?


slasyang
Explorer

I FTP the entire log file to overwrite the previous one.

My understanding was that the forwarder would continue indexing events from where the previous file left off, not from the first line.


jwalthour
Communicator

Except that you’re not just adding events to an existing file; you’re recreating it each time you overwrite it. I’d need more details, but something about what you’re doing is triggering Splunk to treat it as a new file. I’d advise consulting the troubleshooting guide below to figure out why Splunk sees it as a new file and, possibly, how to prevent that.

https://wiki.splunk.com/Community:Troubleshooting_Monitor_Inputs
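
For reference, a sketch of the mechanism at work: the monitor input decides whether a file is new by taking a CRC of its first bytes (256 by default). If the uploaded file's opening lines change between uploads, for example because the logs rotated on the Linux side, the CRC no longer matches and Splunk reads the whole file as brand new. Another possibility is that the forwarder catches the file mid-FTP, while it is still shorter than its saved read checkpoint, which looks like truncation and also triggers a re-read from the top. initCrcLength and crcSalt are the inputs.conf settings that control the fingerprint; the stanza below only illustrates them, it is not a guaranteed fix:

--

[monitor://D:\Log\Linux\*\messages]
sourcetype = message
index = os
host_segment = 3
# Fingerprint more of the file head so files that merely share identical
# opening lines are not confused with one another (default is 256 bytes;
# 1024 is an example value).
initCrcLength = 1024
# Alternative: mix the full path into the fingerprint. Note that this
# forces a full re-index whenever a file is renamed or moved.
#crcSalt = <SOURCE>

--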

I’d also recommend setting up a proper syslog server; that is the best way to accomplish this task.
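
For example, instead of FTPing files around, you could point each Linux server's syslog daemon at the log server and receive the events over the network; a stream arrives exactly once, so there is nothing to re-read. A minimal sketch, assuming a plain UDP input on the standard syslog port (a dedicated syslog daemon that writes to files, with Splunk monitoring those files, is the more robust production setup):

--

# inputs.conf on the log server
[udp://514]
sourcetype = syslog
index = os
# The host field is derived from the sending machine (its IP or DNS name),
# so host_segment is no longer needed.

--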

If this solved your problem, please mark it as the solution.

slasyang
Explorer

Thanks for your recommendation. I think I understand the cause of the problem now, and I'll try another way to index those log files.
