Getting Data In

Forwarder keeps indexing duplicate events

slasyang
Explorer

Hi,

 

I have a log server running a universal forwarder, plus several Linux servers. A cron job makes each Linux server upload its /var/log/secure and /var/log/messages to the log server every 10 minutes, and the universal forwarder monitors the uploaded files.

 

But every time a Linux server uploads its log files to the log server, the universal forwarder indexes the entire files rather than just the new portion, which wastes a lot of license volume.

Here is my inputs.conf:

--

[monitor://D:\Log\Linux\*\messages]
sourcetype = message
index = os
host_segment = 3

--

How can I fix it?

 


jwalthour
Communicator

How are you sending the Linux logs to your Windows log server? What exactly is the content of your cron job on your Linux servers?


slasyang
Explorer

First, it copies the secure and messages logs to a temp folder, then uses the FTP command to PUT the files to the Windows log server.

In outline, the job does something like the sketch below (illustrative only; the actual paths, host name, and credentials are placeholders):
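--
#!/bin/sh
# Illustrative sketch of the cron job described above; paths, host name,
# and credentials are placeholders, not the real values.
# Crontab entry (every 10 minutes):  */10 * * * * /usr/local/bin/ship_logs.sh

mkdir -p /tmp/logship
cp /var/log/secure /var/log/messages /tmp/logship/

# Non-interactive FTP PUT to the Windows log server, overwriting the
# previous copies under D:\Log\Linux\<hostname>\
ftp -n logserver.example.com <<EOF
user ftpuser ftppassword
cd /Log/Linux/$(hostname)
put /tmp/logship/secure secure
put /tmp/logship/messages messages
bye
EOF
--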


jwalthour
Communicator

And each time, do you FTP the entire log file to the Windows log server, or just the additions?


slasyang
Explorer

I FTP the entire log file, overwriting the previous one.

My understanding was that the forwarder would continue indexing events from where the last file ended, not from the first line.


jwalthour
Communicator

Except that you’re not just adding events to an existing file; you’re recreating the file each time you overwrite it. I’d need more details to be sure, but something about the overwrite makes Splunk treat it as a new file. One likely culprit: the FTP upload truncates the file to zero bytes before rewriting it, and when the monitor sees a file shrink below its saved seek pointer, it assumes truncation and re-reads the file from the beginning. I’d advise consulting the troubleshooting guide below to figure out why Splunk is seeing it as a new file and, possibly, how to prevent it.

https://wiki.splunk.com/Community:Troubleshooting_Monitor_Inputs
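For instance, you can ask the forwarder what it currently knows about each monitored file (the install path below assumes a default Windows universal forwarder install; adjust to yours):

--
REM Run on the Windows log server; shows the tailing processor's view of
REM each monitored file, including its current read position.
"C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe" list inputstatus
--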

I’d also recommend you get a proper syslog server set up to accomplish this task in the best way possible.
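For example, if your Linux hosts run rsyslog, forwarding the equivalent of secure and messages natively could be as simple as the sketch below (assumptions: rsyslog is in use, the log server listens for syslog on UDP 514, and the host name is a placeholder; use @@ instead of @ for TCP):

--
# /etc/rsyslog.d/forward.conf (sketch): send auth events (what lands in
# /var/log/secure) and general messages to the central log server via UDP.
authpriv.*              @logserver.example.com:514
*.info;authpriv.none    @logserver.example.com:514
--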

If this solved your problem, please mark it as the solution.

slasyang
Explorer

Thanks for your recommendation. I think I understand the cause of the problem now; I'll try another way to index those log files.
