<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Automatic removal of duplicate log entries in Getting Data In</title>
    <link>https://community.splunk.com/t5/Getting-Data-In/Automatic-removal-of-duplicate-log-entries/m-p/137625#M28319</link>
    <description>&lt;P&gt;No, because events cannot be compared to each other before they are indexed. You should stop one of the sources.&lt;/P&gt;

&lt;P&gt;At search time you can remove duplicates using "dedup", but this will not reduce your indexed volume.&lt;/P&gt;</description>
    <pubDate>Tue, 22 Apr 2014 01:38:41 GMT</pubDate>
    <dc:creator>yannK</dc:creator>
    <dc:date>2014-04-22T01:38:41Z</dc:date>
    <item>
      <title>Automatic removal of duplicate log entries</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Automatic-removal-of-duplicate-log-entries/m-p/137624#M28318</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;

&lt;P&gt;I currently have the following configuration: &lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;                       --&amp;gt; rsyslog server (with splunk forwarder) --
                     /                                               \
Many linux Servers --                                                 --&amp;gt; Splunk Indexer/Search Head
                     \                                               /
                       --&amp;gt; rsyslog server (with splunk forwarder) --
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;All Linux servers have their rsyslog clients configured to forward a copy of each log entry to both of the central rsyslog servers, so the Splunk forwarders then forward both copies on to the Splunk Indexer, which creates a duplicate entry for each event.  Given this setup, is there any way of configuring Splunk to automatically remove the duplicate log entries (aside from disabling the Splunk forwarder on one of the rsyslog servers)?&lt;/P&gt;

&lt;P&gt;Cheers,&lt;BR /&gt;
Tom&lt;/P&gt;</description>
      <pubDate>Tue, 22 Apr 2014 00:18:16 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Automatic-removal-of-duplicate-log-entries/m-p/137624#M28318</guid>
      <dc:creator>tpride</dc:creator>
      <dc:date>2014-04-22T00:18:16Z</dc:date>
    </item>
    <item>
      <title>Re: Automatic removal of duplicate log entries</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Automatic-removal-of-duplicate-log-entries/m-p/137625#M28319</link>
      <description>&lt;P&gt;No, because events cannot be compared to each other before they are indexed. You should stop one of the sources.&lt;/P&gt;

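&lt;P&gt;At search time the duplicates can be filtered instead. A sketch of such a search (the index and sourcetype names are placeholders for your environment, and matching on host, _time and _raw assumes the two forwarded copies of an event are byte-identical):&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index=syslog sourcetype=syslog
| dedup host _time _raw
&lt;/CODE&gt;&lt;/PRE&gt;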
&lt;P&gt;At search time you can remove duplicates using "dedup", but this will not reduce your indexed volume.&lt;/P&gt;</description>
      <pubDate>Tue, 22 Apr 2014 01:38:41 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Automatic-removal-of-duplicate-log-entries/m-p/137625#M28319</guid>
      <dc:creator>yannK</dc:creator>
      <dc:date>2014-04-22T01:38:41Z</dc:date>
    </item>
    <item>
      <title>Re: Automatic removal of duplicate log entries</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Automatic-removal-of-duplicate-log-entries/m-p/137626#M28320</link>
      <description>&lt;P&gt;Thanks yannK,&lt;/P&gt;

&lt;P&gt;I pretty much expected that would be the answer, but I needed to check because this is my first time using Splunk and I'm not yet up to speed on all of its capabilities.&lt;/P&gt;

&lt;P&gt;Cheers,&lt;BR /&gt;
Tom&lt;/P&gt;</description>
      <pubDate>Tue, 22 Apr 2014 06:10:25 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Automatic-removal-of-duplicate-log-entries/m-p/137626#M28320</guid>
      <dc:creator>tpride</dc:creator>
      <dc:date>2014-04-22T06:10:25Z</dc:date>
    </item>
  </channel>
</rss>

