<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Ignore duplicate events while indexing for circular log in Getting Data In</title>
    <link>https://community.splunk.com/t5/Getting-Data-In/Ignore-duplicate-events-while-indexing-for-circular-log/m-p/182749#M36617</link>
    <description>&lt;P&gt;Hi, I have installed a Splunk server on one system and a universal forwarder on another. With the universal forwarder I am monitoring a circular log (when the log reaches a particular size, the entries at the bottom of the file are deleted and new entries are written at the top).&lt;/P&gt;

&lt;P&gt;The problem I am facing is that whenever a new entry is written at the top of the file, Splunk re-parses the whole log file and creates duplicate events in the index.&lt;/P&gt;

&lt;P&gt;Is there any way to tell Splunk to ignore an event that is already present in the index, or to overwrite such an event, so that only new entries are stored?&lt;/P&gt;

&lt;P&gt;I have been working on this for the last two days with no solution so far; any help is appreciated.&lt;/P&gt;

&lt;P&gt;Thanks,&lt;/P&gt;

&lt;P&gt;SD&lt;/P&gt;</description>
    <pubDate>Wed, 18 Dec 2013 09:07:39 GMT</pubDate>
    <dc:creator>sanjibdhar</dc:creator>
    <dc:date>2013-12-18T09:07:39Z</dc:date>
    <item>
      <title>Ignore duplicate events while indexing for circular log</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Ignore-duplicate-events-while-indexing-for-circular-log/m-p/182749#M36617</link>
      <description>&lt;P&gt;Hi, I have installed a Splunk server on one system and a universal forwarder on another. With the universal forwarder I am monitoring a circular log (when the log reaches a particular size, the entries at the bottom of the file are deleted and new entries are written at the top).&lt;/P&gt;

&lt;P&gt;The problem I am facing is that whenever a new entry is written at the top of the file, Splunk re-parses the whole log file and creates duplicate events in the index.&lt;/P&gt;

&lt;P&gt;Is there any way to tell Splunk to ignore an event that is already present in the index, or to overwrite such an event, so that only new entries are stored?&lt;/P&gt;

&lt;P&gt;I have been working on this for the last two days with no solution so far; any help is appreciated.&lt;/P&gt;

&lt;P&gt;Thanks,&lt;/P&gt;

&lt;P&gt;SD&lt;/P&gt;</description>
      <pubDate>Wed, 18 Dec 2013 09:07:39 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Ignore-duplicate-events-while-indexing-for-circular-log/m-p/182749#M36617</guid>
      <dc:creator>sanjibdhar</dc:creator>
      <dc:date>2013-12-18T09:07:39Z</dc:date>
    </item>
    <item>
      <title>Re: Ignore duplicate events while indexing for circular log</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Ignore-duplicate-events-while-indexing-for-circular-log/m-p/182750#M36618</link>
      <description>&lt;P&gt;Hello&lt;/P&gt;

&lt;P&gt;I think the only solution is to modify the way the log is written. The UF keeps a hash of the header of the file to mark that file as "known", so if the header of the file is modified by any means, the UF will re-index the whole file from the beginning.&lt;/P&gt;

&lt;P&gt;Regards&lt;/P&gt;</description>
      <pubDate>Wed, 18 Dec 2013 16:12:39 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Ignore-duplicate-events-while-indexing-for-circular-log/m-p/182750#M36618</guid>
      <dc:creator>gfuente</dc:creator>
      <dc:date>2013-12-18T16:12:39Z</dc:date>
    </item>
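The header hash gfuente mentions is the forwarder's file-tracking CRC: by default the monitor input fingerprints a file by a cyclic redundancy check over its first 256 bytes, and a changed fingerprint makes the file look new. A minimal inputs.conf sketch (the monitor path, index, and sourcetype below are hypothetical) showing the two settings that tune this check:

```ini
# Hypothetical monitor stanza on the universal forwarder
[monitor:///var/log/app/circular.log]
index = main
sourcetype = circular_app_log

# Widen the CRC window beyond the default 256 bytes, so a larger
# (ideally stable) head region is used to recognize the file.
initCrcLength = 1024

# Mix the source path into the CRC so files with identical
# headers are still tracked as distinct sources.
crcSalt = <SOURCE>
```

Note that for a log where new entries are prepended at the top, the head of the file changes on every write, so these settings cannot fully prevent re-indexing; they only control how the forwarder fingerprints files, which is consistent with the advice above to change how the log is written.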
    <item>
      <title>Re: Ignore duplicate events while indexing for circular log</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Ignore-duplicate-events-while-indexing-for-circular-log/m-p/182751#M36619</link>
      <description>&lt;P&gt;We cannot change the way the log is written. I am surprised that Splunk is unable to provide a solution for this; if Splunk does not handle circular logs, there should be a feature request for this functionality.&lt;/P&gt;</description>
      <pubDate>Thu, 19 Dec 2013 04:44:11 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Ignore-duplicate-events-while-indexing-for-circular-log/m-p/182751#M36619</guid>
      <dc:creator>sanjibdhar</dc:creator>
      <dc:date>2013-12-19T04:44:11Z</dc:date>
    </item>
    <item>
      <title>Re: Ignore duplicate events while indexing for circular log</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Ignore-duplicate-events-while-indexing-for-circular-log/m-p/182752#M36620</link>
      <description>&lt;P&gt;Hello, as far as I know, that is how it works. &lt;BR /&gt;
So, if you are an Enterprise customer, you should file a P4 case with your feature request.&lt;/P&gt;</description>
      <pubDate>Thu, 19 Dec 2013 09:21:21 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Ignore-duplicate-events-while-indexing-for-circular-log/m-p/182752#M36620</guid>
      <dc:creator>gfuente</dc:creator>
      <dc:date>2013-12-19T09:21:21Z</dc:date>
    </item>
  </channel>
</rss>

