<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic JSON events duplicate extractions in Getting Data In</title>
    <link>https://community.splunk.com/t5/Getting-Data-In/JSON-events-duplicate-extractions/m-p/579416#M102290</link>
<description>&lt;P&gt;I have a log file that I'm monitoring with a universal forwarder (UF), and I'm applying props.conf settings on the UF for that input. I'm using the following settings:&lt;/P&gt;&lt;P&gt;UF - props.conf:&lt;/P&gt;&lt;P&gt;[my_sourcetype]&lt;/P&gt;&lt;P&gt;INDEXED_EXTRACTIONS = JSON&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Search head cluster (via deployer in an app bundle) props.conf:&lt;/P&gt;&lt;P&gt;[my_sourcetype]&lt;/P&gt;&lt;P&gt;KV_MODE = NONE&lt;BR /&gt;AUTO_KV_JSON = FALSE&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;If I run btool on one of the search heads for that sourcetype I get:&lt;BR /&gt;ADD_EXTRA_TIME_FIELDS = True&lt;BR /&gt;ANNOTATE_PUNCT = True&lt;BR /&gt;AUTO_KV_JSON = FALSE&lt;BR /&gt;BREAK_ONLY_BEFORE =&lt;BR /&gt;BREAK_ONLY_BEFORE_DATE = True&lt;BR /&gt;CHARSET = UTF-8&lt;BR /&gt;DATETIME_CONFIG = /etc/datetime.xml&lt;BR /&gt;DEPTH_LIMIT = 1000&lt;BR /&gt;DETERMINE_TIMESTAMP_DATE_WITH_SYSTEM_TIME = false&lt;BR /&gt;HEADER_MODE =&lt;BR /&gt;KV_MODE = NONE&lt;BR /&gt;LB_CHUNK_BREAKER_TRUNCATE = 2000000&lt;BR /&gt;LEARN_MODEL = true&lt;BR /&gt;LEARN_SOURCETYPE = true&lt;BR /&gt;LINE_BREAKER_LOOKBEHIND = 100&lt;BR /&gt;MATCH_LIMIT = 100000&lt;BR /&gt;MAX_DAYS_AGO = 2000&lt;BR /&gt;MAX_DAYS_HENCE = 2&lt;BR /&gt;MAX_DIFF_SECS_AGO = 3600&lt;BR /&gt;MAX_DIFF_SECS_HENCE = 604800&lt;BR /&gt;MAX_EVENTS = 256&lt;BR /&gt;MAX_TIMESTAMP_LOOKAHEAD = 128&lt;BR /&gt;MUST_BREAK_AFTER =&lt;BR /&gt;MUST_NOT_BREAK_AFTER =&lt;BR /&gt;MUST_NOT_BREAK_BEFORE =&lt;BR /&gt;SEGMENTATION = indexing&lt;BR /&gt;SEGMENTATION-all = full&lt;BR /&gt;SEGMENTATION-inner = inner&lt;BR /&gt;SEGMENTATION-outer = outer&lt;BR /&gt;SEGMENTATION-raw = none&lt;BR /&gt;SEGMENTATION-standard = standard&lt;BR /&gt;SHOULD_LINEMERGE = True&lt;BR /&gt;TRANSFORMS =&lt;BR /&gt;TRUNCATE = 10000&lt;BR /&gt;detect_trailing_nulls = false&lt;BR /&gt;maxDist = 100&lt;BR /&gt;priority =&lt;BR /&gt;sourcetype =&lt;BR /&gt;termFrequencyWeightedDist = false&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I'm not sure what I'm missing, but I can't get the duplicate field values to stop appearing.&lt;/P&gt;</description>
    <pubDate>Tue, 28 Dec 2021 22:10:27 GMT</pubDate>
    <dc:creator>dtow1</dc:creator>
    <dc:date>2021-12-28T22:10:27Z</dc:date>
    <item>
      <title>JSON events duplicate extractions</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/JSON-events-duplicate-extractions/m-p/579416#M102290</link>
<description>&lt;P&gt;I have a log file that I'm monitoring with a universal forwarder (UF), and I'm applying props.conf settings on the UF for that input. I'm using the following settings:&lt;/P&gt;&lt;P&gt;UF - props.conf:&lt;/P&gt;&lt;P&gt;[my_sourcetype]&lt;/P&gt;&lt;P&gt;INDEXED_EXTRACTIONS = JSON&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Search head cluster (via deployer in an app bundle) props.conf:&lt;/P&gt;&lt;P&gt;[my_sourcetype]&lt;/P&gt;&lt;P&gt;KV_MODE = NONE&lt;BR /&gt;AUTO_KV_JSON = FALSE&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;If I run btool on one of the search heads for that sourcetype I get:&lt;BR /&gt;ADD_EXTRA_TIME_FIELDS = True&lt;BR /&gt;ANNOTATE_PUNCT = True&lt;BR /&gt;AUTO_KV_JSON = FALSE&lt;BR /&gt;BREAK_ONLY_BEFORE =&lt;BR /&gt;BREAK_ONLY_BEFORE_DATE = True&lt;BR /&gt;CHARSET = UTF-8&lt;BR /&gt;DATETIME_CONFIG = /etc/datetime.xml&lt;BR /&gt;DEPTH_LIMIT = 1000&lt;BR /&gt;DETERMINE_TIMESTAMP_DATE_WITH_SYSTEM_TIME = false&lt;BR /&gt;HEADER_MODE =&lt;BR /&gt;KV_MODE = NONE&lt;BR /&gt;LB_CHUNK_BREAKER_TRUNCATE = 2000000&lt;BR /&gt;LEARN_MODEL = true&lt;BR /&gt;LEARN_SOURCETYPE = true&lt;BR /&gt;LINE_BREAKER_LOOKBEHIND = 100&lt;BR /&gt;MATCH_LIMIT = 100000&lt;BR /&gt;MAX_DAYS_AGO = 2000&lt;BR /&gt;MAX_DAYS_HENCE = 2&lt;BR /&gt;MAX_DIFF_SECS_AGO = 3600&lt;BR /&gt;MAX_DIFF_SECS_HENCE = 604800&lt;BR /&gt;MAX_EVENTS = 256&lt;BR /&gt;MAX_TIMESTAMP_LOOKAHEAD = 128&lt;BR /&gt;MUST_BREAK_AFTER =&lt;BR /&gt;MUST_NOT_BREAK_AFTER =&lt;BR /&gt;MUST_NOT_BREAK_BEFORE =&lt;BR /&gt;SEGMENTATION = indexing&lt;BR /&gt;SEGMENTATION-all = full&lt;BR /&gt;SEGMENTATION-inner = inner&lt;BR /&gt;SEGMENTATION-outer = outer&lt;BR /&gt;SEGMENTATION-raw = none&lt;BR /&gt;SEGMENTATION-standard = standard&lt;BR /&gt;SHOULD_LINEMERGE = True&lt;BR /&gt;TRANSFORMS =&lt;BR /&gt;TRUNCATE = 10000&lt;BR /&gt;detect_trailing_nulls = false&lt;BR /&gt;maxDist = 100&lt;BR /&gt;priority =&lt;BR /&gt;sourcetype =&lt;BR /&gt;termFrequencyWeightedDist = false&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I'm not sure what I'm missing, but I can't get the duplicate field values to stop appearing.&lt;/P&gt;</description>
      <pubDate>Tue, 28 Dec 2021 22:10:27 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/JSON-events-duplicate-extractions/m-p/579416#M102290</guid>
      <dc:creator>dtow1</dc:creator>
      <dc:date>2021-12-28T22:10:27Z</dc:date>
    </item>
  </channel>
</rss>

