<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: What can be done when the parsing and aggregation queues are filled up? in Monitoring Splunk</title>
    <link>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462326#M8159</link>
    <description>&lt;P&gt;Probably those two indexers are receiving more data from a particular host that has parsing or aggregation issues. If all the indexers and their configurations are identical, then ideally the behavior should be the same. Do all indexers have similar specifications?&lt;/P&gt;</description>
    <pubDate>Wed, 05 Feb 2020 16:19:40 GMT</pubDate>
    <dc:creator>Vijeta</dc:creator>
    <dc:date>2020-02-05T16:19:40Z</dc:date>
    <item>
      <title>What can be done when the parsing and aggregation queues are filled up?</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462319#M8152</link>
      <description>&lt;P&gt;One out of the eight indexers has had two queues filled up for a couple of hours: the parsing and aggregation queues. What can be done besides waiting for them to clear? I believe we use the default queue sizes, which are relatively small...&lt;/P&gt;

&lt;P&gt;&lt;CODE&gt;index=_internal host=&amp;lt;indexer name&amp;gt; "ERROR" sourcetype=splunkd&lt;/CODE&gt; doesn't show much besides communication errors.&lt;/P&gt;</description>
      <pubDate>Wed, 05 Feb 2020 15:32:29 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462319#M8152</guid>
      <dc:creator>danielbb</dc:creator>
      <dc:date>2020-02-05T15:32:29Z</dc:date>
    </item>
    <item>
      <title>Re: What can be done when the parsing and aggregation queues are filled up?</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462320#M8153</link>
      <description>&lt;P&gt;If the parsing and aggregation queues stay blocked for a long time, they will start blocking your splunktcpin and tcpin queues, and if you are not using &lt;CODE&gt;useACK&lt;/CODE&gt; on the forwarder, there is a possibility that you'll lose data over the network.&lt;/P&gt;</description>
      <pubDate>Wed, 05 Feb 2020 16:03:26 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462320#M8153</guid>
      <dc:creator>harsmarvania57</dc:creator>
      <dc:date>2020-02-05T16:03:26Z</dc:date>
    </item>
    <item>
      <title>Re: What can be done when the parsing and aggregation queues are filled up?</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462321#M8154</link>
      <description>&lt;P&gt;@danielbb You can go to Monitoring Console -&amp;gt; Indexing -&amp;gt; Input -&amp;gt; Data Quality and look for any parsing or aggregation issues (line breaks, timestamps, etc.). If they are numerous, try fixing that source type; it should help reduce your parsing and aggregation queues.&lt;/P&gt;</description>
      <pubDate>Wed, 05 Feb 2020 16:04:45 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462321#M8154</guid>
      <dc:creator>Vijeta</dc:creator>
      <dc:date>2020-02-05T16:04:45Z</dc:date>
    </item>
    <item>
      <title>Re: What can be done when the parsing and aggregation queues are filled up?</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462322#M8155</link>
      <description>&lt;P&gt;If that's the case, then why does it appear only on one indexer?&lt;/P&gt;</description>
      <pubDate>Wed, 05 Feb 2020 16:07:04 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462322#M8155</guid>
      <dc:creator>danielbb</dc:creator>
      <dc:date>2020-02-05T16:07:04Z</dc:date>
    </item>
    <item>
      <title>Re: What can be done when the parsing and aggregation queues are filled up?</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462323#M8156</link>
      <description>&lt;P&gt;Thank you - how can I clear these two queues? &lt;/P&gt;</description>
      <pubDate>Wed, 05 Feb 2020 16:07:57 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462323#M8156</guid>
      <dc:creator>danielbb</dc:creator>
      <dc:date>2020-02-05T16:07:57Z</dc:date>
    </item>
    <item>
      <title>Re: What can be done when the parsing and aggregation queues are filled up?</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462324#M8157</link>
      <description>&lt;P&gt;I checked, and there are no such issues for this indexer.&lt;/P&gt;</description>
      <pubDate>Wed, 05 Feb 2020 16:14:33 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462324#M8157</guid>
      <dc:creator>danielbb</dc:creator>
      <dc:date>2020-02-05T16:14:33Z</dc:date>
    </item>
    <item>
      <title>Re: What can be done when the parsing and aggregation queues are filled up?</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462325#M8158</link>
      <description>&lt;P&gt;In your case it looks like the aggregation queue is full, and due to back pressure the parsing queue is also filling up. Have a look at &lt;A href="https://wiki.splunk.com/Community:HowIndexingWorks"&gt;https://wiki.splunk.com/Community:HowIndexingWorks&lt;/A&gt;: line merging and timestamp parsing happen in the aggregation queue. I'd suggest defining &lt;CODE&gt;TIME_FORMAT&lt;/CODE&gt; for as many of your logs as you can so that Splunk can parse timestamps quickly.&lt;/P&gt;

&lt;P&gt;Also, can you please let us know whether the typing queue and indexing queue were also full at the same time on that indexer?&lt;/P&gt;</description>
      <pubDate>Wed, 05 Feb 2020 16:15:48 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462325#M8158</guid>
      <dc:creator>harsmarvania57</dc:creator>
      <dc:date>2020-02-05T16:15:48Z</dc:date>
    </item>
    <item>
      <title>Re: What can be done when the parsing and aggregation queues are filled up?</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462326#M8159</link>
      <description>&lt;P&gt;probably those 2 indexers receiving more data from a particular host most of the time that has parsing or aggregation issues. If all the indexers and configurations are identical, then ideally it should have been same. Do all indexers have similar specifications? &lt;/P&gt;</description>
      <pubDate>Wed, 05 Feb 2020 16:19:40 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462326#M8159</guid>
      <dc:creator>Vijeta</dc:creator>
      <dc:date>2020-02-05T16:19:40Z</dc:date>
    </item>
    <item>
      <title>Re: What can be done when the parsing and aggregation queues are filled up?</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462327#M8160</link>
      <description>&lt;P&gt;All the indexers are the same - what's the query to find parsing issues?&lt;/P&gt;</description>
      <pubDate>Wed, 05 Feb 2020 16:21:22 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462327#M8160</guid>
      <dc:creator>danielbb</dc:creator>
      <dc:date>2020-02-05T16:21:22Z</dc:date>
    </item>
    <item>
      <title>Re: What can be done when the parsing and aggregation queues are filled up?</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462328#M8161</link>
      <description>&lt;P&gt;Ok, do you know how to detect parsing issues recorded in &lt;CODE&gt;_internal&lt;/CODE&gt;? &lt;/P&gt;</description>
      <pubDate>Wed, 05 Feb 2020 16:22:12 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462328#M8161</guid>
      <dc:creator>danielbb</dc:creator>
      <dc:date>2020-02-05T16:22:12Z</dc:date>
    </item>
    <item>
      <title>Re: What can be done when the parsing and aggregation queues are filled up?</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462329#M8162</link>
      <description>&lt;P&gt;If you go to the Monitoring Console -&amp;gt; Indexing -&amp;gt; Input -&amp;gt; Data Quality, you can see the list of source types with the columns &lt;CODE&gt;Sourcetype, Total Issues, Host Count, Source Count, Line Breaking Issues, Timestamp Parsing Issues, Aggregation Issues&lt;/CODE&gt;.&lt;BR /&gt;
You can click the row with the highest count of Line Breaking Issues to get detailed information from the logs; similarly, you can click on the Timestamp Parsing Issues count.&lt;/P&gt;</description>
      <pubDate>Wed, 05 Feb 2020 16:32:14 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462329#M8162</guid>
      <dc:creator>Vijeta</dc:creator>
      <dc:date>2020-02-05T16:32:14Z</dc:date>
    </item>
    <item>
      <title>Re: What can be done when the parsing and aggregation queues are filled up?</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462330#M8163</link>
      <description>&lt;P&gt;Try the query below to find timestamp parsing issues.&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index=_internal host=&amp;lt;your Indexer&amp;gt; source="/opt/splunk/var/log/splunk/splunkd.log" component=DateParserVerbose
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;Even if you don't have timestamp parsing issues, I'd suggest configuring &lt;CODE&gt;TIME_FORMAT&lt;/CODE&gt; for the sources that ingest the most data, so that Splunk does not need to try different time formats for your logs.&lt;/P&gt;</description>
      <pubDate>Wed, 05 Feb 2020 16:35:18 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462330#M8163</guid>
      <dc:creator>harsmarvania57</dc:creator>
      <dc:date>2020-02-05T16:35:18Z</dc:date>
    </item>
    <item>
      <title>Re: What can be done when the parsing and aggregation queues are filled up?</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462331#M8164</link>
      <description>&lt;P&gt;The issue was fixed by changing &lt;CODE&gt;MAX_TIMESTAMP_LOOKAHEAD&lt;/CODE&gt; from 23 to 35; the timestamp was 30 characters long.&lt;/P&gt;</description>
      <pubDate>Wed, 12 Feb 2020 02:39:34 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/462331#M8164</guid>
      <dc:creator>danielbb</dc:creator>
      <dc:date>2020-02-12T02:39:34Z</dc:date>
    </item>
    <item>
      <title>Re: What can be done when the parsing and aggregation queues are filled up?</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/613958#M9159</link>
      <description>&lt;P&gt;I have enhanced the search to provide more details.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;index=_internal host=&amp;lt;your Indexer&amp;gt; source="/opt/splunk/var/log/splunk/splunkd.log" component=DateParserVerbose 
| rex "Context: source=(?P&amp;lt;sourcetypeissue&amp;gt;\w+)\Shost=(?P&amp;lt;sourcehost&amp;gt;\w+)" 
| stats list(sourcetypeissue) as file_name list(sourcehost)&lt;/LI-CODE&gt;</description>
      <pubDate>Wed, 21 Sep 2022 15:42:18 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/What-can-be-done-when-the-parsing-and-aggregation-queues-are/m-p/613958#M9159</guid>
      <dc:creator>youngsuh</dc:creator>
      <dc:date>2022-09-21T15:42:18Z</dc:date>
    </item>
  </channel>
</rss>

