<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Receiving &quot;blocked=true messages&quot; on Splunk instance in Splunk Search</title>
    <link>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469158#M192018</link>
    <description>&lt;PRE&gt;&lt;CODE&gt;Queue messages
Queue messages look like

... group=queue, name=parsingqueue, max_size=1000, filled_count=0, empty_count=8, current_size=0, largest_size=2, smallest_size=0

Most of these values are not interesting, but current_size, especially considered in aggregate across events, can tell you which portions of Splunk indexing are the bottlenecks. If current_size remains near zero, the indexing system is probably not being taxed in any way. If the queues remain near 1000, then more data is being fed into the system (at the time) than it can process in total.

Sometimes you will see messages such as ... group=queue, name=parsingqueue, blocked=true, max_size=1000, filled_count=0, empty_count=8, current_size=0, largest_size=2, smallest_size=0

This message contains the blocked string, indicating that the queue was full and an attempt to add more data failed. A queue becomes unblocked as soon as the code draining it pulls an item out. Many blocked queue messages in a sequence indicate that data is not flowing at all for some reason. A few scattered blocked messages indicate that flow control is operating, which is normal for a busy indexer.

If you want to look at the queue data in aggregate, graphing the average of current_size is probably a good starting point.

There are queues in place for data going into the parsing pipeline, and for data between parsing and indexing. Each networking output also has its own queue, which can be useful for determining whether data is being sent promptly or whether some network or receiving-system limitation exists.
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;These messages appear because the queue metric has reached its maximum size (for example, max_size_kb=500).&lt;/P&gt;</description>
    <pubDate>Sat, 08 Feb 2020 01:12:14 GMT</pubDate>
    <dc:creator>to4kawa</dc:creator>
    <dc:date>2020-02-08T01:12:14Z</dc:date>
    <item>
      <title>Receiving "blocked=true messages" on Splunk instance</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469157#M192017</link>
      <description>&lt;P&gt;I noticed on my splunk instance that I am getting messages like these:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;02-07-2020 15:20:36.038 -0500 INFO  Metrics - group=queue, name=typingqueue, blocked=true, max_size_kb=500, current_size_kb=499, current_size=993, largest_size=993, smallest_size=993
02-07-2020 15:21:35.038 -0500 INFO  Metrics - group=queue, name=aggqueue, blocked=true, max_size_kb=1024, current_size_kb=1023, current_size=2035, largest_size=2035, smallest_size=2035
02-07-2020 15:21:35.038 -0500 INFO  Metrics - group=queue, name=auditqueue, blocked=true, max_size_kb=500, current_size_kb=499, current_size=809, largest_size=809, smallest_size=809
02-07-2020 15:21:35.038 -0500 INFO  Metrics - group=queue, name=indexqueue, blocked=true, max_size_kb=500, current_size_kb=499, current_size=998, largest_size=998, smallest_size=998
02-07-2020 15:21:35.038 -0500 INFO  Metrics - group=queue, name=parsingqueue, blocked=true, max_size_kb=6144, current_size_kb=6143, current_size=99, largest_size=99, smallest_size=99
02-07-2020 15:21:35.038 -0500 INFO  Metrics - group=queue, name=splunktcpin, blocked=true, max_size_kb=500, current_size_kb=499, current_size=995, largest_size=995, smallest_size=995
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;How can I resolve this? &lt;/P&gt;

&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="alt text"&gt;&lt;img src="https://community.splunk.com/t5/image/serverpage/image-id/8376iCDA8873996364C11/image-size/large?v=v2&amp;amp;px=999" role="button" title="alt text" alt="alt text" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 07 Feb 2020 20:23:47 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469157#M192017</guid>
      <dc:creator>user789</dc:creator>
      <dc:date>2020-02-07T20:23:47Z</dc:date>
    </item>
    <item>
      <title>Re: Receiving "blocked=true messages" on Splunk instance</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469158#M192018</link>
      <description>&lt;PRE&gt;&lt;CODE&gt;Queue messages
Queue messages look like

... group=queue, name=parsingqueue, max_size=1000, filled_count=0, empty_count=8, current_size=0, largest_size=2, smallest_size=0

Most of these values are not interesting, but current_size, especially considered in aggregate across events, can tell you which portions of Splunk indexing are the bottlenecks. If current_size remains near zero, the indexing system is probably not being taxed in any way. If the queues remain near 1000, then more data is being fed into the system (at the time) than it can process in total.

Sometimes you will see messages such as ... group=queue, name=parsingqueue, blocked=true, max_size=1000, filled_count=0, empty_count=8, current_size=0, largest_size=2, smallest_size=0

This message contains the blocked string, indicating that the queue was full and an attempt to add more data failed. A queue becomes unblocked as soon as the code draining it pulls an item out. Many blocked queue messages in a sequence indicate that data is not flowing at all for some reason. A few scattered blocked messages indicate that flow control is operating, which is normal for a busy indexer.

If you want to look at the queue data in aggregate, graphing the average of current_size is probably a good starting point.

There are queues in place for data going into the parsing pipeline, and for data between parsing and indexing. Each networking output also has its own queue, which can be useful for determining whether data is being sent promptly or whether some network or receiving-system limitation exists.
&lt;/CODE&gt;&lt;/PRE&gt;
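&lt;P&gt;If you want to compute that aggregate outside Splunk, here is a minimal Python sketch that averages current_size per queue (the log lines below are invented examples; inside Splunk a stats or timechart search does the same job):&lt;/P&gt;

```python
import re

# Minimal sketch: average current_size per queue across metrics.log events.
# The lines are made-up samples in the same shape as the ones quoted above.
lines = [
    "... group=queue, name=parsingqueue, max_size=1000, current_size=0",
    "... group=queue, name=parsingqueue, max_size=1000, current_size=2",
    "... group=queue, name=indexqueue, max_size=500, current_size=998",
]
totals = {}
for line in lines:
    name = re.search(r"name=(\w+)", line).group(1)
    size = int(re.search(r"current_size=(\d+)", line).group(1))
    count_so_far, sum_so_far = totals.get(name, (0, 0))
    totals[name] = (count_so_far + 1, sum_so_far + size)
for name, (count, total) in totals.items():
    # parsingqueue averages 1.0; indexqueue averages 998.0
    print(name, total / count)
```

A queue whose average sits near its max_size is the one to investigate first.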

&lt;P&gt;These messages appear because the queue metric has reached its maximum size (for example, max_size_kb=500).&lt;/P&gt;</description>
      <pubDate>Sat, 08 Feb 2020 01:12:14 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469158#M192018</guid>
      <dc:creator>to4kawa</dc:creator>
      <dc:date>2020-02-08T01:12:14Z</dc:date>
    </item>
    <item>
      <title>Re: Receiving "blocked=true messages" on Splunk instance</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469159#M192019</link>
      <description>&lt;P&gt;How would I graph that? &lt;/P&gt;</description>
      <pubDate>Mon, 10 Feb 2020 12:55:10 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469159#M192019</guid>
      <dc:creator>user789</dc:creator>
      <dc:date>2020-02-10T12:55:10Z</dc:date>
    </item>
    <item>
      <title>Re: Receiving "blocked=true messages" on Splunk instance</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469160#M192020</link>
      <description>&lt;PRE&gt;&lt;CODE&gt;your_log
| extract pairdelim="," kvdelim="="
|table _time current_size_kb
&lt;/CODE&gt;&lt;/PRE&gt;
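&lt;P&gt;For reference, here is a minimal Python sketch of what that extract step does, splitting a metrics.log line on "," and "=" (the sample line is taken from the question):&lt;/P&gt;

```python
# Hypothetical helper mirroring SPL's extract pairdelim="," kvdelim="=":
# split the key=value section of a metrics.log queue line into a dict.
def parse_queue_metrics(line):
    # Keep only the key=value portion after the "Metrics - " marker.
    _, _, kv_part = line.partition("Metrics - ")
    fields = {}
    for pair in kv_part.split(","):
        key, sep, value = pair.strip().partition("=")
        if sep:
            fields[key] = value
    return fields

line = ("02-07-2020 15:20:36.038 -0500 INFO  Metrics - group=queue, "
        "name=typingqueue, blocked=true, max_size_kb=500, current_size_kb=499")
print(parse_queue_metrics(line)["current_size_kb"])  # prints 499
```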

&lt;P&gt;Then, under Visualization, choose Line Chart.&lt;/P&gt;</description>
      <pubDate>Mon, 10 Feb 2020 13:01:33 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469160#M192020</guid>
      <dc:creator>to4kawa</dc:creator>
      <dc:date>2020-02-10T13:01:33Z</dc:date>
    </item>
    <item>
      <title>Re: Receiving "blocked=true messages" on Splunk instance</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469161#M192021</link>
      <description>&lt;P&gt;Are there any good tutorials on this?  I am new to Splunk.  Thanks. &lt;/P&gt;</description>
      <pubDate>Mon, 10 Feb 2020 13:22:14 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469161#M192021</guid>
      <dc:creator>user789</dc:creator>
      <dc:date>2020-02-10T13:22:14Z</dc:date>
    </item>
    <item>
      <title>Re: Receiving "blocked=true messages" on Splunk instance</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469162#M192022</link>
      <description>&lt;P&gt;&lt;A href="https://docs.splunk.com/Documentation/Splunk/8.0.1/SearchTutorial/WelcometotheSearchTutorial"&gt;WelcometotheSearchTutorial&lt;/A&gt;&lt;/P&gt;

&lt;P&gt;Is this what you're looking for?&lt;/P&gt;</description>
      <pubDate>Mon, 10 Feb 2020 13:37:35 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469162#M192022</guid>
      <dc:creator>to4kawa</dc:creator>
      <dc:date>2020-02-10T13:37:35Z</dc:date>
    </item>
    <item>
      <title>Re: Receiving "blocked=true messages" on Splunk instance</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469163#M192023</link>
      <description>&lt;P&gt;I've attached an image - I don't have many hosts reporting to this server, and it looks like I have plenty of ram, even though it's not the recommended number. &lt;/P&gt;</description>
      <pubDate>Mon, 10 Feb 2020 14:51:23 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469163#M192023</guid>
      <dc:creator>user789</dc:creator>
      <dc:date>2020-02-10T14:51:23Z</dc:date>
    </item>
    <item>
      <title>Re: Receiving "blocked=true messages" on Splunk instance</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469164#M192024</link>
      <description>&lt;P&gt;Why not talk to your Splunk sales vendor?&lt;/P&gt;</description>
      <pubDate>Mon, 10 Feb 2020 20:09:56 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469164#M192024</guid>
      <dc:creator>to4kawa</dc:creator>
      <dc:date>2020-02-10T20:09:56Z</dc:date>
    </item>
    <item>
      <title>Re: Receiving "blocked=true messages" on Splunk instance</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469165#M192025</link>
      <description>&lt;P&gt;Based on your screenshot, you have multiple compounding issues.&lt;/P&gt;

&lt;P&gt;You need to disable Transparent Huge Pages:&lt;BR /&gt;
&lt;A href="https://docs.splunk.com/Documentation/Splunk/8.0.1/ReleaseNotes/SplunkandTHP"&gt;https://docs.splunk.com/Documentation/Splunk/8.0.1/ReleaseNotes/SplunkandTHP&lt;/A&gt;&lt;/P&gt;

&lt;P&gt;Your ulimits are not set correctly and need to be increased:&lt;BR /&gt;
&lt;A href="https://docs.splunk.com/Documentation/Splunk/8.0.1/Troubleshooting/ulimitErrors#Set_limits_using_the_.2Fetc.2Fsystemd_configuration_files"&gt;https://docs.splunk.com/Documentation/Splunk/8.0.1/Troubleshooting/ulimitErrors#Set_limits_using_the_.2Fetc.2Fsystemd_configuration_files&lt;/A&gt;&lt;/P&gt;

&lt;P&gt;Your system resources are below the recommendation, which usually means you're running on VMware.&lt;/P&gt;

&lt;P&gt;If correcting the first two issues does not ease the congestion, you may want to consider increasing the parallel ingestion pipelines.&lt;BR /&gt;
&lt;A href="https://docs.splunk.com/Documentation/Splunk/8.0.1/Indexer/Pipelinesets"&gt;https://docs.splunk.com/Documentation/Splunk/8.0.1/Indexer/Pipelinesets&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 11 Feb 2020 18:16:42 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469165#M192025</guid>
      <dc:creator>codebuilder</dc:creator>
      <dc:date>2020-02-11T18:16:42Z</dc:date>
    </item>
    <item>
      <title>Re: Receiving "blocked=true messages" on Splunk instance</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469166#M192026</link>
      <description>&lt;P&gt;None of these seemed to fix it.  I am running on AWS, and it is a c4.4xlarge.&lt;BR /&gt;
As for the ulimit, the file did not exist, so I created it, and added that text to the file. &lt;/P&gt;</description>
      <pubDate>Tue, 18 Feb 2020 20:05:32 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469166#M192026</guid>
      <dc:creator>user789</dc:creator>
      <dc:date>2020-02-18T20:05:32Z</dc:date>
    </item>
    <item>
      <title>Re: Receiving "blocked=true messages" on Splunk instance</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469167#M192027</link>
      <description>&lt;P&gt;I noticed under netstat -tulpn that port 9997 is not listening, even though it is defined under Settings -&amp;gt; Receive data. I disabled the receiver (which failed), then received a similar error when re-enabling: &lt;BR /&gt;
    Error occurred attempting to enable 9997: .&lt;/P&gt;</description>
      <pubDate>Tue, 18 Feb 2020 20:18:59 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Receiving-quot-blocked-true-messages-quot-on-Splunk-instance/m-p/469167#M192027</guid>
      <dc:creator>user789</dc:creator>
      <dc:date>2020-02-18T20:18:59Z</dc:date>
    </item>
  </channel>
</rss>

