<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: TailingProcessor - Could not send data to output queue (parsingQueue), retrying... in Getting Data In</title>
    <link>https://community.splunk.com/t5/Getting-Data-In/TailingProcessor-Could-not-send-data-to-output-queue/m-p/64154#M12892</link>
    <description>&lt;P&gt;if you run:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index=_internal source=*metrics.log* group=queue | timechart perc95(current_size) by name
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;what kind of numbers do you see? If you use a heat map, what colors do you see for each of the queues?&lt;/P&gt;

&lt;P&gt;Lastly, can you confirm that the indexer is in the same timezone as the log timestamps?&lt;/P&gt;</description>
    <pubDate>Fri, 08 Oct 2010 13:22:18 GMT</pubDate>
    <dc:creator>Genti</dc:creator>
    <dc:date>2010-10-08T13:22:18Z</dc:date>
    <item>
      <title>TailingProcessor - Could not send data to output queue (parsingQueue), retrying...</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/TailingProcessor-Could-not-send-data-to-output-queue/m-p/64151#M12889</link>
      <description>&lt;P&gt;Recently, I've begun noticing that one of our lightweight forwarders is not sending data that we expect to see on the indexer (4.1.4 Linux 64-bit).&lt;/P&gt;

&lt;P&gt;Looking at the LWF (4.1.4 on HPUX 11.31 IA64), I see the following in the logs:&lt;/P&gt;

&lt;P&gt;10-05-2010 12:36:27.189 INFO  TailingProcessor - Could not send data to output queue (parsingQueue), retrying...&lt;BR /&gt;
10-05-2010 12:36:27.189 INFO  TailingProcessor -   ...continuing.&lt;/P&gt;

&lt;P&gt;I had a look at&lt;/P&gt;

&lt;P&gt;&lt;A href="http://answers.splunk.com/questions/5590/could-not-send-data-to-the-output-queue" rel="nofollow"&gt;http://answers.splunk.com/questions/5590/could-not-send-data-to-the-output-queue&lt;/A&gt;&lt;/P&gt;

&lt;P&gt;but that doesn't quite seem to be my problem unless I'm missing something.&lt;/P&gt;

&lt;P&gt;If I do "grep blocked=true var/log/splunk/metrics.log*" I get a few matches, but nothing current.  I certainly see plenty of INFO events from the metrics log showing up on the indexer.&lt;/P&gt;

&lt;P&gt;So if I do a "splunk list monitor" I indeed see the file I expect to be monitored in the list, but I don't see any events showing up on the indexer itself.&lt;/P&gt;

&lt;P&gt;What am I missing (other than my monitored files &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt; )?&lt;/P&gt;

&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Tue, 05 Oct 2010 23:53:26 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/TailingProcessor-Could-not-send-data-to-output-queue/m-p/64151#M12889</guid>
      <dc:creator>mfrost8</dc:creator>
      <dc:date>2010-10-05T23:53:26Z</dc:date>
    </item>
    <item>
      <title>Re: TailingProcessor - Could not send data to output queue (parsingQueue), retrying...</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/TailingProcessor-Could-not-send-data-to-output-queue/m-p/64152#M12890</link>
      <description>&lt;P&gt;So, Per that answer page, are you only seeing blocked=true on the forwarder or indexer as well?&lt;BR /&gt;&lt;BR /&gt;
What are the queues that are getting blocked?&lt;/P&gt;</description>
      <pubDate>Wed, 06 Oct 2010 00:10:19 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/TailingProcessor-Could-not-send-data-to-output-queue/m-p/64152#M12890</guid>
      <dc:creator>Genti</dc:creator>
      <dc:date>2010-10-06T00:10:19Z</dc:date>
    </item>
    <item>
      <title>Re: TailingProcessor - Could not send data to output queue (parsingQueue), retrying...</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/TailingProcessor-Could-not-send-data-to-output-queue/m-p/64153#M12891</link>
      <description>&lt;P&gt;The LWF's most recent "blocked=true" message is from 10/1 on the parsingqueue.  The indexer has quite a few of them for several queues: aggqueue, indexqueue and typingqueue.  83 in total over the last few days on the indexer.  While I'd be the first to admit that the indexer is a woefully underpowered piece of hardware (we're working to replace it), it's still only being asked to index less than 1GB/day.  Events from this particular LWF show up eventually, but as of right now, the most recent events are from noon today.&lt;/P&gt;</description>
      <pubDate>Wed, 06 Oct 2010 07:03:10 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/TailingProcessor-Could-not-send-data-to-output-queue/m-p/64153#M12891</guid>
      <dc:creator>mfrost8</dc:creator>
      <dc:date>2010-10-06T07:03:10Z</dc:date>
    </item>
    <item>
      <title>Re: TailingProcessor - Could not send data to output queue (parsingQueue), retrying...</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/TailingProcessor-Could-not-send-data-to-output-queue/m-p/64154#M12892</link>
      <description>&lt;P&gt;if you run:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index=_internal source=*metrics.log* group=queue | timechart perc95(current_size) by name
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;what kind of numbers do you see? If you use a heat map, what colors do you see for each of the queues?&lt;/P&gt;

&lt;P&gt;Lastly, can you confirm that the indexer is in the same timezone as the log timestamps?&lt;/P&gt;</description>
      <pubDate>Fri, 08 Oct 2010 13:22:18 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/TailingProcessor-Could-not-send-data-to-output-queue/m-p/64154#M12892</guid>
      <dc:creator>Genti</dc:creator>
      <dc:date>2010-10-08T13:22:18Z</dc:date>
    </item>
    <item>
      <title>Re: TailingProcessor - Could not send data to output queue (parsingQueue), retrying...</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/TailingProcessor-Could-not-send-data-to-output-queue/m-p/64155#M12893</link>
      <description>&lt;P&gt;I'm embarrassed to say that this was my own error.  I hadn't noticed the error in the logs that indirectly was telling me that there was a problem with my outputs.conf.  I had ruled that out early given that splunk log events were clearly going back to the indexer, but my local app events were not.  But it was ultimately a bad outputs.conf file that slipped in there.&lt;/P&gt;

&lt;P&gt;The symptom I originally posted about was not the real culprit.  Sorry for the confusion and thanks to those who tried to help me out.&lt;/P&gt;</description>
      <pubDate>Tue, 12 Oct 2010 04:15:06 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/TailingProcessor-Could-not-send-data-to-output-queue/m-p/64155#M12893</guid>
      <dc:creator>mfrost8</dc:creator>
      <dc:date>2010-10-12T04:15:06Z</dc:date>
    </item>
    <item>
      <title>Re: TailingProcessor - Could not send data to output queue (parsingQueue), retrying...</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/TailingProcessor-Could-not-send-data-to-output-queue/m-p/64156#M12894</link>
      <description>&lt;P&gt;Could you please share what you had and what you have on outputs.conf? It will be very good to share the possible misconfiguration resolution...thanks a lot, cheers!&lt;/P&gt;</description>
      <pubDate>Wed, 18 Sep 2013 16:37:04 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/TailingProcessor-Could-not-send-data-to-output-queue/m-p/64156#M12894</guid>
      <dc:creator>wagnerbianchi</dc:creator>
      <dc:date>2013-09-18T16:37:04Z</dc:date>
    </item>
  </channel>
</rss>

