<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Need Help Troubleshooting Poor SplunkWeb Performance in Monitoring Splunk</title>
    <link>https://community.splunk.com/t5/Monitoring-Splunk/Need-Help-Troubleshooting-Poor-SplunkWeb-Performance/m-p/56627#M627</link>
    <description>&lt;P&gt;The _internal index has a default frozen time period of about 30 days, so anything older than that is eligible to be deleted.&lt;/P&gt;</description>
    <pubDate>Thu, 23 Sep 2010 19:37:57 GMT</pubDate>
    <dc:creator>gkanapathy</dc:creator>
    <dc:date>2010-09-23T19:37:57Z</dc:date>
    <item>
      <title>Need Help Troubleshooting Poor SplunkWeb Performance</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/Need-Help-Troubleshooting-Poor-SplunkWeb-Performance/m-p/56626#M626</link>
      <description>&lt;P&gt;Hi Folks,&lt;/P&gt;

&lt;P&gt;I could use some pointers troubleshooting some Splunk Web performance issues.&lt;/P&gt;

&lt;P&gt;Over the last few weeks, our team has noticed that Splunk Web has become a little unresponsive. People would submit searches, but the UI would just sit there and do nothing, sometimes for up to a minute.&lt;/P&gt;

&lt;P&gt;So in true Splunk style, I thought I'd use Splunk itself to try and investigate.&lt;/P&gt;

&lt;P&gt;I've started with the web_service.log file, using this basic search:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index=_internal source="/opt/splunk/var/log/splunk/web_service.log" WARNING earliest=-60d | timechart count
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;This shows that from around the end of August, we've been getting over 10,000 'WARNING' messages per day. (I'd love to insert the chart, but I can't).&lt;/P&gt;

&lt;P&gt;These messages are all similar to:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;An unknown view name "xxxxxx" is referenced in the navigation definition for "search"
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;Using this search:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index=_internal source="/opt/splunk/var/log/splunk/web_service.log" WARNING earliest=-60d | rex "view:[\d]{3}\s-\s(?&amp;lt;body&amp;gt;[^$]*)" | stats count by body
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;The breakdown is as follows:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;An unknown view name "index_status" is referenced in the navigation definition for "search".
An unknown view name "indexing_volume" is referenced in the navigation definition for "search".
An unknown view name "inputs_status" is referenced in the navigation definition for "search".
An unknown view name "pdf_activity" is referenced in the navigation definition for "search".
An unknown view name "scheduler_savedsearch" is referenced in the navigation definition for "search".
An unknown view name "scheduler_status" is referenced in the navigation definition for "search".
An unknown view name "scheduler_user_app" is referenced in the navigation definition for "search".
An unknown view name "search_status" is referenced in the navigation definition for "search".
An unknown view name "splunkd_status" is referenced in the navigation definition for "search".
An unknown view name "splunkweb_status" is referenced in the navigation definition for "search".
&lt;/CODE&gt;&lt;/PRE&gt;
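
&lt;P&gt;For anyone reproducing this, here's a variation that pulls out just the view name rather than the whole message body (a rough sketch; the rex may need adjusting for your exact log format):&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index=_internal source="/opt/splunk/var/log/splunk/web_service.log" WARNING earliest=-60d | rex "An unknown view name \"(?&amp;lt;view&amp;gt;[^\"]+)\"" | stats count by view
&lt;/CODE&gt;&lt;/PRE&gt;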

&lt;P&gt;With 19,291 occurrences of each message.&lt;/P&gt;

&lt;P&gt;I tried to query my _internal index a bit more, but noticed a slight issue. Although this Splunk Server has been running since April 2010, this index doesn't have any events prior to 28th August.&lt;/P&gt;

&lt;P&gt;This is a totally weird one!&lt;/P&gt;

&lt;P&gt;The index is nowhere near the size at which events would be aged out: it's about 2GB, with a limit of 500GB.&lt;/P&gt;

&lt;P&gt;All I can think is that an upgrade (probably 4.1.3 to 4.1.4) has somehow wiped the index?!&lt;/P&gt;

&lt;P&gt;So any help on either of these points would be great.&lt;/P&gt;

&lt;P&gt;The SplunkWeb / UI one is most pressing, but thoughts on the _internal index question would be useful too!&lt;/P&gt;

&lt;P&gt;Thanks in advance,&lt;/P&gt;

&lt;P&gt;Graham.&lt;/P&gt;</description>
      <pubDate>Thu, 23 Sep 2010 16:14:04 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/Need-Help-Troubleshooting-Poor-SplunkWeb-Performance/m-p/56626#M626</guid>
      <dc:creator>gmor</dc:creator>
      <dc:date>2010-09-23T16:14:04Z</dc:date>
    </item>
    <item>
      <title>Re: Need Help Troubleshooting Poor SplunkWeb Performance</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/Need-Help-Troubleshooting-Poor-SplunkWeb-Performance/m-p/56627#M627</link>
      <description>&lt;P&gt;The _internal index has a default frozen time period of about 30 days, so anything older than that is eligible to be deleted.&lt;/P&gt;</description>
      <pubDate>Thu, 23 Sep 2010 19:37:57 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/Need-Help-Troubleshooting-Poor-SplunkWeb-Performance/m-p/56627#M627</guid>
      <dc:creator>gkanapathy</dc:creator>
      <dc:date>2010-09-23T19:37:57Z</dc:date>
    </item>
    <item>
      <title>Re: Need Help Troubleshooting Poor SplunkWeb Performance</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/Need-Help-Troubleshooting-Poor-SplunkWeb-Performance/m-p/56628#M628</link>
      <description>&lt;P&gt;Ahhh... OK, I didn't know that. That's certainly clears up the second point. Now just the performance issue to work out!&lt;/P&gt;

&lt;P&gt;Thanks very much for your reply.&lt;/P&gt;</description>
      <pubDate>Thu, 23 Sep 2010 20:57:15 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/Need-Help-Troubleshooting-Poor-SplunkWeb-Performance/m-p/56628#M628</guid>
      <dc:creator>gmor</dc:creator>
      <dc:date>2010-09-23T20:57:15Z</dc:date>
    </item>
    <item>
      <title>Re: Need Help Troubleshooting Poor SplunkWeb Performance</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/Need-Help-Troubleshooting-Poor-SplunkWeb-Performance/m-p/56629#M629</link>
      <description>&lt;P&gt;I would download and install Firebug, which is a firefox extension.   After it's installed and enabled, log into splunk (using firefox),  open firebug's panels,  switch to the 'Net' panel (you will have to enable it). &lt;/P&gt;

&lt;P&gt;The Net panel will show you the HTTP requests and responses along with the time spent in each. This will quickly give you a lot of information about which requests are hanging Splunk for a few seconds, and which are blameless.&lt;/P&gt;</description>
      <pubDate>Fri, 24 Sep 2010 03:11:09 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/Need-Help-Troubleshooting-Poor-SplunkWeb-Performance/m-p/56629#M629</guid>
      <dc:creator>sideview</dc:creator>
      <dc:date>2010-09-24T03:11:09Z</dc:date>
    </item>
  </channel>
</rss>

