<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Searches taking long time to show results in Monitoring Splunk</title>
    <link>https://community.splunk.com/t5/Monitoring-Splunk/Searches-taking-long-time-to-show-results/m-p/680483#M10064</link>
    <description>&lt;P&gt;&lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/239523"&gt;@power12&lt;/a&gt;&lt;/P&gt;&lt;P&gt;The best place to start is by analyzing the search with the Job Inspector:&lt;/P&gt;&lt;P&gt;&lt;A href="https://docs.splunk.com/Documentation/Splunk/latest/Search/ViewsearchjobpropertieswiththeJobInspector" target="_blank"&gt;https://docs.splunk.com/Documentation/Splunk/latest/Search/ViewsearchjobpropertieswiththeJobInspector&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Use the Monitoring Console (&lt;A href="https://docs.splunk.com/Documentation/Splunk/latest/DMC/DMCoverview" target="_blank"&gt;https://docs.splunk.com/Documentation/Splunk/latest/DMC/DMCoverview&lt;/A&gt;) to check the health of Splunk.&lt;/P&gt;&lt;P&gt;Also use OS-level tools to troubleshoot system performance: &lt;STRONG&gt;vmstat&lt;/STRONG&gt;, &lt;STRONG&gt;iostat&lt;/STRONG&gt;, &lt;STRONG&gt;top&lt;/STRONG&gt;, and &lt;STRONG&gt;lsof&lt;/STRONG&gt; can reveal processes hogging CPU or memory, or high iowait times on your disk array.&lt;/P&gt;&lt;P&gt;Here is a good explanation of how to calculate search concurrency limits:&lt;/P&gt;&lt;P&gt;&lt;A href="https://answers.splunk.com/answers/270544/how-to-calculate-splunk-search-concurrency-limit-f.html" target="_blank"&gt;https://answers.splunk.com/answers/270544/how-to-calculate-splunk-search-concurrency-limit-f.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Also check out this app:&lt;BR /&gt;&lt;A href="https://splunkbase.splunk.com/app/2632/" target="_blank"&gt;https://splunkbase.splunk.com/app/2632/&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Check the efficiency of your users' searches. The following search shows the longest-running searches by user (run it over a 24-hour window):&lt;/P&gt;&lt;PRE&gt;index="_audit" action="search" (id=* OR search_id=*)&lt;BR /&gt;| eval user=if(user=="n/a",null(),user)&lt;BR /&gt;| stats max(total_run_time) as total_run_time first(user) as user by search_id&lt;BR /&gt;| stats count perc95(total_run_time) median(total_run_time) by user&lt;BR /&gt;| sort - perc95(total_run_time)&lt;/PRE&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kiran_panchavat_0-1710307769437.png" style="width: 400px;"&gt;&lt;img src="https://community.splunk.com/t5/image/serverpage/image-id/29728i28F8266CFCC5959C/image-size/medium?v=v2&amp;amp;px=400" role="button" title="kiran_panchavat_0-1710307769437.png" alt="kiran_panchavat_0-1710307769437.png" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
    <pubDate>Wed, 13 Mar 2024 05:29:55 GMT</pubDate>
    <dc:creator>kiran_panchavat</dc:creator>
    <dc:date>2024-03-13T05:29:55Z</dc:date>
    <item>
      <title>Searches taking long time to show results</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/Searches-taking-long-time-to-show-results/m-p/680441#M10062</link>
      <description>&lt;P&gt;I have two single-instance Splunk environments sharing a 100 GB license, and together they ingest around 6 GB of data per day. Instance A is very fast, but instance B is very slow (both have the same resources).&lt;/P&gt;&lt;P&gt;All searches and dashboards are really slow. For instance, a simple stats search over 24 hours takes 25 seconds on the slow instance compared to 2 seconds on the other. The Job Inspector shows:&lt;BR /&gt;&lt;BR /&gt;dispatch.evaluate.search = 12.84&lt;BR /&gt;dispatch.fetch.rcp.phase_0 = 7.78&lt;BR /&gt;&lt;BR /&gt;I want to know where I should start checking on the host and what steps to take.&lt;/P&gt;</description>
      <pubDate>Tue, 12 Mar 2024 19:59:09 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/Searches-taking-long-time-to-show-results/m-p/680441#M10062</guid>
      <dc:creator>power12</dc:creator>
      <dc:date>2024-03-12T19:59:09Z</dc:date>
    </item>
    <item>
      <title>Re: Searches taking long time to show results</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/Searches-taking-long-time-to-show-results/m-p/680461#M10063</link>
      <description>&lt;P&gt;Is the data the same in both cases, or different?&lt;/P&gt;&lt;P&gt;What is the search in each case?&lt;/P&gt;&lt;P&gt;Take a look at the Job Inspector and the job properties:&lt;/P&gt;&lt;P&gt;&lt;A href="https://www.splunk.com/en_us/blog/tips-and-tricks/splunk-clara-fication-job-inspector.html" target="_blank"&gt;https://www.splunk.com/en_us/blog/tips-and-tricks/splunk-clara-fication-job-inspector.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Compare the phase0 job property in each case, and also look at the LISPY in search.log.&lt;/P&gt;</description>
      <pubDate>Wed, 13 Mar 2024 00:30:38 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/Searches-taking-long-time-to-show-results/m-p/680461#M10063</guid>
      <dc:creator>bowesmana</dc:creator>
      <dc:date>2024-03-13T00:30:38Z</dc:date>
    </item>
    <item>
      <title>Re: Searches taking long time to show results</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/Searches-taking-long-time-to-show-results/m-p/680483#M10064</link>
      <description>&lt;P&gt;&lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/239523"&gt;@power12&lt;/a&gt;&lt;/P&gt;&lt;P&gt;The best place to start is by analyzing the search with the Job Inspector:&lt;/P&gt;&lt;P&gt;&lt;A href="https://docs.splunk.com/Documentation/Splunk/latest/Search/ViewsearchjobpropertieswiththeJobInspector" target="_blank"&gt;https://docs.splunk.com/Documentation/Splunk/latest/Search/ViewsearchjobpropertieswiththeJobInspector&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Use the Monitoring Console (&lt;A href="https://docs.splunk.com/Documentation/Splunk/latest/DMC/DMCoverview" target="_blank"&gt;https://docs.splunk.com/Documentation/Splunk/latest/DMC/DMCoverview&lt;/A&gt;) to check the health of Splunk.&lt;/P&gt;&lt;P&gt;Also use OS-level tools to troubleshoot system performance: &lt;STRONG&gt;vmstat&lt;/STRONG&gt;, &lt;STRONG&gt;iostat&lt;/STRONG&gt;, &lt;STRONG&gt;top&lt;/STRONG&gt;, and &lt;STRONG&gt;lsof&lt;/STRONG&gt; can reveal processes hogging CPU or memory, or high iowait times on your disk array.&lt;/P&gt;&lt;P&gt;Here is a good explanation of how to calculate search concurrency limits:&lt;/P&gt;&lt;P&gt;&lt;A href="https://answers.splunk.com/answers/270544/how-to-calculate-splunk-search-concurrency-limit-f.html" target="_blank"&gt;https://answers.splunk.com/answers/270544/how-to-calculate-splunk-search-concurrency-limit-f.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Also check out this app:&lt;BR /&gt;&lt;A href="https://splunkbase.splunk.com/app/2632/" target="_blank"&gt;https://splunkbase.splunk.com/app/2632/&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Check the efficiency of your users' searches. The following search shows the longest-running searches by user (run it over a 24-hour window):&lt;/P&gt;&lt;PRE&gt;index="_audit" action="search" (id=* OR search_id=*)&lt;BR /&gt;| eval user=if(user=="n/a",null(),user)&lt;BR /&gt;| stats max(total_run_time) as total_run_time first(user) as user by search_id&lt;BR /&gt;| stats count perc95(total_run_time) median(total_run_time) by user&lt;BR /&gt;| sort - perc95(total_run_time)&lt;/PRE&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="kiran_panchavat_0-1710307769437.png" style="width: 400px;"&gt;&lt;img src="https://community.splunk.com/t5/image/serverpage/image-id/29728i28F8266CFCC5959C/image-size/medium?v=v2&amp;amp;px=400" role="button" title="kiran_panchavat_0-1710307769437.png" alt="kiran_panchavat_0-1710307769437.png" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 13 Mar 2024 05:29:55 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/Searches-taking-long-time-to-show-results/m-p/680483#M10064</guid>
      <dc:creator>kiran_panchavat</dc:creator>
      <dc:date>2024-03-13T05:29:55Z</dc:date>
    </item>
    <item>
      <title>Re: Searches taking long time to show results</title>
      <link>https://community.splunk.com/t5/Monitoring-Splunk/Searches-taking-long-time-to-show-results/m-p/680548#M10066</link>
      <description>Is the data stored the same way in both environments? For example, do both have the same indexes, source types, and props/transforms configurations?&lt;BR /&gt;Are the I/O resources equal on both nodes? Is anything other than Splunk running on those nodes?&lt;BR /&gt;You should set up the Monitoring Console (MC) on both nodes and use it to see what is happening. Start with the health check: it will tell you if any configurations do not meet Splunk's requirements.</description>
      <pubDate>Wed, 13 Mar 2024 13:44:15 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Monitoring-Splunk/Searches-taking-long-time-to-show-results/m-p/680548#M10066</guid>
      <dc:creator>isoutamo</dc:creator>
      <dc:date>2024-03-13T13:44:15Z</dc:date>
    </item>
  </channel>
</rss>

