<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Why is there an error of &amp;quot;Too many subsearches&amp;quot; when ingesting logs from haproxies running broken out by each API call? in Splunk Search</title>
    <link>https://community.splunk.com/t5/Splunk-Search/Why-is-there-an-error-of-quot-Too-many-subsearches-quot-when/m-p/420715#M120889</link>
    <description>&lt;P&gt;@aalvino73, you should try to avoid sub-searches until absolutely unnecessary. In your case your query can work without sub-searches. Try the following and confirm!&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;sourcetype=haproxy:http status=200  ("API1?" OR "API2?" OR "API3?")
| eval API=case(searchmatch("API1?"),"API1",
                searchmatch("API2?"),"API2",
                searchmatch("API3?"),"API3",
                true(),"unknown")
| stats avg(rtt) as Average by API
| eval Average=round(Average,2)
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;Also, when you end up using the &lt;CODE&gt;transpose&lt;/CODE&gt;, &lt;CODE&gt;xyseries&lt;/CODE&gt;, or &lt;CODE&gt;untable&lt;/CODE&gt; commands to format the table output, consider whether the final output can be constructed without them.&lt;/P&gt;</description>
    <pubDate>Sun, 21 Apr 2019 04:46:27 GMT</pubDate>
    <dc:creator>niketn</dc:creator>
    <dc:date>2019-04-21T04:46:27Z</dc:date>
    <item>
      <title>Why is there an error of "Too many subsearches" when ingesting logs from haproxies running broken out by each API call?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-is-there-an-error-of-quot-Too-many-subsearches-quot-when/m-p/420714#M120888</link>
      <description>&lt;P&gt;Hi All,&lt;/P&gt;

&lt;P&gt;Any help is greatly appreciated, as I am of course in a bit of a time crunch.&lt;BR /&gt;&lt;BR /&gt;
We are currently using Splunk to ingest the logs from the HAProxy instances running in our environment.  The HAProxies front a number of services we offer via API calls.&lt;/P&gt;

&lt;P&gt;I am trying to generate a report of the average response time (RTT in the HAProxy log) broken out by each API call.&lt;/P&gt;

&lt;P&gt;I found I can do this using this search:&lt;BR /&gt;
&lt;STRONG&gt;sourcetype=haproxy:http status=200 "API1?" | stats avg(rtt) as API1 | &lt;BR /&gt;
appendcols [search "API2?" | stats avg(rtt) as API2] | &lt;BR /&gt;
appendcols [search "API3?" | stats avg(rtt) as API3] | &lt;BR /&gt;
appendcols [search  "API4?" | stats avg(rtt) as API4] | &lt;BR /&gt;
transpose&lt;/STRONG&gt;&lt;/P&gt;

&lt;P&gt;I then get the table that I need with the first column being the APIs and the second column being the average response time for each request to that API call. &lt;/P&gt;

&lt;P&gt;The problem is that I have about 40 API calls that I need to cover in this report.  When I get to 20 subsearches, I get a "Too many subsearches" error.&lt;/P&gt;

&lt;P&gt;Does anyone know if there is a way to work around this?&lt;/P&gt;

&lt;P&gt;Thank you very much.&lt;/P&gt;

&lt;P&gt;Tony&lt;/P&gt;</description>
      <pubDate>Sat, 20 Apr 2019 14:13:23 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-is-there-an-error-of-quot-Too-many-subsearches-quot-when/m-p/420714#M120888</guid>
      <dc:creator>aalvino73</dc:creator>
      <dc:date>2019-04-20T14:13:23Z</dc:date>
    </item>
    <item>
      <title>Re: Why is there an error of "Too many subsearches" when ingesting logs from haproxies running broken out by each API call?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-is-there-an-error-of-quot-Too-many-subsearches-quot-when/m-p/420715#M120889</link>
      <description>&lt;P&gt;@aalvino73, you should try to avoid sub-searches until absolutely unnecessary. In your case your query can work without sub-searches. Try the following and confirm!&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;sourcetype=haproxy:http status=200  ("API1?" OR "API2?" OR "API3?")
| eval API=case(searchmatch("API1?"),"API1",
                searchmatch("API2?"),"API2",
                searchmatch("API3?"),"API3",
                true(),"unknown")
| stats avg(rtt) as Average by API
| eval Average=round(Average,2)
&lt;/CODE&gt;&lt;/PRE&gt;

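&lt;P&gt;The same pattern extends to the full report without any sub-searches. As a sketch (assuming the response-time field is &lt;CODE&gt;rtt&lt;/CODE&gt;, as in your original search), add one &lt;CODE&gt;searchmatch()&lt;/CODE&gt; branch per API; a per-API request count can also be carried in the same &lt;CODE&gt;stats&lt;/CODE&gt; call:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;sourcetype=haproxy:http status=200  ("API1?" OR "API2?" OR "API3?" OR "API4?")
| eval API=case(searchmatch("API1?"),"API1",
                searchmatch("API2?"),"API2",
                searchmatch("API3?"),"API3",
                searchmatch("API4?"),"API4",
                true(),"unknown")
| stats avg(rtt) as Average count as Requests by API
| eval Average=round(Average,2)
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;Because &lt;CODE&gt;stats ... by API&lt;/CODE&gt; already produces one row per API, no &lt;CODE&gt;transpose&lt;/CODE&gt; is needed afterwards.&lt;/P&gt;
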
&lt;P&gt;Also, when you end up using the &lt;CODE&gt;transpose&lt;/CODE&gt;, &lt;CODE&gt;xyseries&lt;/CODE&gt;, or &lt;CODE&gt;untable&lt;/CODE&gt; commands to format the table output, consider whether the final output can be constructed without them.&lt;/P&gt;</description>
      <pubDate>Sun, 21 Apr 2019 04:46:27 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-is-there-an-error-of-quot-Too-many-subsearches-quot-when/m-p/420715#M120889</guid>
      <dc:creator>niketn</dc:creator>
      <dc:date>2019-04-21T04:46:27Z</dc:date>
    </item>
    <item>
      <title>Re: Why is there an error of "Too many subsearches" when ingesting logs from haproxies running broken out by each API call?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-is-there-an-error-of-quot-Too-many-subsearches-quot-when/m-p/420716#M120890</link>
      <description>&lt;P&gt;@niketnilay - Thank you so much!  That is very helpful!  It worked perfectly and I was able to add the counts as well to the output which will help with us prioritizing where we should focus some optimization efforts.  &lt;/P&gt;

&lt;P&gt;Thanks!&lt;/P&gt;

&lt;P&gt;Tony &lt;/P&gt;</description>
      <pubDate>Sun, 21 Apr 2019 14:27:45 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-is-there-an-error-of-quot-Too-many-subsearches-quot-when/m-p/420716#M120890</guid>
      <dc:creator>aalvino73</dc:creator>
      <dc:date>2019-04-21T14:27:45Z</dc:date>
    </item>
    <item>
      <title>Re: Why is there an error of "Too many subsearches" when ingesting logs from haproxies running broken out by each API call?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-is-there-an-error-of-quot-Too-many-subsearches-quot-when/m-p/420717#M120891</link>
      <description>&lt;P&gt;@aalvino73 I am glad the solution worked. Do accept/up vote the answer &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;

&lt;P&gt;Do read the Splunk Documentation for &lt;A href="https://docs.splunk.com/Documentation/Splunk/latest/Search/Abouteventcorrelation"&gt;Event Grouping and Correlation&lt;/A&gt; and &lt;A href="https://docs.splunk.com/Documentation/Splunk/latest/Search/Quicktipsforoptimization"&gt;Quick Tips for Search Optimization&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Sun, 21 Apr 2019 15:00:02 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-is-there-an-error-of-quot-Too-many-subsearches-quot-when/m-p/420717#M120891</guid>
      <dc:creator>niketn</dc:creator>
      <dc:date>2019-04-21T15:00:02Z</dc:date>
    </item>
  </channel>
</rss>

