<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Splunk alert for peak errors only in Splunk Search</title>
    <link>https://community.splunk.com/t5/Splunk-Search/Splunk-alert-for-peak-errors-only/m-p/537501#M151961</link>
    <description>&lt;P&gt;&lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/222349"&gt;@shashank_24&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;How do you determine a 'peak'?&lt;/P&gt;&lt;P&gt;There are a number of ways to look at this, but you could do a timechart and look at the count per minute rather than the total over 30 minutes, e.g.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;index=myindex sourcetype=ssl_access_combined requested_content="/myapp*" NOT images status=50*
| timechart span=1m count by status
| where '500'&amp;gt;10&lt;/LI-CODE&gt;&lt;P&gt;which looks at the number of 500s per minute and only alerts if there are more than 10 in a minute.&lt;/P&gt;&lt;P&gt;Or you could look for outliers, but this would require a bit of tuning for your data.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;index=myindex sourcetype=ssl_access_combined requested_content="/myapp*" NOT images status=500 
| bin _time span=1m
| stats count by _time
| streamstats window=10 avg(count) as avg, stdev(count) as stdev
| eval multiplier = 2
| eval lower_bound = avg - (stdev * multiplier)
| eval upper_bound = avg + (stdev * multiplier)
| eval lower_outlier = if(count &amp;lt; lower_bound, 1, 0)
| eval upper_outlier = if(count &amp;gt; upper_bound, 1, 0)
| where upper_outlier=1&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;This flags any minute where the count is more than 2 standard deviations above the average over a rolling 10-minute window. Play with the window size and multiplier on your historical data to find values that work for your data. I left the lower_outlier calc in there as an example of picking up values below the lower bound as well.&lt;/P&gt;&lt;P&gt;Hope this helps.&lt;/P&gt;</description>
    <pubDate>Wed, 27 Jan 2021 22:08:46 GMT</pubDate>
    <dc:creator>bowesmana</dc:creator>
    <dc:date>2021-01-27T22:08:46Z</dc:date>
    <item>
      <title>Splunk alert for peak errors only</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Splunk-alert-for-peak-errors-only/m-p/537477#M151951</link>
      <description>&lt;P&gt;Hi,&amp;nbsp;I am working on a query to write an alert where I need to monitor a few pages for 500 errors. There are currently some investigations going on to fix those errors, so we usually get a few every 30 minutes.&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;So basically, at the moment not every 5xx is a problem, and it's hard for us to see which of those are "real problems" and which are not.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;What I want to do is adjust our thresholds in the Splunk search so that it only recognises&amp;nbsp;peaks that are surely something "going wrong" and then triggers the alert.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;I have an alert like this at the moment, which is triggered every 30 minutes.&lt;/SPAN&gt;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;index=myindex sourcetype=ssl_access_combined requested_content="/myapp*" NOT images status=50* 
| stats count by status
| where (status=500 AND count &amp;gt; 10)&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Right now it gets triggered when the count of 500 errors reaches 10 in the last 30 minutes, but I don't want that, as it is unnecessarily flooding the mailbox.&amp;nbsp;&lt;/P&gt;&lt;P&gt;Is there a way to achieve what I need here, please?&lt;/P&gt;</description>
      <pubDate>Wed, 27 Jan 2021 18:29:24 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Splunk-alert-for-peak-errors-only/m-p/537477#M151951</guid>
      <dc:creator>shashank_24</dc:creator>
      <dc:date>2021-01-27T18:29:24Z</dc:date>
    </item>
    <item>
      <title>Re: Splunk alert for peak errors only</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Splunk-alert-for-peak-errors-only/m-p/537501#M151961</link>
      <description>&lt;P&gt;&lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/222349"&gt;@shashank_24&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;How do you determine a 'peak'?&lt;/P&gt;&lt;P&gt;There are a number of ways to look at this, but you could do a timechart and look at the count per minute rather than the total over 30 minutes, e.g.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;index=myindex sourcetype=ssl_access_combined requested_content="/myapp*" NOT images status=50*
| timechart span=1m count by status
| where '500'&amp;gt;10&lt;/LI-CODE&gt;&lt;P&gt;which looks at the number of 500s per minute and only alerts if there are more than 10 in a minute.&lt;/P&gt;&lt;P&gt;Or you could look for outliers, but this would require a bit of tuning for your data.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;index=myindex sourcetype=ssl_access_combined requested_content="/myapp*" NOT images status=500 
| bin _time span=1m
| stats count by _time
| streamstats window=10 avg(count) as avg, stdev(count) as stdev
| eval multiplier = 2
| eval lower_bound = avg - (stdev * multiplier)
| eval upper_bound = avg + (stdev * multiplier)
| eval lower_outlier = if(count &amp;lt; lower_bound, 1, 0)
| eval upper_outlier = if(count &amp;gt; upper_bound, 1, 0)
| where upper_outlier=1&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;This flags any minute where the count is more than 2 standard deviations above the average over a rolling 10-minute window. Play with the window size and multiplier on your historical data to find values that work for your data. I left the lower_outlier calc in there as an example of picking up values below the lower bound as well.&lt;/P&gt;&lt;P&gt;Hope this helps.&lt;/P&gt;</description>
      <pubDate>Wed, 27 Jan 2021 22:08:46 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Splunk-alert-for-peak-errors-only/m-p/537501#M151961</guid>
      <dc:creator>bowesmana</dc:creator>
      <dc:date>2021-01-27T22:08:46Z</dc:date>
    </item>
  </channel>
</rss>