<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Why are we getting excessive number of alerts? in Splunk Search</title>
    <link>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444287#M126014</link>
    <description>&lt;P&gt;&lt;CODE&gt;_index_earliest=-15m _index_latest=now index=your index | rest of the stuff&lt;/CODE&gt; works like a charm so far ;-) &lt;/P&gt;</description>
    <pubDate>Wed, 14 Aug 2019 17:01:51 GMT</pubDate>
    <dc:creator>danielbb</dc:creator>
    <dc:date>2019-08-14T17:01:51Z</dc:date>
    <item>
      <title>Why are we getting excessive number of alerts?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444267#M125994</link>
      <description>&lt;P&gt;We have an &lt;STRONG&gt;All time (real time)&lt;/STRONG&gt; alert which produced 315 alerts in the first eight hours of the day.&lt;/P&gt;

&lt;P&gt;When running the search query of the alert for these eight hours, we get &lt;STRONG&gt;six events&lt;/STRONG&gt;.&lt;/P&gt;

&lt;P&gt;The alert itself is as simple as it gets -&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index=&amp;lt;index name&amp;gt;
AND (category="Web Attack"
NOT src IN (&amp;lt;set of IPs&amp;gt;)
)

| table &amp;lt;set of fields&amp;gt;
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;What's going on here?&lt;/P&gt;</description>
      <pubDate>Mon, 12 Aug 2019 13:32:25 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444267#M125994</guid>
      <dc:creator>danielbb</dc:creator>
      <dc:date>2019-08-12T13:32:25Z</dc:date>
    </item>
    <item>
      <title>Re: Why are we getting excessive number of alerts?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444268#M125995</link>
      <description>&lt;P&gt;hi @danielbb - Can you please post the alert configuration? I'm particularly interested in the real-time look-back window.&lt;/P&gt;</description>
      <pubDate>Mon, 12 Aug 2019 15:32:42 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444268#M125995</guid>
      <dc:creator>Sukisen1981</dc:creator>
      <dc:date>2019-08-12T15:32:42Z</dc:date>
    </item>
    <item>
      <title>Re: Why are we getting excessive number of alerts?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444269#M125996</link>
      <description>&lt;P&gt;Is this the right view @Sukisen1981?&lt;/P&gt;

&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="alt text"&gt;&lt;img src="https://community.splunk.com/t5/image/serverpage/image-id/7493iB453FDF15305CDA7/image-size/large?v=v2&amp;amp;px=999" role="button" title="alt text" alt="alt text" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 12 Aug 2019 15:36:20 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444269#M125996</guid>
      <dc:creator>danielbb</dc:creator>
      <dc:date>2019-08-12T15:36:20Z</dc:date>
    </item>
    <item>
      <title>Re: Why are we getting excessive number of alerts?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444270#M125997</link>
      <description>&lt;P&gt;hi @danielbb - see this: &lt;A href="https://docs.splunk.com/Documentation/Splunk/7.3.1/Search/Specifyrealtimewindowsinyoursearch"&gt;https://docs.splunk.com/Documentation/Splunk/7.3.1/Search/Specifyrealtimewindowsinyoursearch&lt;/A&gt;&lt;/P&gt;

&lt;P&gt;Try setting default_backfill to false and see?&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;[realtime]
default_backfill = &amp;lt;bool&amp;gt;
* Specifies if windowed real-time searches should backfill events
* Defaults to true&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Mon, 12 Aug 2019 16:01:15 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444270#M125997</guid>
      <dc:creator>Sukisen1981</dc:creator>
      <dc:date>2019-08-12T16:01:15Z</dc:date>
    </item>
    <item>
      <title>Re: Why are we getting excessive number of alerts?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444271#M125998</link>
      <description>&lt;P&gt;The doc says - &lt;STRONG&gt;For windowed real-time searches, you can backfill&lt;/STRONG&gt; - but we don't use windowed real-time searches.&lt;/P&gt;

&lt;P&gt;From the UI, the only relevant option seems to be &lt;STRONG&gt;Expires&lt;/STRONG&gt; at 10 hours. Could that have anything to do with it?&lt;/P&gt;

&lt;P&gt;By the way, where do we set "windowed" real-time searches versus "all-time" real-time searches?&lt;/P&gt;

&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="alt text"&gt;&lt;img src="https://community.splunk.com/t5/image/serverpage/image-id/7494i2EE580E8CC3C7A51/image-size/large?v=v2&amp;amp;px=999" role="button" title="alt text" alt="alt text" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 12 Aug 2019 16:45:54 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444271#M125998</guid>
      <dc:creator>danielbb</dc:creator>
      <dc:date>2019-08-12T16:45:54Z</dc:date>
    </item>
    <item>
      <title>Re: Why are we getting excessive number of alerts?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444272#M125999</link>
      <description>&lt;P&gt;Hi @danielbb &lt;BR /&gt;
May I ask why you need a real-time alert in the first place? As a rule of thumb, it is better to avoid real-time alerts.&lt;BR /&gt;
Going by the frequency of hits you mentioned earlier (6 events in 8 hours), could you not make it a scheduled alert running, say, hourly, or even on a 3-minute schedule?&lt;/P&gt;</description>
      <pubDate>Mon, 12 Aug 2019 17:03:58 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444272#M125999</guid>
      <dc:creator>Sukisen1981</dc:creator>
      <dc:date>2019-08-12T17:03:58Z</dc:date>
    </item>
    <item>
      <title>Re: Why are we getting excessive number of alerts?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444273#M126000</link>
      <description>&lt;P&gt;Another option you could perhaps try is throttling the alerts: &lt;A href="https://docs.splunk.com/Documentation/Splunk/7.3.1/Alert/Alertexamples"&gt;https://docs.splunk.com/Documentation/Splunk/7.3.1/Alert/Alertexamples&lt;/A&gt;&lt;/P&gt;
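&lt;P&gt;In savedsearches.conf terms, a one-hour throttle would look something like this (a sketch - the stanza name is a placeholder for your alert's name):&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;[Your alert name]
alert.suppress = 1
alert.suppress.period = 1h&lt;/CODE&gt;&lt;/PRE&gt;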

&lt;P&gt;If you throttle the alert for 1 hour from the UI, does it reduce the number of alerts received?&lt;/P&gt;</description>
      <pubDate>Mon, 12 Aug 2019 17:05:19 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444273#M126000</guid>
      <dc:creator>Sukisen1981</dc:creator>
      <dc:date>2019-08-12T17:05:19Z</dc:date>
    </item>
    <item>
      <title>Re: Why are we getting excessive number of alerts?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444274#M126001</link>
      <description>&lt;P&gt;OK, makes perfect sense; however, these events have an indexing delay that we can't avoid. For these 6 events the delay varies between 1.7 and 12.32 minutes.&lt;/P&gt;

&lt;P&gt;So, is there a way to schedule these "regular" alerts based on _indextime? Meaning, the alert fires for all events that got indexed in the past 15 minutes, for example.&lt;/P&gt;</description>
      <pubDate>Mon, 12 Aug 2019 17:11:04 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444274#M126001</guid>
      <dc:creator>danielbb</dc:creator>
      <dc:date>2019-08-12T17:11:04Z</dc:date>
    </item>
    <item>
      <title>Re: Why are we getting excessive number of alerts?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444275#M126002</link>
      <description>&lt;P&gt;Interesting - try this in search:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index=yourindex | your search
| eval indextime=strftime(_indextime,"%Y-%m-%d %H:%M:%S") | table indextime, _time
| eval time=strptime(indextime,"%Y-%m-%d %H:%M:%S")
| eval _time=time
| stats count by indextime, _time&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;Is there a 'proper' capture based on indextime or _time?&lt;/P&gt;</description>
      <pubDate>Wed, 30 Sep 2020 01:41:16 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444275#M126002</guid>
      <dc:creator>Sukisen1981</dc:creator>
      <dc:date>2020-09-30T01:41:16Z</dc:date>
    </item>
    <item>
      <title>Re: Why are we getting excessive number of alerts?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444276#M126003</link>
      <description>&lt;P&gt;It shows - &lt;/P&gt;

&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="alt text"&gt;&lt;img src="https://community.splunk.com/t5/image/serverpage/image-id/7495iBD57119515F747A8/image-size/large?v=v2&amp;amp;px=999" role="button" title="alt text" alt="alt text" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 12 Aug 2019 17:36:36 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444276#M126003</guid>
      <dc:creator>danielbb</dc:creator>
      <dc:date>2019-08-12T17:36:36Z</dc:date>
    </item>
    <item>
      <title>Re: Why are we getting excessive number of alerts?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444277#M126004</link>
      <description>&lt;P&gt;Check the Statistics tab carefully... is there any difference in minutes between indextime and _time in the table?&lt;/P&gt;</description>
      <pubDate>Mon, 12 Aug 2019 17:41:16 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444277#M126004</guid>
      <dc:creator>Sukisen1981</dc:creator>
      <dc:date>2019-08-12T17:41:16Z</dc:date>
    </item>
    <item>
      <title>Re: Why are we getting excessive number of alerts?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444278#M126005</link>
      <description>&lt;P&gt;Not on the first page, but we have lags for some of the events. &lt;/P&gt;</description>
      <pubDate>Mon, 12 Aug 2019 17:52:47 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444278#M126005</guid>
      <dc:creator>danielbb</dc:creator>
      <dc:date>2019-08-12T17:52:47Z</dc:date>
    </item>
    <item>
      <title>Re: Why are we getting excessive number of alerts?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444279#M126006</link>
      <description>&lt;P&gt;OK, one last test - and sorry, I should have asked you this before. You said there are only 6 events in the last eight hours, so keep your search criteria as they are and just add these two evals before your table:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;| eval indextime=strftime(_indextime,"%Y-%m-%d %H:%M:%S") | table indextime, _time
| eval time=strptime(indextime,"%Y-%m-%d %H:%M:%S")&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;In the table fields, add indextime and _time along with the rest.&lt;BR /&gt;
What I am asking now is: we should have just 6 events, and for these 6 events, is there a difference between indextime and _time matching what you described - a delay of roughly 1.7 to 12.32 minutes?&lt;/P&gt;</description>
      <pubDate>Wed, 30 Sep 2020 01:41:19 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444279#M126006</guid>
      <dc:creator>Sukisen1981</dc:creator>
      <dc:date>2020-09-30T01:41:19Z</dc:date>
    </item>
    <item>
      <title>Re: Why are we getting excessive number of alerts?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444280#M126007</link>
      <description>&lt;P&gt;Right.&lt;BR /&gt;
I found these six events with a query similar to yours.&lt;/P&gt;

&lt;P&gt;The real question for me at the moment is -&lt;/P&gt;

&lt;P&gt;Is there a way to schedule these "regular" alerts based on _indextime? Meaning, the alert fires for all events that got indexed in the past 15 minutes, for example.&lt;/P&gt;</description>
      <pubDate>Mon, 12 Aug 2019 18:05:47 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444280#M126007</guid>
      <dc:creator>danielbb</dc:creator>
      <dc:date>2019-08-12T18:05:47Z</dc:date>
    </item>
    <item>
      <title>Re: Why are we getting excessive number of alerts?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444281#M126008</link>
      <description>&lt;P&gt;We perhaps need 1-2 more iterations, but I believe we are making progress &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;_index_earliest=-15m _index_latest=now index=your index | rest of the stuff...&lt;/CODE&gt;&lt;/PRE&gt;
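&lt;P&gt;For example, combined with the search from the original question, the scheduled alert's search might look like this (a sketch - the index name, IP set, and field list are placeholders exactly as posted there):&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index=&amp;lt;index name&amp;gt; _index_earliest=-15m _index_latest=now
AND (category="Web Attack"
NOT src IN (&amp;lt;set of IPs&amp;gt;)
)
| table &amp;lt;set of fields&amp;gt;&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;Scheduled every 15 minutes, each run should then see just the events indexed since the previous run, regardless of how late they arrived.&lt;/P&gt;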

&lt;P&gt;Now, this should pick up only events that were indexed from 15 minutes ago till now... a bit closer?&lt;/P&gt;</description>
      <pubDate>Wed, 30 Sep 2020 01:41:25 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444281#M126008</guid>
      <dc:creator>Sukisen1981</dc:creator>
      <dc:date>2020-09-30T01:41:25Z</dc:date>
    </item>
    <item>
      <title>Re: Why are we getting excessive number of alerts?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444282#M126009</link>
      <description>&lt;P&gt;I think that's it - love it @Sukisen1981 !!!&lt;/P&gt;</description>
      <pubDate>Mon, 12 Aug 2019 18:53:27 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444282#M126009</guid>
      <dc:creator>danielbb</dc:creator>
      <dc:date>2019-08-12T18:53:27Z</dc:date>
    </item>
    <item>
      <title>Re: Why are we getting excessive number of alerts?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444283#M126010</link>
      <description>&lt;P&gt;Not an issue at all. It would still be interesting to see if default_backfill=false works, though &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt; &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 12 Aug 2019 19:00:32 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444283#M126010</guid>
      <dc:creator>Sukisen1981</dc:creator>
      <dc:date>2019-08-12T19:00:32Z</dc:date>
    </item>
    <item>
      <title>Re: Why are we getting excessive number of alerts?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444284#M126011</link>
      <description>&lt;P&gt;haha - funny&lt;/P&gt;</description>
      <pubDate>Mon, 12 Aug 2019 19:02:19 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444284#M126011</guid>
      <dc:creator>danielbb</dc:creator>
      <dc:date>2019-08-12T19:02:19Z</dc:date>
    </item>
    <item>
      <title>Re: Why are we getting excessive number of alerts?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444285#M126012</link>
      <description>&lt;P&gt;@Sukisen1981 - please convert to an answer...&lt;/P&gt;</description>
      <pubDate>Mon, 12 Aug 2019 19:06:34 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444285#M126012</guid>
      <dc:creator>danielbb</dc:creator>
      <dc:date>2019-08-12T19:06:34Z</dc:date>
    </item>
    <item>
      <title>Re: Why are we getting excessive number of alerts?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444286#M126013</link>
      <description>&lt;P&gt;Dunno which one to do, but I will convert the last comment into an answer.&lt;BR /&gt;
I rarely get a chance to fiddle around with the backend (.conf files), as it is maintained by a different vendor... this default_backfill=false looks interesting... maybe I will play around with it in my local instance.&lt;/P&gt;</description>
      <pubDate>Mon, 12 Aug 2019 19:10:19 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Why-are-we-getting-excessive-number-of-alerts/m-p/444286#M126013</guid>
      <dc:creator>Sukisen1981</dc:creator>
      <dc:date>2019-08-12T19:10:19Z</dc:date>
    </item>
  </channel>
</rss>

