<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: How to query for similar events from aggregated data and few other criteria. in Splunk Enterprise Security</title>
    <link>https://community.splunk.com/t5/Splunk-Enterprise-Security/How-to-query-for-similar-events-from-aggregated-data-and-few/m-p/406489#M4725</link>
    <description>&lt;P&gt;You might be able to do this by re-stats-ing the stats (lol) and counting by the sum of your bytes.  &lt;/P&gt;

&lt;P&gt;I think you need to mildly rework your actual stats, though, since you want cases where the destination is the same (e.g. &lt;CODE&gt;by dest&lt;/CODE&gt; as part of the stats).  We also don't need the source, and including it might complicate things.  Also, I don't see &lt;CODE&gt;_time&lt;/CODE&gt; in your &lt;CODE&gt;by&lt;/CODE&gt; clause either, so I'm going to assume that's just a copy/paste oversight.  So our new stats is...&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index="abc" | bucket _time span=30m 
| stats sum(bytes) as total_bytes by dest, _time
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;You should then have results that are _time (in 30 minute chunks), dest, and total_bytes.&lt;/P&gt;

&lt;P&gt;Now we want to count these results, looking for the same dest and same total_bytes.&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index="abc" | bucket _time span=30m 
| stats sum(bytes) as total_bytes by dest, _time
| stats count by dest, total_bytes
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;So that should give you a consolidated list of how many times each total_bytes value occurs per dest over the entire time period.  One final piece is to search that result for where the count is greater than 47.  Or equal to 48.  Or larger than 5; whatever threshold you want, you'll easily see how to make that happen... &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index="abc" | bucket _time span=30m 
| stats sum(bytes) as total_bytes by dest, _time
| stats count by dest, total_bytes
| search count&amp;gt;47
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;Run those step by step so that you can a) modify it a bit if I got a field name wrong or something, and b) understand each piece.  That way, if you have a similar problem, you should be able to handle it yourself!&lt;/P&gt;

&lt;P&gt;Happy Splunking,&lt;BR /&gt;
Rich&lt;/P&gt;</description>
    <pubDate>Tue, 29 Sep 2020 22:13:00 GMT</pubDate>
    <dc:creator>Richfez</dc:creator>
    <dc:date>2020-09-29T22:13:00Z</dc:date>
    <item>
      <title>How to query for similar events from aggregated data and few other criteria.</title>
      <link>https://community.splunk.com/t5/Splunk-Enterprise-Security/How-to-query-for-similar-events-from-aggregated-data-and-few/m-p/406488#M4724</link>
      <description>&lt;P&gt;Hi, &lt;/P&gt;

&lt;P&gt;I'm trying to find or create a Splunk query for the following.  &lt;/P&gt;

&lt;P&gt;My log is something like below: &lt;/P&gt;

&lt;P&gt;&lt;STRONG&gt;&lt;EM&gt;time=2018-10-26 06:09:21 UTC source=1.2.3.4 dest=5.6.7.8 bytes=100&lt;/EM&gt;&lt;/STRONG&gt;&lt;/P&gt;

&lt;P&gt;I'm aggregating the bytes over 30-minute intervals, something like this:&lt;/P&gt;

&lt;P&gt;&lt;STRONG&gt;&lt;EM&gt;index="abc" | bucket _time span=30m | stats values(source), values(dest), sum(bytes) as total_bytes&lt;/EM&gt;&lt;/STRONG&gt;&lt;/P&gt;

&lt;P&gt;If some source hosts are sending data of a fixed size (i.e., total_bytes) to a destination every 30 minutes over the last day, then I would like to identify those sources with a Splunk query.&lt;/P&gt;

&lt;P&gt;Basically, the following conditions should hold: &lt;/P&gt;

&lt;P&gt;a) the bytes are the same (total_bytes for the current bucket span and the previous one should be identical)&lt;BR /&gt;
b) the destination is the same&lt;BR /&gt;
c) the time-span bucket count is 48 (meaning 30-minute spans over the last 24 hours: 24*2 = 48). &lt;/P&gt;

&lt;P&gt;Could you please shed some light on creating a query for the above? I really appreciate your help in this regard. &lt;/P&gt;

&lt;P&gt;Thanks,&lt;BR /&gt;
Mahesh&lt;/P&gt;</description>
      <pubDate>Tue, 29 Sep 2020 22:09:06 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Enterprise-Security/How-to-query-for-similar-events-from-aggregated-data-and-few/m-p/406488#M4724</guid>
      <dc:creator>mahe90</dc:creator>
      <dc:date>2020-09-29T22:09:06Z</dc:date>
    </item>
    <item>
      <title>Re: How to query for similar events from aggregated data and few other criteria.</title>
      <link>https://community.splunk.com/t5/Splunk-Enterprise-Security/How-to-query-for-similar-events-from-aggregated-data-and-few/m-p/406489#M4725</link>
      <description>&lt;P&gt;You might be able to do this by re-stats-ing the stats (lol) and counting by the sum of your bytes.  &lt;/P&gt;

&lt;P&gt;I think you need to mildly rework your actual stats, though, since you want cases where the destination is the same (e.g. &lt;CODE&gt;by dest&lt;/CODE&gt; as part of the stats).  We also don't need the source, and including it might complicate things.  Also, I don't see &lt;CODE&gt;_time&lt;/CODE&gt; in your &lt;CODE&gt;by&lt;/CODE&gt; clause either, so I'm going to assume that's just a copy/paste oversight.  So our new stats is...&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index="abc" | bucket _time span=30m 
| stats sum(bytes) as total_bytes by dest, _time
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;You should then have results that are _time (in 30 minute chunks), dest, and total_bytes.&lt;/P&gt;

&lt;P&gt;Now we want to count these results, looking for the same dest and same total_bytes.&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index="abc" | bucket _time span=30m 
| stats sum(bytes) as total_bytes by dest, _time
| stats count by dest, total_bytes
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;So that should give you a consolidated list of how many times each total_bytes value occurs per dest over the entire time period.  One final piece is to search that result for where the count is greater than 47.  Or equal to 48.  Or larger than 5; whatever threshold you want, you'll easily see how to make that happen... &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index="abc" | bucket _time span=30m 
| stats sum(bytes) as total_bytes by dest, _time
| stats count by dest, total_bytes
| search count&amp;gt;47
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;Run those step by step so that you can a) modify it a bit if I got a field name wrong or something, and b) understand each piece.  That way, if you have a similar problem, you should be able to handle it yourself!&lt;/P&gt;
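&lt;P&gt;If it helps to sanity-check the logic outside Splunk, here is a rough Python sketch of the same two-stage aggregation. The sample events are made up to match the log format in the question, and the threshold is lowered from count&amp;gt;47 to fit the tiny sample; the real search runs against your index with the full threshold.&lt;/P&gt;

```python
from collections import defaultdict
from datetime import datetime, timezone

# Made-up sample events in the same shape as the log lines in the question.
events = [
    {"time": "2018-10-26 06:09:21", "dest": "5.6.7.8", "bytes": 100},
    {"time": "2018-10-26 06:20:00", "dest": "5.6.7.8", "bytes": 100},
    {"time": "2018-10-26 06:40:00", "dest": "5.6.7.8", "bytes": 200},
]

SPAN = 30 * 60  # bucket size in seconds; mirrors "bucket _time span=30m"

# Stage 1: sum(bytes) as total_bytes by dest, _time
totals = defaultdict(int)
for e in events:
    ts = datetime.strptime(e["time"], "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)
    bucket = int(ts.timestamp()) // SPAN * SPAN  # floor to the 30-minute boundary
    totals[(e["dest"], bucket)] += e["bytes"]

# Stage 2: count by dest, total_bytes
counts = defaultdict(int)
for (dest, _bucket), total_bytes in totals.items():
    counts[(dest, total_bytes)] += 1

# Final filter: the SPL "search count>47", with the threshold dropped to 1 here
suspects = {key: c for key, c in counts.items() if c > 1}
print(suspects)  # → {('5.6.7.8', 200): 2}
```

&lt;P&gt;The first two events land in the same 30-minute bucket (summing to 200 bytes), the third lands in the next one (also 200 bytes), so dest 5.6.7.8 shows the same total_bytes in 2 buckets and survives the filter.&lt;/P&gt;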

&lt;P&gt;Happy Splunking,&lt;BR /&gt;
Rich&lt;/P&gt;</description>
      <pubDate>Tue, 29 Sep 2020 22:13:00 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Enterprise-Security/How-to-query-for-similar-events-from-aggregated-data-and-few/m-p/406489#M4725</guid>
      <dc:creator>Richfez</dc:creator>
      <dc:date>2020-09-29T22:13:00Z</dc:date>
    </item>
    <item>
      <title>Re: How to query for similar events from aggregated data and few other criteria.</title>
      <link>https://community.splunk.com/t5/Splunk-Enterprise-Security/How-to-query-for-similar-events-from-aggregated-data-and-few/m-p/406490#M4726</link>
      <description>&lt;P&gt;Thank you , Rich. &lt;/P&gt;</description>
      <pubDate>Sat, 01 Dec 2018 02:46:49 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Enterprise-Security/How-to-query-for-similar-events-from-aggregated-data-and-few/m-p/406490#M4726</guid>
      <dc:creator>mahe90</dc:creator>
      <dc:date>2018-12-01T02:46:49Z</dc:date>
    </item>
  </channel>
</rss>

