<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Alert when host not sending data to splunk in Alerting</title>
    <link>https://community.splunk.com/t5/Alerting/Alert-when-host-not-sending-data-to-splunk/m-p/553543#M10549</link>
    <description>&lt;P&gt;Use an &lt;FONT face="courier new,courier"&gt;eval&lt;/FONT&gt; command to add a cluster field, setting the value of the cluster based on the name of the server.&amp;nbsp; Then do your counting by cluster rather than by server.&amp;nbsp; Share your current search and we can probably offer more specific suggestions.&lt;/P&gt;</description>
    <pubDate>Fri, 28 May 2021 15:45:11 GMT</pubDate>
    <dc:creator>richgalloway</dc:creator>
    <dc:date>2021-05-28T15:45:11Z</dc:date>
    <item>
      <title>Alert when host not sending data to splunk</title>
      <link>https://community.splunk.com/t5/Alerting/Alert-when-host-not-sending-data-to-splunk/m-p/553537#M10548</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;I need to write a query which alerts if any of my hosts has not sent any logs to Splunk in the last 10 minutes.&lt;/P&gt;&lt;P&gt;I'm able to get this result.&lt;/P&gt;&lt;P&gt;The challenge is that I have 2 groups of clustered servers; if any one server in a cluster is sending data, then I should exclude the other servers in that cluster from the alert list. I'm not able to write this logic in my search.&lt;/P&gt;&lt;P&gt;Example:&lt;/P&gt;&lt;P&gt;Cluster one:&amp;nbsp; Server 1, Server 2, and Server 3&lt;/P&gt;&lt;P&gt;If Server 1 sends data within 10 minutes, then I should remove Server 2 and Server 3 from the final list.&lt;/P&gt;&lt;P&gt;Right now the search only removes Server 1 from the list, and the other 2 servers still appear in the final list.&lt;/P&gt;&lt;P&gt;I have 3 clusters like this, with 2 servers in each cluster.&lt;/P&gt;&lt;P&gt;Please give some ideas on this.&lt;/P&gt;&lt;P&gt;Thanks in advance!&lt;/P&gt;</description>
      <pubDate>Fri, 28 May 2021 14:51:31 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Alerting/Alert-when-host-not-sending-data-to-splunk/m-p/553537#M10548</guid>
      <dc:creator>Sangu</dc:creator>
      <dc:date>2021-05-28T14:51:31Z</dc:date>
    </item>
    <item>
      <title>Re: Alert when host not sending data to splunk</title>
      <link>https://community.splunk.com/t5/Alerting/Alert-when-host-not-sending-data-to-splunk/m-p/553543#M10549</link>
      <description>&lt;P&gt;Use an &lt;FONT face="courier new,courier"&gt;eval&lt;/FONT&gt; command to add a cluster field, setting the value of the cluster based on the name of the server.&amp;nbsp; Then do your counting by cluster rather than by server.&amp;nbsp; Share your current search and we can probably offer more specific suggestions.&lt;/P&gt;</description>
      <pubDate>Fri, 28 May 2021 15:45:11 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Alerting/Alert-when-host-not-sending-data-to-splunk/m-p/553543#M10549</guid>
      <dc:creator>richgalloway</dc:creator>
      <dc:date>2021-05-28T15:45:11Z</dc:date>
    </item>
    <item>
      <title>Re: Alert when host not sending data to splunk</title>
      <link>https://community.splunk.com/t5/Alerting/Alert-when-host-not-sending-data-to-splunk/m-p/553549#M10550</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;| metasearch index=abc sourcetype=abc&lt;BR /&gt;| eval host=lower(host)&lt;BR /&gt;| stats count BY host&lt;BR /&gt;| append&lt;BR /&gt;[ inputlookup Devices.csv&lt;BR /&gt;| eval host=lower(Hostname+".amat.com"), count=0&lt;BR /&gt;| fields host count ]&lt;BR /&gt;| stats sum(count) AS total BY host | where total=0&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;This is the query I used; the lookup table has all the servers which need to send data.&lt;/P&gt;&lt;P&gt;Another detail I wanted to add: not all servers are in a cluster. Apart from the 3 clusters, there are 10 other servers which are not part of any cluster.&lt;/P&gt;</description>
      <pubDate>Fri, 28 May 2021 16:55:46 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Alerting/Alert-when-host-not-sending-data-to-splunk/m-p/553549#M10550</guid>
      <dc:creator>Sangu</dc:creator>
      <dc:date>2021-05-28T16:55:46Z</dc:date>
    </item>
    <item>
      <title>Re: Alert when host not sending data to splunk</title>
      <link>https://community.splunk.com/t5/Alerting/Alert-when-host-not-sending-data-to-splunk/m-p/553551#M10551</link>
      <description>&lt;P&gt;Perhaps this will help.&amp;nbsp; The case function assigns servers to a cluster.&amp;nbsp; Those that don't belong to a cluster use the host name as the cluster name.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;| metasearch index=abc sourcetype=abc
| eval host=lower(host)
| stats count BY host
| append
[ inputlookup Devices.csv
| eval host=lower(Hostname+".amat.com"), count=0
| fields host count ]
| eval cluster=case(host="server1" OR host="server2" OR host="server3","cluster1", host="server4" OR host="server5" OR host="server6", "cluster2", 1==1, host)
| stats sum(count) AS total BY cluster | where total=0&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 28 May 2021 17:01:40 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Alerting/Alert-when-host-not-sending-data-to-splunk/m-p/553551#M10551</guid>
      <dc:creator>richgalloway</dc:creator>
      <dc:date>2021-05-28T17:01:40Z</dc:date>
    </item>
    <item>
      <title>Re: Alert when host not sending data to splunk</title>
      <link>https://community.splunk.com/t5/Alerting/Alert-when-host-not-sending-data-to-splunk/m-p/553561#M10552</link>
      <description>&lt;P&gt;This worked. Thank you so much for your help!&lt;/P&gt;</description>
      <pubDate>Fri, 28 May 2021 17:26:54 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Alerting/Alert-when-host-not-sending-data-to-splunk/m-p/553561#M10552</guid>
      <dc:creator>Sangu</dc:creator>
      <dc:date>2021-05-28T17:26:54Z</dc:date>
    </item>
  </channel>
</rss>