I have Splunk forwarder 6.1.x installed on my servers, and Splunk monitoring (many alerts) has been set up for them. However, a few servers will be decommissioned (will not be used for some time), so I want to disable all Splunk monitoring for these servers for a period of time, without stopping the Splunk process and without modifying the already-configured alerts (modifying all alerts would be tedious). Is this possible in Splunk? Once a server is needed again, I need to re-enable its monitoring. So please suggest how to disable Splunk monitoring for some time period and re-enable it later.
Kindly help me out.
A high-level solution is to maintain a lookup of your planned maintenance windows and use it as a filter to decide whether to trigger the alert.
Pseudo example:

maintenance.csv:

    host,start,end,mode
    server1,1438122595,1438124595,maintenance
and the alert search:

    <search for error>
    | lookup maintenance.csv host OUTPUT start end mode
    | where NOT (mode="maintenance" AND _time>=start AND _time<=end)

Note that field-to-field comparisons need where rather than search, and the lookup should explicitly OUTPUT the fields you filter on.
If you want to go further, look at time-based lookups.
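For reference, a time-based lookup is declared in transforms.conf; a minimal sketch, assuming the maintenance.csv above with epoch timestamps in the start field (the stanza name is arbitrary):

```
# transforms.conf -- register maintenance.csv as a time-based lookup
# time_field names the CSV column holding the event time,
# time_format describes its format (%s = epoch seconds)
[maintenance_by_time]
filename    = maintenance.csv
time_field  = start
time_format = %s
```

With this in place, the lookup automatically matches events whose _time falls within the window defined by the lookup's time field.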
It depends a good bit on what you are monitoring and how those searches are crafted. A short answer: simply add "disabled = true" to any monitor stanzas in your inputs. If, for example, you were monitoring /var/log on a server and added the disabled line, the Splunk agent would continue to run but would no longer bring data in from that path.
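Concretely, disabling a monitor on the forwarder looks like this; a minimal sketch assuming you were monitoring /var/log (adjust the stanza to match your own inputs.conf):

```
# inputs.conf on the forwarder -- stop collecting from this path
[monitor:///var/log]
disabled = true
```

Restart the forwarder (splunk restart) for the change to take effect; remove the line or set it back to false to resume collection later.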
From an alerting perspective, if you are looking simply for the lack of events, you could have a scripted input generate some minor log data every couple of hours. MuS' answer, inserting a lookup table into the mix, is a good way to go that bakes in some flexibility.
At some of my customers I use a lookup table which holds all host names and another field called alert_status, which shows either enabled or disabled. Using this lookup table as an automatic lookup on host=* will provide, for each host, a field containing the alert_status. The customers can edit the lookup table by using the Lookup Editor app: https://splunkbase.splunk.com/app/1724/

Finally, you have to use the field alert_status in your alert searches, so you only search on:

    <this is your alert search> alert_status=enabled | ...
This will only search for hosts with alerts enabled in the lookup file.
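For completeness, the automatic lookup can be wired up in props.conf and transforms.conf; a sketch, assuming the table is saved as host_alert_status.csv with columns host and alert_status (the stanza and file names are placeholders, use whatever fits your environment):

```
# transforms.conf -- register the CSV as a lookup
[host_alert_status]
filename = host_alert_status.csv

# props.conf -- apply it automatically to events
# (scope the stanza to the sourcetypes/hosts your alerts search)
[your_sourcetype]
LOOKUP-alert_status = host_alert_status host OUTPUT alert_status
```

With the automatic lookup in place, alert_status is available in every matching event without adding a lookup command to each search.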
Hope this helps to get you started ...
Thank you for the response.
However, I have a doubt here.
Should all the alerts which I want to disable be mentioned in the lookup?
The problem is that I have many alerts configured and I don't know the origin of these alerts (the Splunk searches).
So if I include the host in the lookup, is that sufficient to disable all the alerts that have been set up for that particular host?
Yes, but remember to modify each search to filter on the lookup field, i.e. alert_status=enabled based on my example. You can name the field whatever fits your environment. It may take some effort to modify all the alert searches once, but it is well worth it in the end.
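As an illustration, the one-time change to each alert search is small; a sketch with a made-up alert (the index, threshold, and lookup file name are assumptions):

```
Before:
    index=app ERROR | stats count by host | where count > 10

After:
    index=app ERROR
    | lookup host_alert_status.csv host OUTPUT alert_status
    | search alert_status=enabled
    | stats count by host | where count > 10
```

One caveat: hosts missing from the lookup get no alert_status at all and would also be filtered out, so either keep every host in the table or invert the filter to NOT alert_status=disabled.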