Hello, I have many Windows machines sending logs through the agent to index=main.
What search can I use, either in a dashboard or in an alert, to detect when one of these machines has stopped sending logs for 24 hours?
Note: I don't have a deployment server.
If you enable forwarder monitoring on either the cluster manager or the Monitoring Console (DMC), it will show you which forwarders are missing or have stopped sending data. There are built-in alerts for exactly this purpose; you just have to enable and configure them.
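If you'd rather roll your own check against Splunk's internal data, forwarder connections are recorded in metrics.log on the receiving indexer. A minimal sketch, assuming the usual `hostname` field in `tcpin_connections` events identifies each forwarder (widen or shrink the thresholds to taste):

```
index=_internal source=*metrics.log* group=tcpin_connections earliest=-7d
| stats latest(_time) as lastSeen by hostname
| where lastSeen < relative_time(now(), "-24h")
| eval lastSeenReadable=strftime(lastSeen, "%c")
```

The 7-day lookback is just so that hosts silent for more than a day still appear in the results; the `where` clause then keeps only those not heard from in the last 24 hours.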
Hi @splunkcol
Try this search:

| tstats latest(_time) as latest where index=main earliest=-24h by host
| eval recent=if(latest > relative_time(now(),"-5m"),1,0), realLatest=strftime(latest,"%c")
| where recent=0

This lists every host that sent data to index=main in the last 24 hours but nothing in the last 5 minutes. For a 24-hour gap, widen the window (e.g. earliest=-48h) and change the threshold to relative_time(now(),"-24h").
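One caveat with a tstats search over a fixed window: a host that has been silent longer than the whole window drops out of the results entirely, so it can never trigger the alert. If you maintain a lookup of expected hosts, you can catch those too. A sketch, where expected_hosts.csv is a hypothetical lookup you maintain with a single `host` column:

```
| inputlookup expected_hosts.csv
| fields host
| join type=left host
    [| tstats latest(_time) as latest where index=main earliest=-24h by host]
| where isnull(latest)
| eval status="no data in the last 24 hours"
```

Any host in the lookup that produced no events in the search window comes back with a null `latest`, which is exactly what the `where` clause keeps.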
Take a look at the excellent TrackMe app
https://splunkbase.splunk.com/app/4621/
This does exactly what you are looking for, and a lot more. It's very easy to use and intuitive, and the developer provides amazingly good documentation for a free application. I have just deployed it at a client and it works beautifully for tracking which indexes, sourcetypes, and hosts have not sent data to Splunk. By default it will alert if data does not arrive for 1 hour, but that is all totally configurable.
NB: I have no connection to the app or the developer, I have just used the app.