Splunk Search

Show hosts that stop reporting logs

splunkcol
Contributor

 

Hello, I have many Windows machines sending logs through the agent to index=main.

What query can I use, either in a dashboard or in an alert, to detect when one of these machines stops sending logs for an interval of 24 hours?

Note: I don't have a deployment server.

 


codebuilder
SplunkTrust

If you enable forwarder monitoring on either the cluster master or the Distributed Monitoring Console (DMC), it will provide information on forwarders that are not reporting in or sending data. There are built-in alerts for exactly this purpose; you just have to enable and configure them.

----
An upvote would be appreciated and Accept Solution if it helps!

inventsekar
Super Champion

Hi @splunkcol 

https://www.splunk.com/en_us/blog/tips-and-tricks/how-to-determine-when-a-host-stops-sending-logs-to...

Check this Splunk search query:

| tstats latest(_time) as latest where index=main earliest=-24h by host
| eval recent = if(latest > relative_time(now(),"-5m"),1,0), realLatest = strftime(latest,"%c")
| where recent=0
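
The query above flags hosts that logged within the last 24 hours but have been quiet for 5 minutes. Since the question asks about a 24-hour silence window, a variant like the following might fit better (the 30-day lookback and field names are illustrative choices, not from the original post; it looks back further than 24 hours so that hosts silent for more than a day still appear in the tstats results):

| tstats latest(_time) as latest where index=main earliest=-30d by host
| eval silentHours = round((now() - latest) / 3600, 1)
| eval lastSeen = strftime(latest, "%Y-%m-%d %H:%M:%S")
| where latest < relative_time(now(), "-24h")
| table host lastSeen silentHours

Saved as an alert on a schedule (e.g. hourly) with the condition "number of results > 0", this will fire whenever any host has not sent an event to index=main in the past 24 hours.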

 

bowesmana
SplunkTrust

Take a look at the excellent TrackMe app

https://splunkbase.splunk.com/app/4621/

This does exactly what you are looking for, and a lot more. It's very easy to use and intuitive, and the developer has amazingly good documentation for a free application. I have just deployed it at a client, and it works beautifully for tracking which indexes, sourcetypes, and hosts have not sent data to Splunk. By default it will alert if data does not arrive for 1 hour, but that is all totally configurable.

NB: I have no connection to the app or the developer; I have just used the app.

 
