
How to write a search to alert if indexers are not receiving data from forwarders?

Path Finder

Hi Team,

How do I write a search to alert me when one of the critical indexers is not receiving data from its sources?

SplunkTrust

If you're on v6.4 you could use the DMC (Distributed Management Console) to monitor your forwarders; if you're on an earlier version, you can do something like this:

index=_internal sourcetype=splunkd destPort!="-" | stats sparkline count by hostname, sourceHost, host, destPort, version | rename hostname as "Forwarder Host Name", sourceHost as "Forwarder IP", host as "Indexer", destPort as "Destination Port", version as "Splunk Forwarder Version", sparkline as "Traffic Frequency" | sort - count
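
A hedged variation on the same _internal data (a sketch of my own, not from the reply above; the 24-hour lookback and 1-hour threshold are assumptions): to alert when a forwarder that was previously connecting goes quiet, track each forwarder's last connection time and flag anything stale:

index=_internal sourcetype=splunkd destPort!="-" earliest=-24h | stats max(_time) as lastSeen by sourceHost | eval age=now()-lastSeen | where age>3600 | convert ctime(lastSeen)

Saved as an alert that triggers when the result count is greater than zero, this fires for any forwarder that has been silent for over an hour.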

Revered Legend

You can use the following query to check whether you're receiving data from a particular source or not.

Assuming your threshold period for not reporting is 1 hour (3600 seconds), run the search below over a time range longer than 1 hour and set up an alert to fire when records are returned.

Using metadata command
If you want to know based on host

| metadata type=hosts index=yourindexNameHere | where host="yourHostNameHere" | eval age=now()-recentTime | where age>3600 | table host recentTime age | convert ctime(recentTime)
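
A related sketch (my addition, not part of the original answer): dropping the host filter makes the same search report every host in the index that has gone quiet, which suits a single alert covering all hosts at once:

| metadata type=hosts index=yourindexNameHere | eval age=now()-recentTime | where age>3600 | table host recentTime age | convert ctime(recentTime)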

For sourcetype, use

| metadata type=sourcetypes index=yourindexNameHere | where sourcetype="yourSourcetypeNameHere" | eval age=now()-recentTime | where age>3600 | table sourcetype recentTime age | convert ctime(recentTime)

For source, use

| metadata type=sources index=yourindexNameHere | where source="yourSourceNameHere" | eval age=now()-recentTime | where age>3600 | table source recentTime age | convert ctime(recentTime)

Using tstats
Just replace sourcetype with any other metadata field that you want to use.

| tstats max(_time) as recentTime WHERE index=yourindexNameHere by sourcetype | where sourcetype="yourSourcetypeNameHere" | eval age=now()-recentTime | where age>3600 | table sourcetype recentTime age | convert ctime(recentTime)
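
A small efficiency variant (my sketch, same logic as above): the sourcetype filter can be pushed into the tstats WHERE clause so the restriction is applied at the index level instead of after the fact:

| tstats max(_time) as recentTime WHERE index=yourindexNameHere sourcetype=yourSourcetypeNameHere by sourcetype | eval age=now()-recentTime | where age>3600 | table sourcetype recentTime age | convert ctime(recentTime)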

Path Finder

@somesoni2: thanks much for the answer. I should have phrased the question better, my bad!
Basically I wanted to check whether critical "indexes" (not "indexers") are receiving data or not.
Do the above answers apply to indexes as well?


Revered Legend

It actually applies to indexes only. If you've set up your indexers in the standard way, all indexers are important. What you want is a very common Splunk monitoring use case. Hope the answer helps.
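
For the multi-index case specifically, a sketch (the index names here are placeholders of my own): splitting tstats by index gives the most recent event time per index, so one search can watch all of your critical indexes:

| tstats max(_time) as recentTime WHERE index=criticalIndex1 OR index=criticalIndex2 by index | eval age=now()-recentTime | where age>3600 | table index recentTime age | convert ctime(recentTime)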


Path Finder

Hi somesoni2, in your statement "run the search below over a time range longer than 1 hour and set up an alert to fire when records are returned", did you mean to set up an alert when records are NOT returned?
