Deployment Architecture

How to identify a list of forwarders sending data to a single Splunk indexer?

Splunker4Life
Explorer

Hi all,

I have been trying to identify a list of the current forwarders that are sending data to our single Splunk indexer. Is there a section within Splunk where I can find this or even a search query?

Thanks in advance.
Anu

1 Solution

lguinn2
Legend

Here is a search that I often use to check on how much data is being sent per hour, by forwarder.

index=_internal source=*metrics.log group=tcpin_connections 
| eval sourceHost=if(isnull(hostname), sourceHost,hostname) 
| rename connectionType as connectType
| eval connectType=case(fwdType=="uf","univ fwder", fwdType=="lwf", "lightwt fwder",fwdType=="full", "heavy fwder", connectType=="cooked" or connectType=="cookedSSL","Splunk fwder", connectType=="raw" or connectType=="rawSSL","legacy fwder")
| eval version=if(isnull(version),"pre 4.2",version)
| rename version as Ver 
| fields connectType sourceIp sourceHost destPort kb tcp_eps tcp_Kprocessed tcp_KBps splunk_server Ver
| eval Indexer= splunk_server
| eval Hour=relative_time(_time,"@h")
| stats avg(tcp_KBps) sum(tcp_eps) sum(tcp_Kprocessed) sum(kb) by Hour connectType sourceIp sourceHost destPort Indexer Ver
| fieldformat Hour=strftime(Hour,"%x %H")

Just copy this search and paste it into your search box, then pick a relatively short time period (like the last 24 hours or less). It should run on Splunk 4.2 or newer; it might work on older versions, but I am not sure.

You could change the stats command if you wanted a slightly different output. For example, replace the last 3 lines with the following to get an overall summary by forwarder, rather than hour by hour statistics:

| stats avg(tcp_KBps) sum(tcp_eps) sum(tcp_Kprocessed) sum(kb) by connectType sourceIp sourceHost destPort Indexer Ver

I originally found this search as part of the Splunk Deployment Monitor. I've been tweaking it ever since.


vladx
New Member

Here is one with the OS versions too (Windows only):

index=_internal sourcetype=splunkd group=tcpin_connections
| stats first(version) by hostname
| rename hostname as host
| join host
    [ search index=windows_desktop sourcetype=WinHostMon vendor_product="*" ]
| rename host as Hostname, vendor_product as OS, Version as "OS Version", first(version) as "Splunk Version"
| table Hostname, OS, "OS Version", "Splunk Version"
| sort "OS Version"

Maybe it is better to use something other than vendor_product, so this is just a quick-and-dirty solution 🙂


vince2010091
Path Finder

Hello, I use this search:

 index=_internal sourcetype=splunkd group=tcpin_connections | stats first(version) by hostname

ytanskiy
Engager

Thank you. Gave me exactly what I needed.

rameshyedurla
Explorer

try this:
index=_internal sourcetype=splunkd destPort!="-"
| stats sparkline count by hostname, sourceHost, host, destPort, version
| rename destPort as "Destination Port", host as "Indexer", sourceHost as "Forwarder IP", version as "Splunk Forwarder Version", hostname as "Forwarder Host Name", sparkline as "Traffic Frequency"
| sort - count

vince2010091
Path Finder

Hello,
You can use an app like Deployment Monitor or S.O.S.
Regards


lguinn2
Legend

Today, I would recommend that you use the Distributed Management Console (DMC). It is built-in and works very well.

But sometimes it is nice to have a search like one of these, which lets you look at the forwarders over any time span and set any criteria you want. That's particularly useful if you have thousands of forwarders and are only interested in a subset of them.
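For instance, to look at only a subset of forwarders, you can add a filter to the base search of the hourly query above. The hostname pattern here is just an illustration; substitute whatever naming convention your forwarders use:

```
index=_internal source=*metrics.log group=tcpin_connections hostname=web-*
| eval sourceHost=if(isnull(hostname), sourceHost, hostname)
| stats avg(tcp_KBps) sum(kb) by sourceHost splunk_server
```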

sanurd
Path Finder

I understand this is a very old post, but I had the same requirement and wanted ONLY the forwarders. I figured out that this search works, so I thought I would post the answer; it might be useful for someone else. It may be a roundabout way of getting it, but it does the job.

| metadata type=hosts
| fields host
| rename host AS splunk_server
| search NOT [| rest splunk_server=local /services/licenser/messages | dedup splunk_server | fields + splunk_server ]
| rename splunk_server AS host


lguinn2
Legend

The trouble with this is that the hosts listed will be the names of the hosts specified in inputs.conf, which might or might not match the actual forwarder names.
By using the _internal index, you see the actual IP address and server name of each forwarder.



kryten
Engager

JoeIII - thanks for this, it was just what I needed to search for forwarders that needed to be upgraded. Just one typo - sourceIP should be sourceIp:

index=_internal source=*metrics.log group=tcpin_connections | eval sourceHost=if(isnull(hostname), sourceHost,hostname) | dedup sourceHost | table sourceHost sourceIp os version | sort version

JoeIII
Path Finder

I just wanted to thank you - I modified your search to help me find out of date forwarders:

index=_internal source=*metrics.log group=tcpin_connections | eval sourceHost=if(isnull(hostname), sourceHost,hostname) | dedup sourceHost | table sourceHost sourceIP os version | sort version

gurulee
Explorer
index=_internal source=*metrics.log group=tcpin_connections | eval sourceHost=if(isnull(hostname), sourceHost,hostname) | dedup sourceHost | table sourceHost sourceIP os version | sort version

Thanks for sharing this, but when we use this search we get duplicate results: each forwarder appears twice, once with its hostname and once with just its IP address. Also, the hostname rows do not populate the IP address column.


sephora_it
Explorer

I used this search (above) and added:

| stats count as vercount by version

As one of our execs wanted to know how many of each version we were running.


krugger
Communicator

I would suggest querying the metadata using this search:

| metadata type="hosts"

This should list the various hosts delivering events to you.

If you just want the Splunk forwarders, you can try the following shell command:
splunk cmd btool inputs list splunktcp

lguinn2
Legend

The trouble with this is that the hosts listed will be the names of the hosts specified in inputs.conf, which might or might not match the actual forwarder names.
By using the _internal index, you see the actual IP address and server name of each forwarder.


datasearchninja
Communicator

If you are forwarding _internal indexes from the forwarders, then the data should all be in the _internal index on your indexer.

Setting forwardedindex.filter.disable = true in outputs.conf on the forwarders would achieve this.
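As a sketch, the relevant outputs.conf stanza on each forwarder might look like the following; the group name and indexer address are placeholders for your own environment:

```ini
# outputs.conf on the forwarder (sketch; server name is a placeholder)
[tcpout]
defaultGroup = primary_indexers
# Disable the default forwarded-index filters so internal indexes
# (_internal, _audit, ...) are forwarded along with everything else.
forwardedindex.filter.disable = true

[tcpout:primary_indexers]
server = splunk-idx.example.com:9997
```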

The deployment monitor app would then show you all forwarders out of the box.


Splunker4Life
Explorer

Thanks for that input. I am not sure whether we are forwarding _internal indexes from the forwarders (I'm fairly new to Splunk and am still learning my way around the software), but I will investigate and try it out.

Cheers


lguinn2
Legend

The universal forwarder does not have indexes, but it does forward its internal logs by default, so the effect is the same. You don't need to do anything to get it.

If you are using a heavy forwarder, you will need to set it to forward rather than index. The following documentation is written for a search head, but the settings for a heavy forwarder are exactly the same:
Best practice: Forward search head data into the indexing layer
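Per that best-practice documentation, the heavy forwarder configuration boils down to an outputs.conf along these lines. This is a sketch; the server names and group name are placeholders:

```ini
# outputs.conf on the heavy forwarder (sketch; server names are placeholders)
[indexAndForward]
# Do not index a local copy; only forward the data to the indexers.
index = false

[tcpout]
defaultGroup = primary_indexers
# Forward the internal logs too, so this forwarder appears in
# _internal-based searches on the indexer.
forwardedindex.filter.disable = true

[tcpout:primary_indexers]
server = idx1.example.com:9997, idx2.example.com:9997
```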
