Deployment Architecture

Heavy Forwarder Not Indexing

Cole-Potter
Loves-to-Learn Lots

My end goal: I would like to be able to leverage our Windows Splunk deployment server / Splunk Enterprise server to receive logs from universal forwarders, alert on events from that Splunk instance, and then forward the logs to Splunk Cloud.

Our current architecture includes Splunk Cloud, which receives events from an Ubuntu forwarder; that forwarder in turn receives logs from syslog and from universal forwarders installed on Windows machines across the network.

Deployment server: I believe this also forwards logs to Splunk Cloud. Some apps required installation on a Splunk Enterprise instance, and we are receiving that data in Splunk Cloud with the deployment server's name in the host field, so I think some of those events are forwarded from the deployment server. I don't think they flow through the Ubuntu server.

I am not exactly sure where to start in figuring this out. I have leveraged Splunk documentation for building source inputs and really thrived off of that, but I have been hammering away at this, making changes to outputs.conf, with no success.

 

It does not appear that any events are being indexed on the Splunk Enterprise/deployment server instance.
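From the docs, my understanding is that an index-and-forward outputs.conf on the heavy forwarder should look roughly like this (a sketch only; the group name and server value are placeholders, not our actual Splunk Cloud settings):

    # outputs.conf on the heavy forwarder (sketch, placeholder values)
    [indexAndForward]
    index = true

    [tcpout]
    defaultGroup = splunk_cloud

    [tcpout:splunk_cloud]
    server = inputs.example.splunkcloud.com:9997

    # any custom local indexes would also need to be defined in indexes.conf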

 

Thank you for your help in advance.


gcusello
SplunkTrust
SplunkTrust

Hi @Cole-Potter ,

probably you're speaking of a Heavy Forwarder (as in the title) and not a Deployment Server, also because a DS that has to manage more than 50 clients must be installed on a dedicated server.

Anyway, it isn't a good idea to index your logs locally, because in that case you have to pay for both Splunk Cloud and a Splunk Enterprise license on the HF.

The best solution is to forward logs through the HF to Splunk Cloud and, on Splunk Cloud, create an alert that monitors your data flows and sends an email if one stops.
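For example, a simple "data flow stopped" alert could be something like this sketch (the index filter and the 60-minute threshold are only placeholders to adapt):

    | tstats max(_time) as last_seen where index=* by host
    | eval minutes_since = round((now() - last_seen) / 60)
    | where minutes_since > 60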

On the HF you have to enable TCPIN inputs to receive logs from the Universal Forwarders (managed by the DS) and syslog inputs.
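As a minimal sketch (9997 is just the conventional receiving port, not a requirement), the receiving side in inputs.conf on the HF would be:

    # inputs.conf on the HF - listen for data from the Universal Forwarders
    [splunktcp://9997]
    disabled = 0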

Just a few little hints, if your resources permit:

  • use two HFs to avoid a Single Point of Failure in your architecture,
  • use a Load Balancer to distribute the syslog data flows between the two HFs and gain HA,
  • use rsyslog on the HFs to receive syslog (reading the files with a Splunk monitor input) rather than Splunk network inputs; see the sketch after this list.

The last hint doesn't depend on resources, so apply it anyway.
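A minimal sketch of that last pattern, with placeholder paths, port, and sourcetype:

    # /etc/rsyslog.d/10-remote.conf on the HF host (legacy rsyslog syntax, placeholder values)
    $ModLoad imudp
    $UDPServerRun 514
    $template PerHostFile,"/var/log/remote/%HOSTNAME%/syslog.log"
    *.* ?PerHostFile

    # inputs.conf on the HF - index the files written by rsyslog
    [monitor:///var/log/remote]
    sourcetype = syslog
    host_segment = 4

This way syslog keeps being written to disk even while Splunk restarts, which is the main reason for the hint.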

Ciao.

Giuseppe


Cole-Potter
Loves-to-Learn Lots

First off, thank you for the response.

We are under 50 clients on the deployment server right now, but that's very helpful information if we decide to expand.

I probably should have been a little more specific regarding the alerting. I am leveraging Splunk Cloud for email alerts; I would like to index logs locally and forward them because I want to be able to kick off local scripts on hosts, which I assumed would have to run locally on the network.

It would be a limited set of inputs I would want to do this with, fewer than five hosts; see the sketch below for what I have in mind.
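What I'm picturing for the script piece is roughly the classic scripted alert action on the local instance, something like this sketch (the search, threshold, schedule, and script name are all made-up placeholders, and I understand custom alert actions are the newer mechanism for this):

    # savedsearches.conf on the local Splunk Enterprise instance (sketch, placeholder values)
    [Block repeated failed logons]
    search = index=wineventlog EventCode=4625 | stats count by src | where count > 20
    cron_schedule = */5 * * * *
    enableSched = 1
    counttype = number of events
    relation = greater than
    quantity = 0
    action.script = 1
    action.script.filename = block_host.sh
    # block_host.sh placed in $SPLUNK_HOME/bin/scripts/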

Do you know what kind of licensing is required to index with Splunk Enterprise on-prem?


isoutamo
SplunkTrust
SplunkTrust

As said, don't use the DS as an HF. Those are separate roles, and they don't work well together in any larger environment.

@livehybrid already posted the Splunk instructions on how to do it. When you use an IHF (intermediate heavy forwarder) you should always use at least two, to avoid an unneeded service break when you need to restart after config changes; see the sketch below.
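As a sketch, the outputs.conf pushed to the UFs would then list both intermediate HFs so the forwarders auto load balance between them (hostnames and port are placeholders):

    # outputs.conf deployed to the UFs - auto load balancing across two intermediate HFs
    [tcpout:intermediate_hfs]
    server = ihf1.example.local:9997, ihf2.example.local:9997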

Then you should consider using async forwarding to spread events evenly across your SCP (Splunk Cloud Platform) indexers.

As @gcusello said, you shouldn't index anything locally when you have SCP. If you really need to, then you must buy a separate Splunk Enterprise license for it.

Also, when you have both Linux and Windows UFs, you must use Linux for the DS if you want to manage both platforms. It's technically impossible to manage Linux UFs correctly from a Windows DS.


gcusello
SplunkTrust
SplunkTrust

Hi @Cole-Potter ,

you need a Splunk Enterprise license; its size depends on the volume of the indexed logs.

For this reason, I suggest avoiding local indexing.

Ciao.

Giuseppe


livehybrid
SplunkTrust
SplunkTrust

Hi @Cole-Potter 

A Splunk Deployment Server should not be used as an indexer or heavy forwarder; its primary role is to manage app deployment to Universal Forwarders.
To receive logs, search, and alert on them before forwarding to Splunk Cloud, you should use a dedicated heavy forwarder or indexer, not the deployment server.

You will also need a license to ingest this data. Do you have an on-premise license in addition to your Splunk Cloud entitlement? 

You might also find the following useful regarding index and forwarding data: https://help.splunk.com/en/splunk-enterprise/forward-and-process-data/forwarding-and-receiving-data/...

🌟 Did this answer help you? If so, please consider:

  • Adding karma to show it was useful
  • Marking it as the solution if it resolved your issue
  • Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing
