Getting Data In

Splunk Architecture for Production

meenal901
Communicator

Hi,
We have 140 production servers, where we are planning to install universal forwarders.
Further, we need to filter the data and send around 50 percent of it to the indexers.
Each production server is producing around 1.5 GB of data.

With this data volume and server count, how many heavy forwarders, indexers, and search heads should we be using?
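For reference, forwarding from the universal forwarders to an intermediate heavy forwarder tier would look roughly like the following outputs.conf sketch on each universal forwarder; the hostnames and the receiving port 9997 are placeholder assumptions:

[tcpout]
defaultGroup = intermediate_hf

[tcpout:intermediate_hf]
server = hf1.example.com:9997,hf2.example.com:9997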


MuS
SplunkTrust

Hi meenal901,

this cannot be answered here; it all depends on your existing infrastructure, your use cases, and other requirements such as how many concurrent searches will run, whether you depend on (near) real-time data, and so on.

As a rule of thumb, take a look at the docs on the recommended reference hardware, which should be good to index about 100 GB/day.
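For a rough back-of-the-envelope check against that figure (assuming your 1.5 GB per server is per day): 140 servers × 1.5 GB/day ≈ 210 GB/day raw, or roughly 105 GB/day after the 50% filter, which is already at the upper end of what a single reference-spec indexer is rated for.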

cheers, MuS

martin_mueller
SplunkTrust

Additionally, the effort to perform the 50% filtering you mentioned depends heavily on how the filters are built. Very simple filters won't have a huge impact, while complex (usually badly built) filters can make your servers grind to a halt.
Therefore it's impossible to say, based on just a few numbers, how many HFs you need, whether it would make sense to use HFs at the sources instead of UFs, whether it would make sense to send 100% to the indexers and filter there (network bandwidth? legal constraints?), and so on.
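To illustrate the difference, a very simple indexing-time filter on a heavy forwarder (or indexer) is just a props.conf/transforms.conf pair like the sketch below; the sourcetype name and the DEBUG pattern are placeholder assumptions:

props.conf:
[my:app:logs]
TRANSFORMS-drop_noise = drop_debug_events

transforms.conf:
[drop_debug_events]
REGEX = \sDEBUG\s
DEST_KEY = queue
FORMAT = nullQueue

One cheap regex test per event like this adds little overhead; the trouble usually starts when many complex regexes are chained and evaluated against every event.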

Schedule a workshop with your local Splunk Partner or Splunk Sales Engineer.
