
What is Splunk's footprint?

Splunk Employee

What is the footprint for a Splunk Forwarder, a Light-Weight Forwarder, and a Splunk Indexer?

1 Solution

Splunk Employee

For indexer questions, see: http://www.splunk.com/wiki/Community:Planning_your_Splunk_deployment

For general discussions of footprint and forwarder footprint, see: http://www.splunk.com/wiki/Deploy:ForwarderBestPractice and http://www.splunk.com/wiki/Community:MinimizingForwarderFootprint

Classically, light forwarders used around 35-38MB of memory at start, possibly growing toward 50MB if the connection to the indexer backs up. A full forwarder's memory use will vary fairly heavily with event size. Other characteristics like CPU are much harder to quantify because they depend heavily upon data volume, data type, processing applied, and so on.
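As a rough way to verify numbers like these on your own host, you can sample a process's resident memory with `ps`. This is a generic sketch, not Splunk-specific tooling; it uses the current shell's PID as a stand-in, and for a real forwarder you would substitute the PID of splunkd:

```shell
# Sample the resident set size (RSS) of a process, in MB.
# We use the current shell's PID ($$) as a stand-in here;
# for a forwarder, replace it with the PID of splunkd.
pid=$$
rss_kb=$(ps -o rss= -p "$pid")
echo "RSS: $((rss_kb / 1024)) MB"
```

Sampling this periodically (e.g., from cron) gives you a baseline and growth trend for your own environment, which is more reliable than any generic figure.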

For this type of question, I would instead create a community page to start rolling up this type of information, and gather what you already have.


Splunk Employee

This is a common question for any software vendor. In general, we should be able to provide a high-level answer, which it looks like you've done below, so thank you.


Splunk Employee

Too vague a question, with far too unbounded an answer. Narrow it down to some specific things.


Contributor

-2 votes? Wow, tough crowd. Seems like a reasonable question to me... hey, let's be nice to the Answers newbies! +1



Splunk Employee

Per your suggestion, I created a Splunk Wiki page regarding this topic.

You can find it here:

http://www.splunk.com/wiki/Community:What_is_splunks_footprint

Please feel free to edit this page and contribute to the overall discussion.


Splunk Employee

Here is a post about one possible aspect of this, memory:
http://blogs.splunk.com/2010/02/03/splunk-memory-use-patterns/

My objection is that folding every type of resource use into one noun presents a false simplicity: the question cannot be answered simply while remaining honestly helpful.

I would ask instead "what resource constraints are typically encountered for a light forwarder, for an indexer, and for a search head?"


Splunk Employee

It is a broad question, but if we could baseline the out-of-the-box (OOTB) footprint for the forwarder, it would go a long way toward setting expectations. Yes, I understand there is variability even for the OOTB forwarder depending on the load and activity on the host server, but average or ballpark numbers would be a good start. Then an overview of how the footprint can be expected to change or grow would help with projections beyond the baseline.


Splunk Employee

Generally speaking, we care about all of it, and we may not necessarily need white papers. Just ranges and estimates or guesstimates: something that gives customers (who will not evaluate Splunk beyond basic functional testing) a sense of what to expect when they deploy Splunk into production, and for expansion and capacity planning. Make sense?


Splunk Employee

The problem is that I don't know what the question is!

Do we care about disk? Memory? CPU? Network utilization? I/O Bandwidth? What are the usage patterns? What sort of scaling information is desired? This could be the topic of 6 whitepapers.


Contributor

Hmm, I think this is a great question to have in here; it's pretty typical of what new Splunk users will need to know. The fact that answering the question requires linking out to another place doesn't mean the question doesn't belong!
