Getting Data In

How can we handle forwarders on highly-utilized servers?

ddrillic
Ultra Champion

We run into all kinds of issues when a forwarder is installed on a highly utilized server, such as a busy Linux database server, because the forwarder competes for the limited resources on these machines. I wonder which alternatives to the forwarder we can use in such cases.


ddrillic
Ultra Champion

Btw, is there a way to limit the amount of memory Splunk uses on a server?
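For example, would an OS-level cap be the right approach? A rough sketch of what I mean, assuming the forwarder runs under systemd with a unit named SplunkForwarder.service (the unit name is a placeholder, and MemoryMax requires systemd 231+; older versions use MemoryLimit):

    # /etc/systemd/system/SplunkForwarder.service.d/memory.conf
    [Service]
    MemoryMax=512M

Or is throttling forwarder throughput in limits.conf with [thruput] maxKBps the closer fit here?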


esix_splunk
Splunk Employee

What are you monitoring on this server with the Forwarder? Are you running the NIX TA to get CPU / Mem / Disk performance metrics? Or are you just reading log files?

If you're collecting metrics from the OS, you might look at collectd as an alternative for collecting the metrics and sending them to Splunk.
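A minimal sketch of that approach, assuming HTTP Event Collector (HEC) is enabled on the Splunk side and the write_http plugin is available in your collectd build; the hostname, port, and token below are placeholders:

    # /etc/collectd/collectd.conf (fragment)
    LoadPlugin cpu
    LoadPlugin memory
    LoadPlugin df
    LoadPlugin write_http

    <Plugin write_http>
      <Node "splunk_hec">
        # Send metrics to Splunk's HEC raw endpoint
        URL "https://splunk.example.com:8088/services/collector/raw"
        Header "Authorization: Splunk 00000000-0000-0000-0000-000000000000"
        Format "JSON"
        VerifyPeer false
      </Node>
    </Plugin>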

If you're just reading log files, you could set up syslog to forward those logs to another server and then use a forwarder there to read the files.
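A minimal sketch of the syslog side, assuming rsyslog on the busy server and a collection host called logcollector.example.com (a placeholder):

    # /etc/rsyslog.d/50-forward.conf on the busy DB server
    # @@host = forward over TCP; a single @ would use UDP
    *.* @@logcollector.example.com:514

A forwarder installed on logcollector.example.com then reads the files rsyslog writes there, keeping the Splunk footprint off the busy server.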

ddrillic
Ultra Champion

-- If you're just reading log files, you could set up syslog to forward those logs to another server and then use a forwarder there to read the files.

Very interesting.


dedwards93
New Member

Use as few wildcards as possible in your inputs.conf. The more wildcards you use to tell the forwarder where to search for logs, the more resources it needs to walk the file system.

In particular, avoid the recursive /.../ wildcard as much as you can. This will reduce the amount of resources the forwarder requires.
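A sketch of the difference, with hypothetical paths:

    # Expensive: the ... wildcard makes the forwarder walk everything under /opt
    [monitor:///opt/.../*.log]

    # Cheaper: point the forwarder at the specific directories you need
    [monitor:///opt/myapp/logs]
    [monitor:///var/log/messages]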
