Getting Data In

High memory/CPU usage by Splunk Universal Forwarder

NReddy12
Loves-to-Learn Lots

I've installed Splunk Universal Forwarder 9.1.0 on a Linux server and configured batch mode for monitoring data log files. We monitor several different types of logs with different filenames. We observed very high CPU/memory consumption by the splunkd process when there are many input log files to monitor (approximately > 1000K). All of the input data log files are new, and the total number of events ranges from 10 to 300.

A few metrics logs:

{"level":"INFO","name":"splunk","msg":"group=tailingprocessor, ingest_pipe=1, name=batchreader1, current_queue_size=0, max_queue_size=0, files_queued=0, new_files_queued=0","service_id":"infra/service/ok6qk4zudodbld4wcj2ha4x3fckpyfz2","time":"04-08-2024 20:33:20.890 +0000"}
{"level":"INFO","name":"splunk","msg":"group=tailingprocessor, ingest_pipe=1, name=tailreader1, current_queue_size=1388185, max_queue_size=1409382, files_queued=18388, new_files_queued=0, fd_cache_size=63","service_id":"infra/service/ok6qk4zudodbld4wcj2ha4x3fckpyfz2","time":"04-08-2024 20:33:20.890 +0000"}


Please advise whether there is any configuration tuning that can limit the number of files to be monitored.
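For reference, the stanza below sketches one way to shrink the set of files the forwarder has to track (the path and age threshold are hypothetical; `ignoreOlderThan` and `whitelist` are standard inputs.conf settings for monitor inputs):

```ini
# inputs.conf -- hypothetical example
[monitor:///var/log/myapp]
# Skip files whose modification time is older than one day,
# so the tailing processor stops tracking stale files.
ignoreOlderThan = 1d
# Only monitor files matching this regex.
whitelist = \.log$
```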


PickleRick
SplunkTrust

Wait, wait, wait. Do you mean that your UF has to keep track of over a million files? That can have a huge memory footprint. Also, polling directories containing that many files is intensive, and not much tuning can help here.

Side note - are you sure you need to use batch input? You're showing events from tailingprocessor which is used with monitor inputs.
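For comparison, a true batch input uses a `[batch://...]` stanza with `move_policy = sinkhole`, which indexes each file once and then deletes it, so the forwarder never has to keep tracking it (the path is hypothetical):

```ini
# inputs.conf -- hypothetical example
# Batch (sinkhole) input: each file is indexed once and then DELETED.
[batch:///var/log/myapp/done]
move_policy = sinkhole
```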


JohnEGones
Communicator

Check your inputs.conf and ensure the stanzas are properly configured to monitor only the files that you want. Specifically, you can adjust the allow and block lists:

[monitor://<path>]
whitelist = <regex>
blacklist = <regex>
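For instance, a hypothetical stanza that monitors only current application logs and excludes rotated or compressed copies might look like:

```ini
# inputs.conf -- hypothetical example
[monitor:///var/log/myapp]
# Include only active .log files.
whitelist = \.log$
# Exclude rotated/compressed copies (e.g. app.log.1, app.log.gz).
blacklist = \.(gz|bz2|\d+)$
```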

That aside, I strongly encourage you to follow Giuseppe's advice and contact your Splunk admin to open a case on your behalf.


NReddy12
Loves-to-Learn Lots

@gcusello, I don't have access to open a case with Splunk Support.

It would be much appreciated if someone could explain how to limit the number of monitored files and control the memory consumption.


gcusello
SplunkTrust

Hi @NReddy12,

I never experienced this behavior on a Linux server.

The only hint I can give is to open a case with Splunk Support, sending them a diag of your Universal Forwarder.

Ciao.

Giuseppe
