Splunk Enterprise

Resource issues with indexing and queues

d_lim
Path Finder

Hi all, I am facing an issue where logs from the UFs appear to be delayed or not received at all.

[Screenshot attachment: d_lim_1-1613984409525.png — current queue status]

The screenshot shows the current queue status. We have raised the indexer's parallelIngestionPipelines setting from 1, to 2, and now to 3.
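For reference, this setting lives in server.conf on the indexer. A minimal sketch of the stanza, using the value the post describes:

```ini
# server.conf on the indexer (value shown matches the post's current
# setting; each pipeline adds its own set of queues and threads, so
# raising it further will not help while disk writes are the bottleneck)
[general]
parallelIngestionPipelines = 3
```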

On the indexer, the indexing queue is always full, and the resulting back-pressure is blocking the two HFs feeding it.

About 16 intermediate forwarders send to HF001, while HF002 mainly makes API calls to pull data.

The indexer's IOPS are around 1,600, CPU usage is about 50%, and memory usage is 31%.
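To see which queue is the actual choke point (rather than just the symptom), a common starting point is to chart queue fill ratios from the indexer's own metrics.log. A sketch of such a search, run against the indexer:

```
index=_internal source=*metrics.log* group=queue
| eval fill_pct=round(100 * current_size_kb / max_size_kb, 1)
| timechart span=5m perc90(fill_pct) by name
```

If indexqueue sits near 100% while the parsing/aggregation/typing queues back up behind it, that points at disk rather than parsing load.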

Any recommendations on how to improve this, e.g. an additional indexer? Thanks.

1 Solution

richgalloway
SplunkTrust

A full indexing queue means the act of writing to disk is taking too long.  Adding pipelines just makes that worse by creating more threads that try to write to disk.  Something in the storage system is causing delays, and correcting that problem should alleviate the queue problem.
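One quick way to sanity-check the storage hypothesis, independent of Splunk, is to time synchronous writes on the volume that hosts the hot/warm buckets. A minimal Python sketch (the probe path and sizes are illustrative assumptions, not a Splunk tool):

```python
import os
import tempfile
import time

def probe_write_latency(path, writes=10, block=1024 * 1024):
    """Average the time taken by `writes` synchronous 1 MB writes
    under `path`. High or wildly varying averages suggest the disk
    subsystem is the bottleneck behind the full indexing queue."""
    data = os.urandom(block)
    fd, tmp = tempfile.mkstemp(dir=path)
    try:
        latencies = []
        with os.fdopen(fd, "wb") as f:
            for _ in range(writes):
                t0 = time.perf_counter()
                f.write(data)
                f.flush()
                os.fsync(f.fileno())  # force the write through to disk
                latencies.append(time.perf_counter() - t0)
        return sum(latencies) / len(latencies)
    finally:
        os.remove(tmp)

if __name__ == "__main__":
    # Point this at the indexer's hot/warm volume in practice.
    avg = probe_write_latency(tempfile.gettempdir())
    print(f"avg 1MB sync write: {avg * 1000:.1f} ms")
```

Tools like iostat give the same signal with less effort; the point is to measure write latency on the bucket volume directly rather than inferring it from queue graphs.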

---
If this reply helps you, Karma would be appreciated.

