Getting Data In

Queues are getting blocked

ips_mandar
Builder

I have one heavy forwarder and one indexer + search head. On the heavy forwarder I am monitoring a high volume of zip files and routing events between the indexQueue and the nullQueue to cut down the number of indexed logs and reduce license cost.
I am seeing multiple queues blocked, especially the aggQueue (very frequently). I have increased the queue sizes, but it doesn't help much.
My heavy forwarder specifications: Windows OS, 64 GB RAM, 4 cores.

I am also using the following settings in props.conf: LINE_BREAKER, SHOULD_LINEMERGE, TIME_PREFIX, MAX_TIMESTAMP_LOOKAHEAD, MAX_DAYS_AGO, TIME_FORMAT.
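For reference, a minimal props.conf sketch using those settings; the sourcetype name (zip_logs) and the regex/format values are placeholders, not taken from the post:

[zip_logs]
LINE_BREAKER = ([\r\n]+)
SHOULD_LINEMERGE = false
TIME_PREFIX = ^
MAX_TIMESTAMP_LOOKAHEAD = 25
MAX_DAYS_AGO = 30
TIME_FORMAT = %Y-%m-%d %H:%M:%S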
1. I know Splunk reads zip files single-threaded, so will adding more cores reduce the queue blockage?
2. Also, which queue is used to transfer data from the heavy forwarder to the indexer for indexing, so that I can check whether that queue is getting full? (See the search sketch after this list.)
3. What can be done to resolve the queue blockage? All queues are getting blocked on the heavy forwarder side only.
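For reference on question 2: data leaving a heavy forwarder passes through the output (tcpout) queue, and queue fill levels are logged in metrics.log on the forwarder. A minimal search sketch for checking them, assuming the heavy forwarder's host name is hf01 (a placeholder):

index=_internal source=*metrics.log* host=hf01 group=queue
| eval fill_pct = round(current_size_kb / max_size_kb * 100, 1)
| timechart span=5m max(fill_pct) by name

If the tcpout queue is the first to fill, the bottleneck is downstream (network or indexer); if the parsing/aggregation queues fill while tcpout stays low, the bottleneck is on the heavy forwarder itself.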

Thanks,

0 Karma

HiroshiSatoh
Champion

The role of each queue is described here:

https://wiki.splunk.com/Community:HowIndexingWorks

Because ingestion runs through a single pipeline, the queues back up when large logs are processed. Simply adding more cores will not, by itself, bring much improvement.

Splunk can run multiple ingestion pipelines in parallel. See the link below.

https://conf.splunk.com/files/2016/slides/harnessing-performance-and-scalability-with-parallelizatio...

Parallelization is limited to two pipelines; engaging Splunk Professional Services is required before setting it to three.
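Assuming what is meant here is Splunk's parallel ingestion pipelines, a minimal server.conf sketch on the heavy forwarder (two pipelines, as noted above; a restart is required and each pipeline consumes its own CPU and memory):

[general]
parallelIngestionPipelines = 2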

0 Karma