Getting Data In

Queues are getting blocked

ips_mandar
Builder

I have one heavy forwarder and one indexer + search head. I am monitoring a high volume of zip files on the heavy forwarder and routing events between the indexQueue and nullQueue to cut down the number of indexed logs and reduce license cost.
I am getting multiple queues blocked, especially the aggQueue (a high number of times). I have increased the queue sizes, but it doesn't help much.
My heavy forwarder's specs: Windows OS, 64 GB RAM, 4 cores.

I am also using the following settings in props.conf: LINE_BREAKER, SHOULD_LINEMERGE, TIME_PREFIX, MAX_TIMESTAMP_LOOKAHEAD, MAX_DAYS_AGO, TIME_FORMAT.
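For reference, a stanza using those settings might look like the sketch below. The sourcetype name and all the values are placeholders, not the poster's actual configuration. Note that with an explicit LINE_BREAKER, setting SHOULD_LINEMERGE = false skips the line-merging (aggregator) stage, which is exactly the stage that feeds the aggQueue:

```ini
# props.conf on the heavy forwarder -- illustrative sketch only;
# the sourcetype name "my_zip_logs" and the regex/timestamp values
# are placeholders, not the poster's real settings.
[my_zip_logs]
LINE_BREAKER = ([\r\n]+)
SHOULD_LINEMERGE = false
TIME_PREFIX = ^\[
TIME_FORMAT = %Y-%m-%d %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 19
MAX_DAYS_AGO = 7
```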
1. I know Splunk processes a zip file single-threaded, so will adding cores reduce the queue blockage?
2. Which queue is used to transfer data from the heavy forwarder to the indexer, so that I can check whether that queue is getting full?
3. What can be done to resolve the queue blockage? All queues are getting blocked on the heavy forwarder side only.

Thanks,


HiroshiSatoh
Champion

You can see the role of queues here.

https://wiki.splunk.com/Community:HowIndexingWorks

Because a single ingestion pipeline runs as one process, the queues back up while it is working through large logs. Simply adding cores cannot be expected to improve this, since that one pipeline cannot use them.
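You can see which queues are saturating from metrics.log on the heavy forwarder. A search along these lines (a sketch, assuming _internal data from the forwarder is searchable, with `<your_hf>` as a placeholder hostname) charts the fill percentage of each queue:

```
index=_internal source=*metrics.log* group=queue host=<your_hf>
| eval fill_pct = round(current_size_kb / max_size_kb * 100, 1)
| timechart span=5m max(fill_pct) by name
```

Queues sitting near 100% are the bottleneck; queues downstream of the bottleneck will look empty, and queues upstream will also show as full.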

Splunk can run multiple ingestion pipelines in parallel. See the link below.

https://conf.splunk.com/files/2016/slides/harnessing-performance-and-scalability-with-parallelizatio...

Parallelization is limited to two pipeline sets; engaging Professional Services is required before setting it to three or more.
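Pipeline parallelization is configured in server.conf. A minimal sketch, assuming spare cores are available (each pipeline set consumes its own CPU and memory):

```ini
# server.conf on the heavy forwarder -- requires a Splunk restart.
# Each additional pipeline set uses its own cores and memory,
# so only raise this when idle cores are available.
[general]
parallelIngestionPipelines = 2
```

Note that a single zip file is still processed by one pipeline; parallel pipelines help when many files are being ingested concurrently.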
