Getting Data In

Universal Forwarder ParsingQueue KB Size

Communicator

Hi All,

We use a Splunk Universal Forwarder to monitor WebSphere log files in our environment. During normal daily operations, the Forwarder and Indexer keep up with the volume of log data.

However, when we have system/application problems on our WebSphere servers, they generate very large error messages (stack traces) which fill up the logs.

When this happens, current_size_kb for the parsingQueue on the Forwarder fills up (visible in metrics.log), and all our logs are delayed. This is the worst possible time for it to happen, as we use Splunk to troubleshoot the problems and can't see the latest log data.

This is what shows in the metrics.log on the Forwarder:

01-13-2012 11:55:16.509 +1100 INFO  Metrics - group=queue, name=parsingqueue, max_size_kb=6144, current_size_kb=6085, current_size=100, largest_size=103, smallest_size=87
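(For anyone wanting to see the trend rather than grep metrics.log by hand, a search along these lines against the _internal index — assuming the forwarder's internal logs reach the indexer, which is the default — charts the queue fill level over time; replace <your_forwarder> with the forwarder's hostname:

index=_internal source=*metrics.log* group=queue name=parsingqueue host=<your_forwarder> | timechart max(current_size_kb) AS current_kb max(max_size_kb) AS max_kb
)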

I've confirmed that none of the queues on the Indexer are filling up, and none of the other Forwarder queues are being blocked or filled.

The parsingQueue never actually shows blocked=true; we just get the message "Could not send data to output queue (parsingQueue), retrying..." repeatedly.

It definitely appears to be the max_size_kb limit on that queue that is causing it, and I suspect it's purely because of the size of the error messages we're getting (some of them over 1,000 lines).

Is there any way to increase the max_size_kb value? I can't find a setting in limits.conf or any other file that affects it. There must be a way to improve performance when large log messages are being written.

I've also reviewed what parsing we're actually doing on the forwarder, and it's very little: there are no regexes, and the only thing we do in props.conf is set the sourcetype based on the filename.
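For reference, the props.conf stanza is just something like this (the path and sourcetype name here are illustrative, not our real ones):

[source::.../SystemOut.log]
sourcetype = websphere_systemout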

Any help is appreciated.

Thanks,

Ash

1 Solution

Communicator

For anyone who's interested, I ended up logging a support call for this, and they provided the following:

You can add the entries below to the $SPLUNK_HOME/etc/system/local/server.conf file to increase the parsing queue size:

[queue=parsingQueue]
maxSize = 10MB
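For completeness, the stanza sits in server.conf like this, and the forwarder needs a restart to pick it up (10MB is just what support suggested for us — size it to your burst volume):

# $SPLUNK_HOME/etc/system/local/server.conf
[queue=parsingQueue]
maxSize = 10MB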

Cheers,
Ash


Engager

Thanks Ash! This [queue=parsingQueue] setting helped!