I have configured the AMQP modular data input to consume data from a RabbitMQ queue. I have increased the maximum heap from 64M up to 1024M and I still hit the heap limit, after which streaming to Splunk stops until the data input is disabled and then re-enabled.
Are we hitting a memory leak? Is there a way I can keep streaming the data in without hitting the heap issue?
ERROR ExecProcessor - message from "python /local/mnt/splunk/etc/apps/amqp_ta/bin/amqp.py" Exception in thread "Thread-3" java.lang.OutOfMemoryError: GC overhead limit exceeded
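For reference, this is roughly how I understand the JVM settings involved, a minimal sketch using standard HotSpot flags; exactly where the AMQP TA reads these options from is an assumption on my part, and the dump path below is hypothetical:

    -Xmx1024m                                  # maximum heap, raised from the original 64M
    -XX:+HeapDumpOnOutOfMemoryError            # write a heap dump when the OutOfMemoryError occurs
    -XX:HeapDumpPath=/tmp/amqp_ta_heap.hprof   # hypothetical location for the dump, for leak analysis

My thinking is that inspecting such a dump in a heap-analysis tool should show whether messages are accumulating in memory faster than they are being indexed, i.e. whether this is a genuine leak or just sustained back-pressure.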