
Issue with rest.py not closing and running out of memory

Explorer

I'm seeing an issue when configuring my input with OAuth2: rest.py does not exit after it runs, eventually exhausting memory and crashing Splunk.

Has anyone else experienced this problem and know how to fix it?

root 9755 1 0 12:47 ? 00:00:01 python /opt/splunk/etc/apps/rest_ta/bin/rest.py
root 9768 1 0 12:47 ? 00:00:00 python /opt/splunk/etc/apps/rest_ta/bin/rest.py
root 10128 1 0 14:47 ? 00:00:00 python /opt/splunk/etc/apps/rest_ta/bin/rest.py
root 10134 1 0 14:47 ? 00:00:01 python /opt/splunk/etc/apps/rest_ta/bin/rest.py
root 10140 1 0 14:47 ? 00:00:00 python /opt/splunk/etc/apps/rest_ta/bin/rest.py
root 10146 1 0 14:47 ? 00:00:01 python /opt/splunk/etc/apps/rest_ta/bin/rest.py
root 26442 1 0 13:47 ? 00:00:00 python /opt/splunk/etc/apps/rest_ta/bin/rest.py
root 26449 1 0 13:47 ? 00:00:01 python /opt/splunk/etc/apps/rest_ta/bin/rest.py
root 28087 1 0 15:47 ? 00:00:00 python /opt/splunk/etc/apps/rest_ta/bin/rest.py
root 28093 1 0 15:47 ? 00:00:01 python /opt/splunk/etc/apps/rest_ta/bin/rest.py
root 28105 1 0 15:47 ? 00:00:00 python /opt/splunk/etc/apps/rest_ta/bin/rest.py
root 28118 1 0 15:47 ? 00:00:01 python /opt/splunk/etc/apps/rest_ta/bin/rest.py
root 28123 9306 0 15:47 ? 00:00:00 /bin/sh -c python /opt/splunk/etc/apps/rest_ta/bin/rest.py
root 28124 28123 0 15:47 ? 00:00:00 python /opt/splunk/etc/apps/rest_ta/bin/rest.py
root 28129 9306 0 15:47 ? 00:00:00 /bin/sh -c python /opt/splunk/etc/apps/rest_ta/bin/rest.py
root 28130 28129 0 15:47 ? 00:00:01 python /opt/splunk/etc/apps/rest_ta/bin/rest.py


Re: Issue with rest.py not closing and running out of memory

Engager

I'm having the exact same issue. The other thing I notice is that the number of events in my internal log grows exponentially each time my input's cron schedule fires.

Do you also have the AWS TA installed (Splunk_TA_aws)? If so, try this query:

host=inserthostnamehere index="_internal" | stats count by message

I think there may be some kind of incompatibility between the two apps. These are some of the values I get... and they start going up into the thousands after my rest_ta input has run a few times.

interval: 30000 ms
interval: 60000 ms
interval: run once
New scheduled exec process: /opt/splunk/bin/splunkd instrument-resource-usage
New scheduled exec process: python /opt/splunk/etc/apps/Splunk_TA_aws/bin/aws_cloudwatch.py
New scheduled exec process: python /opt/splunk/etc/apps/Splunk_TA_aws/bin/aws_config_rule.py
New scheduled exec process: python /opt/splunk/etc/apps/Splunk_TA_aws/bin/aws_description.py
New scheduled exec process: python /opt/splunk/etc/apps/Splunk_TA_aws/bin/aws_inspector.py
New scheduled exec process: python /opt/splunk/etc/apps/Splunk_TA_aws/bin/aws_kinesis.py
New scheduled exec process: python /opt/splunk/etc/apps/Splunk_TA_aws/bin/aws_s3.py
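
These messages come from splunkd's internal log, so narrowing the search to that sourcetype and message text can make the growth easier to track over time (a sketch; the exact filters may need adjusting on your install):

    index="_internal" sourcetype=splunkd "New scheduled exec process" | timechart count by message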

If you find anything, please share.


Re: Issue with rest.py not closing and running out of memory

Engager

Changing rest.py as follows fixed it for me:

From:

    while True:

        if polling_type == 'cron':
            next_cron_firing = cron_iter.get_next(datetime)
            while get_current_datetime_for_cron() != next_cron_firing:
                time.sleep(float(1))

To:

    if True:

Explanation:
Splunk already fires off the script every 5 minutes (by default; see "interval" in inputs.conf). rest.py tries to implement its own scheduling on top of that, ignoring Splunk's.
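
For reference, the per-input interval is set on the input's stanza in inputs.conf; a minimal sketch (the stanza name here is illustrative, not from the app's docs):

    [rest://my_oauth_input]
    interval = 300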

It seems rest_ta implements its own "polling_interval", and the loop ("while True:") never exits. So every 5 minutes Splunk spawns a new instance of rest.py, and none of the old instances ever exit.

By changing "while True:" to "if True:", you remove the loop without having to re-indent the body. I removed the cron logic too, because if the current time ever passes the next firing time without matching it exactly, that inner loop also never exits. If you want to control the timing, add an "interval" setting to the rest input stanza in /opt/splunk/etc/apps/search/local/inputs.conf.
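
The run-once pattern this edit produces can be sketched as follows (a hypothetical simplification; `poll_endpoint` stands in for whatever rest.py actually does per run):

```python
from datetime import datetime

def poll_endpoint():
    """Stand-in for the real work rest.py does on each run
    (hypothetical; the actual script calls the configured REST endpoint)."""
    return datetime.now()

def run_once():
    """Poll a single time and return, so the process exits and
    Splunk's 'interval' setting in inputs.conf drives the schedule."""
    # Equivalent to the edit above: "while True:" became "if True:",
    # so the body runs exactly once instead of looping forever.
    if True:
        return poll_endpoint()

result = run_once()  # the process can now exit; no lingering instance
```

Because the script returns instead of sleeping in its own loop, each invocation terminates and the process table stays clean.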
