Splunk SOAR (f.k.a. Phantom)

Retention Policy for containers in Phantom

ansusabu
Communicator

Is there a retention policy for containers in Phantom? When there is a huge number of containers in Phantom, it consumes a lot of memory and playbooks won't run.

1 Solution

cblumer_splunk
Splunk Employee

Here's where you can find the phantom crontab:
sudo cat /etc/cron.d/phantom

A script like this could be used to backup & purge:

import os
import requests
from datetime import date
from dateutil.relativedelta import relativedelta
from subprocess import Popen, PIPE

# backup phantom to nfs share
os.system("sudo phenv python2.7 /opt/phantom/bin/backup.pyc --all --backup-path /freenas_nfs/")

# query for containers to purge (closed and created more than a month ago)
host = '127.0.0.1'
token = 'xxxxKnUNHLoXzBPxDTOWSwVcpWGuOwMYfZARBMlscnw='
headers = {"ph-auth-token": token}
# disable certificate warnings for self signed certificates
requests.packages.urllib3.disable_warnings()

# containers created before this date (one month ago) are eligible for purging
cutoff_date = date.today() - relativedelta(months=1)

r = requests.get('https://{}/rest/container?_filter_status="closed"&_filter_create_time__lt="{}"&page_size=0'.format(host, cutoff_date),
                 headers=headers, verify=False)
containers = r.json().get('data')

ids = []

for i in containers:
    c_id = i.get('id')
    ids.append(c_id)

id_csv = ','.join(map(str, ids))

# delete containers, answering 'y' to the confirmation prompt (skip if nothing matched the filter)
if ids:
    del_script = Popen(['sudo', 'phenv', 'python2.7', '/opt/phantom/bin/delete_containers.pyc', '-i', id_csv],
                       stdin=PIPE, stdout=PIPE)
    del_script.communicate(input='y')
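
If you want to sanity-check the filter before the job actually deletes anything, a minimal dry-run sketch along the same lines (same assumptions as the script above: local REST endpoint, an automation user's ph-auth-token, a one-month cutoff, and the standard container id and name fields) could just print what would be purged:

import requests
from datetime import date
from dateutil.relativedelta import relativedelta

# suppress warnings for the self-signed certificate, as in the purge script
requests.packages.urllib3.disable_warnings()

host = '127.0.0.1'
token = 'your-automation-user-token'  # replace with a real ph-auth-token
headers = {"ph-auth-token": token}
cutoff_date = date.today() - relativedelta(months=1)

# same filter as the purge script, but only report the matches
r = requests.get('https://{}/rest/container?_filter_status="closed"&_filter_create_time__lt="{}"&page_size=0'.format(host, cutoff_date),
                 headers=headers, verify=False)
for c in r.json().get('data') or []:
    print('would delete container {} ({})'.format(c.get('id'), c.get('name')))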


cblumer_splunk
Splunk Employee

Example cron entry:

# Run the purge script every day at 2:15 AM
15 2 * * * phenv python2.7 /opt/phantom/bin/Purge_Containers.py
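
If this entry goes into a file under /etc/cron.d (rather than a user crontab via crontab -e), note that /etc/cron.d lines also take a user field, and cron runs with a minimal PATH, so spelling out the path to phenv is safer. A hedged variant, assuming a default /opt/phantom install and a script saved as Purge_Containers.py:

# /etc/cron.d/phantom_purge: run the purge script every day at 2:15 AM as root
15 2 * * * root /opt/phantom/bin/phenv python2.7 /opt/phantom/bin/Purge_Containers.py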

cblumer_splunk
Splunk Employee

It is by design that the platform does not impose a data retention policy. However, scripts are provided that allow an admin to create backups and purge containers from the database. If you want to automate a purging job, you can add a line to the crontab config on the Phantom server to stop the services, run a delete job, and restart the services on a defined schedule, as sketched below.
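
As a rough sketch of that automation (assuming a default install that provides /opt/phantom/bin/stop_phantom.sh and /opt/phantom/bin/start_phantom.sh, and a purge script saved as /opt/phantom/bin/Purge_Containers.py; adjust paths, user, and schedule to your environment), a single crontab line could chain the three steps:

# stop Phantom services, run the purge job, then restart, every Sunday at 3:00 AM
0 3 * * 0 root /opt/phantom/bin/stop_phantom.sh && /opt/phantom/bin/phenv python2.7 /opt/phantom/bin/Purge_Containers.py && /opt/phantom/bin/start_phantom.sh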

Backup Script:
https://my.phantom.us/kb/62/

Warm Standby replication method:
https://my.phantom.us/kb/64/

Delete Containers Script:
https://my.phantom.us/kb/80/

ansusabu
Communicator

Where is this crontab config?
