Splunk SOAR

How to pass data from one playbook to its subplaybook

ansusabu
Communicator

How can we pass data from one playbook to its subplaybook?

diogofgm
SplunkTrust

What do you mean by playbook and subplaybook? Is this regarding Ansible?

------------
Hope I was able to help you. If so, some karma would be appreciated.

cblumer_splunk
Splunk Employee

Phantom playbooks. It is recommended to use a modular design (parent and child playbooks) when creating SOAR use cases with Phantom.

https://my.phantom.us/4.5/docs/vpe/editor#playbook
https://my.phantom.us/4.5/docs/automation/api_playbook#playbook

My answer will be posted here shortly.


diogofgm
SplunkTrust

Sorry, it was not tagged as Phantom. xP

------------
Hope I was able to help you. If so, some karma would be appreciated.

cblumer_splunk
Splunk Employee

There are two good methods for passing data between playbooks in Phantom, and many other ways to use external databases, services, or APIs to pass data between playbooks and containers. These are the built-in methods:

Phantom App - 'Add Artifact' action
https://my.phantom.us/4.5/docs/app_reference/phantom_phantom#add-artifact
https://my.phantom.us/4.5/docs/automation/api_container#add_artifact

This method can be used to store (JSON-formatted) data in a newly created Artifact, either in the Container the playbook is currently executing on or in any other Container on the local system, as long as its ID is known. You can also create a new Container on any other connected Phantom instance, as long as the required App Asset is created and configured with the connection details on the local instance. Best used for data that should be stored long-term.
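
For illustration, here is a minimal sketch of what adding such an artifact can look like in playbook code, assuming the phantom.add_artifact() automation API linked above; the exact keyword arguments and the example field names are illustrative and may vary between Phantom versions:

import phantom.rules as phantom

def share_data_via_artifact(container):
    # JSON-formatted data the sub-playbook should be able to read later
    shared = {'destinationAddress': '10.1.1.5', 'note': 'passed from parent playbook'}

    success, message, artifact_id = phantom.add_artifact(
        container=container,          # current container, or the ID of any local container
        cef_data=shared,              # ends up under artifact:*.cef.* datapaths
        label='event',
        name='parent_playbook_data',
        severity='low',
        artifact_type='shared data')

    phantom.debug('add_artifact: success={}, message={}, id={}'.format(success, message, artifact_id))
    return artifact_id

A sub-playbook running on the same container can then read the values back with phantom.collect2() against an artifact:*.cef.* datapath.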

phantom.save_object()
https://my.phantom.us/4.5/docs/automation/api_data_management#save_object

"This API allows you to save data by key and/or container_id and/or playbook name, to be retrieved when executing playbooks on containers. Users can save a key and value pair along with the context of container id and/or playbook name."

Best used for data that can be deleted when the given container is closed, per the 'clear_object' field:

"[optional] Defaults to False. If set True, the data will be deleted when the container is closed. The clear_object can be used to delete the data. If set True, container_id must be provided."

ankitsync
Explorer

Hi there,

I have implemented the artifact-creation way of sharing data with multiple playbooks on the same container. It works perfectly when run in debug mode, with both scope 'new' and 'all'. But when the playbook is made active, the actions in the playbook can't read those artifacts, as the default scope when a playbook is set to active is 'new'. Please help: how can we set the scope to 'all' for the entire playbook when it is in active mode?

BR,

Ankit


phanTom
SplunkTrust

@ankitsync the only way to change the scope of a playbook is to run it using REST, as you can set the scope in the call:

https://docs.splunk.com/Documentation/Phantom/4.9/PlatformAPI/RESTPlaybook 

There is also the option of setting the scope at the point of collection (collect2, etc.) in the other playbook by adding scope='all' to the relevant call. Check the docs for which calls have this capability; collect2, format, condition, and decision do, among many others.

https://docs.splunk.com/Documentation/Phantom/4.9/PlaybookAPI/DataAccessAPI#collect2 

NOTE: Changing the scope setting in a default API call will change the block to a 'custom' block, so be aware of that when/if using this method.
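
To illustrate the second option, here is a hedged sketch of a collect2() call with the scope widened to every artifact on the container; note that, per the caveat above, editing a generated block this way turns it into a 'custom' block:

import phantom.rules as phantom

def read_all_artifacts(container):
    # scope='all' reads every artifact on the container, not just those
    # added since this playbook run started (the default 'new' scope)
    results = phantom.collect2(
        container=container,
        datapath=['artifact:*.cef.destinationAddress', 'artifact:*.id'],
        scope='all')

    for destination_address, artifact_id in results:
        phantom.debug('artifact {}: {}'.format(artifact_id, destination_address))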

ankitsync
Explorer

Hi, thanks for sharing. I did research both these methods. The custom parameter addition to blocks is something I have kept in mind as a last resort. Is there a way to make that REST call from another playbook?


phanTom
SplunkTrust

@ankitsync you can either:

1. Use the phantom APIs phantom.build_phantom_rest_url() and phantom.requests to run REST calls in Phantom playbooks. You can use these in either of the custom function types.
https://docs.splunk.com/Documentation/Phantom/4.10/PlaybookAPI/SessionAPI 

2. You can use the HTTP app in playbooks to talk to the Phantom REST API. This is sometimes better as no custom code is required, but it can require a few format blocks to build the URL and any payload for POST actions, and it needs an asset configured for the app.

 https://my.phantom.us/4.10/docs/app_reference/phantom_http 
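
Pulling the two answers together, here is a rough sketch of option 1: launching the other playbook through the playbook_run REST endpoint (documented in the RESTPlaybook page linked earlier) so the run can be given scope 'all'. The 'local/child_playbook' name is a hypothetical placeholder, and the exact behaviour of the phantom.requests wrapper should be checked against your version's Session API docs:

import phantom.rules as phantom

def launch_playbook_with_scope_all(container):
    # Build an authenticated URL to /rest/playbook_run on this instance
    url = phantom.build_phantom_rest_url('playbook_run')

    body = {
        'container_id': container['id'],
        'playbook_id': 'local/child_playbook',   # hypothetical 'repo/playbook name'
        'scope': 'all',                          # run against all artifacts, not just new ones
        'run': True,
    }

    response = phantom.requests.post(url, json=body, verify=False)
    phantom.debug('playbook_run: {} {}'.format(response.status_code, response.text))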
