Splunk SOAR - extract data from artifacts

JJCO
Engager

Pretty green with SOAR and haven't been able to find a good answer to this.

All of our events in SOAR are generated by pulling them in from Splunk ES.  This creates one artifact for each event.  I'm looking for a way to extract data from that artifact so we can start using and labeling that data.

Am I missing something here?  I haven't found much in the way of training on the data extraction part of this, so any tips for that would be great too.

 

1 Solution

marnall
Motivator

It should be set up such that:

1. A search in Splunk Enterprise has fields you find interesting
2. This search is used in the "Splunk App for SOAR Export" to send data to SOAR
3. Each result in your Splunk search creates an artifact in SOAR, and the artifacts are grouped into SOAR containers based on the field configured as the grouping field in the "Splunk App for SOAR Export".
4. The artifacts will have CEF fields containing the data of the fields of your Splunk search.

Then you can run the playbooks in SOAR on your containers with the artifacts, and the playbooks can run actions using the CEF fields in your artifacts as inputs.
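
For example, a code block in a classic-style playbook could look something like the sketch below. This is only an illustration: the "geolocate ip" action, the "maxmind" asset, and the sourceAddress CEF field are placeholders, so substitute whatever your artifacts actually contain.

import phantom.rules as phantom

def geolocate_artifact_ips(action=None, success=None, container=None, results=None,
                           handle=None, filtered_artifacts=None, filtered_results=None, **kwargs):

    # One entry per artifact: (cef.sourceAddress, artifact id)
    artifact_data = phantom.collect2(
        container=container,
        datapath=["artifact:*.cef.sourceAddress", "artifact:*.id"],
    )

    # Build one action parameter set per artifact that actually has the field
    parameters = [
        {"ip": source_address, "context": {"artifact_id": artifact_id}}
        for source_address, artifact_id in artifact_data
        if source_address
    ]

    if parameters:
        phantom.act("geolocate ip", parameters=parameters, assets=["maxmind"],
                    name="geolocate_ip_1")

    return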

Can you confirm that you can view the artifact in SOAR and that it has CEF fields containing your data?


keypuncher
New Member

If I understand you right, you want to start working with the events ingested into the SOAR platform, and your playbooks would all start in much the same way, by retrieving each container's artifact data?

If so, I find myself relying on SOAR's custom code blocks more often than not to get the level of data I want.  From the first line of custom code you already have a number of prepopulated variables for accessing the 'raw' data from the prior node and the overall container/event data; try printing some of those parameters to see what you have to work with.
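
Something like this, assuming the standard playbook API (the function name and which variables you print are just illustrative):

import phantom.rules as phantom
import json

def inspect_inputs(action=None, success=None, container=None, results=None,
                   handle=None, filtered_artifacts=None, filtered_results=None, **kwargs):

    # 'container' is the full container/event dict; 'results' carries the prior node's output
    phantom.debug("Container: {}".format(json.dumps(container, indent=2, default=str)))
    phantom.debug("Prior node results: {}".format(results))

    return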

Despite those params, I generally rely on REST queries to obtain artifact data, much like the SOAR web UI itself does.  Install the HTTP app and create an asset that points to 127.0.0.1/rest, making sure its configuration includes a REST access token/header from a user.  Your playbooks can then call that HTTP app's action nodes to GET/PUT/POST whatever they need, specifically "https://..host../rest/artifact?_filter_container=#####".  The results include a 'cef' key with the verbatim artifact data, which you can consume directly, modify, or simply pass forward into later nodes.
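
Outside of the HTTP app, the same call is easy to make directly.  Here's a rough sketch against the REST API; the hostname, token value, and container ID are placeholders, and the 'ph-auth-token' header assumes you are authenticating with an automation user token.

import requests

SOAR_BASE = "https://soar.example.com"                   # placeholder host
HEADERS = {"ph-auth-token": "<automation user token>"}   # REST token from an automation user
CONTAINER_ID = 12345                                     # the container you care about

resp = requests.get(
    SOAR_BASE + "/rest/artifact",
    params={"_filter_container": CONTAINER_ID, "page_size": 0},  # page_size=0 returns all results
    headers=HEADERS,
    verify=False,  # common with self-signed certs on 127.0.0.1; tighten this in production
)
resp.raise_for_status()

for artifact in resp.json().get("data", []):
    cef = artifact.get("cef", {})  # the 'cef' key holds the artifact's field/value pairs
    print(artifact["id"], cef)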

Let me know if I'm way off base, but this is generally how I manipulate container artifact data, both inside and outside of individual playbooks.
