Splunk SOAR - extract data from artifacts

JJCO
Engager

Pretty green with SOAR and haven't been able to find a good answer to this.

All of our events in SOAR are generated by pulling them in from Splunk ES.  This creates one artifact for each event.  I'm looking for a way to extract data from that artifact so we can start using and labeling that data.

Am I missing something here?  I haven't found much in the way of training on the data extraction part of this, so any tips for that would be great too.

 

1 Solution

marnall
Motivator

It should be set up such that:

1. A search in Splunk Enterprise has fields you find interesting
2. This search is used in the "Splunk App for SOAR Export" to send data to SOAR
3. Each result in your Splunk search creates an artifact in SOAR, and the artifacts are grouped into a SOAR container based on the field configured as the grouping field in the "Splunk App for SOAR Export".
4. The artifacts will have CEF fields containing the values of the fields from your Splunk search.

Then you can run the playbooks in SOAR on your containers with the artifacts, and the playbooks can run actions using the CEF fields in your artifacts as inputs.
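For example, here is a minimal sketch of a playbook code block that pulls one CEF value out of a container's artifacts. The field name sourceAddress is just an assumption; swap in whatever fields your Splunk search actually produces.

```python
import phantom.rules as phantom

def gather_source_addresses(container):
    # collect2 walks every artifact in the container and returns one row
    # per match, e.g. [<cef value>, <artifact id>]
    rows = phantom.collect2(
        container=container,
        datapath=["artifact:*.cef.sourceAddress", "artifact:*.id"],
    )
    addresses = [row[0] for row in rows if row[0]]
    phantom.debug("source addresses found: {}".format(addresses))
    return addresses
```

The returned values can then be passed as inputs to downstream action or filter blocks.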

Can you confirm that you can view the artifact in SOAR and that it has CEF fields containing your data?


keypuncher
New Member

If I understand you right, you want to start working with the events ingested into SOAR, where your playbooks would each start by retrieving the container's artifact data?

If so, I find myself relying on SOAR's code nodes more often than not to get at the data I want.  Inside a custom code block you get a number of prepopulated variables for accessing the 'raw' data from the prior node and the overall container/event data; try printing some of those parameters at the top to see what's available.
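As a rough illustration, a code block that just dumps what it receives could look something like the sketch below. The parameter list mirrors the stock custom-code signature SOAR generates and may differ slightly between versions.

```python
import json
import phantom.rules as phantom

def inspect_inputs(action=None, success=None, container=None, results=None,
                   handle=None, filtered_artifacts=None, filtered_results=None,
                   custom_function=None, **kwargs):
    # container holds the event/container metadata; results holds the prior node's output
    phantom.debug("container: {}".format(json.dumps(container, indent=2, default=str)))
    phantom.debug("prior node results: {}".format(results))
    phantom.debug("filtered artifacts: {}".format(filtered_artifacts))
    return
```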

Despite those params, I generally rely on REST queries to obtain artifact data, like much of the client-side code itself.  Install the HTTP app, and create an asset that points to 127.0.0.1/rest.  Make sure one of your parameters includes a REST access token/header from some User.  Then, your PBs can call that HTTP app action node to GET/PUT/POST whatever, specifically "https://..host../rest/artifact?_filter_container=#####", whose results will include a 'cef' key with the verbatim artifact(s) data available for you to directly consume, modify, or simply pass forward into future nodes.
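Outside of the HTTP app, the same call can be sketched in plain Python. The host, token, and container id below are placeholders for your environment, and the token would come from an automation user's REST access token.

```python
import requests

SOAR_BASE = "https://soar.example.local"                 # placeholder host
HEADERS = {"ph-auth-token": "<automation user token>"}   # placeholder token
container_id = 12345                                     # placeholder container id

resp = requests.get(
    "{}/rest/artifact".format(SOAR_BASE),
    headers=HEADERS,
    params={"_filter_container": container_id, "page_size": 0},  # page_size=0 asks for all matches
    verify=False,  # self-signed certs are common on SOAR instances
)
resp.raise_for_status()

for artifact in resp.json().get("data", []):
    # each artifact carries its extracted fields under the 'cef' key
    print(artifact["id"], artifact.get("cef", {}))
```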

Let me know if I'm way off base, but this is generally how I manipulate container artifact data, both inside and outside of playbooks.

