Okay, this solution may not be ideal, but I've spent a lot of time getting it to work and so far it's okay. I haven't run enough tests yet to know whether it's 100% reliable; I suspect multiple findings landing in a very short time frame will break it, and I'll need to figure out something else.

I'm no longer relying on the webhook Adaptive Response action, as I wasn't able to get the information I needed about the finding included in the request, even with custom add-ons that give you more control over what goes into the JSON body. Instead, I moved to triggering Python code that interacts with Splunk first and then sends the information out to my automation platform.

The Python code runs an external query against Splunk for the key details I wanted from the most recent notable event whose title matches the finding types I want this task to run against. It passes that information along to my automation platform, together with the variables I need, most importantly the finding's unique identifier: source_event_id.

When the automation task completes, its results are sent back to Splunk via an HTTP POST request, which writes text into the finding's notes section using that source_event_id.

Along the way there were plenty of extra steps: generating authentication tokens within Splunk, authenticating in both directions between Splunk and my automation platform, creating self-signed certificates to install on both ends, and installing the proper SSH keys on a third-party machine used for PowerShell queries. It was quite a bit of work, and I'm not sure I could fully document it all without walking back through each section. Happy to provide specific details if any of this seems relevant to you and you need more information. Good luck!
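In case it helps, here's a rough sketch of the shape of my Python glue code. Everything here is a placeholder standing in for my real setup: the hostnames, the token, the automation URL, and the exact form-field names are assumptions you'd need to swap for your own environment (and check the endpoint details against your ES version's REST docs). It uses only the standard library, with TLS verification disabled only because my setup used self-signed certs.

```python
import json
import ssl
import urllib.parse
import urllib.request

# Placeholders -- substitute your own values. 8089 is Splunk's default
# management port; the automation URL is entirely hypothetical.
SPLUNK_HOST = "https://splunk.example.com:8089"
AUTOMATION_URL = "https://automation.example.com/api/run-task"
SPLUNK_TOKEN = "YOUR_SPLUNK_AUTH_TOKEN"


def build_notable_query(rule_title: str) -> str:
    """SPL that grabs the most recent notable event matching a rule
    title, keeping only the fields the automation task needs."""
    return (
        "search `notable` "
        f'| search rule_title="{rule_title}" '
        "| head 1 "
        "| table source_event_id rule_title src dest _time"
    )


def splunk_request(path: str, form: dict) -> bytes:
    """POST a form to the Splunk management API with token auth.
    Verification is disabled here because my setup used self-signed
    certs; install the cert and verify properly if you can."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    req = urllib.request.Request(
        SPLUNK_HOST + path,
        data=urllib.parse.urlencode(form).encode(),
        headers={"Authorization": f"Bearer {SPLUNK_TOKEN}"},
    )
    with urllib.request.urlopen(req, context=ctx) as resp:
        return resp.read()


def fetch_latest_notable(rule_title: str) -> dict:
    """Run an export search and return the first result row.
    The export endpoint streams newline-delimited JSON; rows that
    contain actual data carry a "result" key."""
    raw = splunk_request(
        "/services/search/jobs/export",
        {"search": build_notable_query(rule_title), "output_mode": "json"},
    )
    for line in raw.decode().splitlines():
        obj = json.loads(line)
        if "result" in obj:
            return obj["result"]
    return {}


def build_note_payload(source_event_id: str, note_text: str) -> dict:
    """Form fields for writing the automation results back into the
    finding's notes via ES's notable_update endpoint. Field names here
    are my best recollection -- verify against your ES REST docs."""
    return {
        "ruleUIDs": source_event_id,
        "comment": note_text,
    }


def write_note(source_event_id: str, note_text: str) -> bytes:
    """POST the automation results back to Splunk against the finding."""
    return splunk_request(
        "/services/notable_update",
        build_note_payload(source_event_id, note_text),
    )
```

The forwarding step to the automation platform is just another POST of the fetched fields (including source_event_id) to AUTOMATION_URL, so I've left it out; the important part is carrying source_event_id through the whole round trip so the write-back lands on the right finding.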