Splunk Enterprise Security

How do I push or pull from Enterprise Security notable events?

alpsholic
Explorer

I have a scenario that I can explain with an example. I am implementing a third-party service that takes action based on notable events in Splunk Enterprise Security.

For example, every time there is a new "Geographically Improbable Access Detected" notable event, I want to extract the user details and process them.

What is the best way to get notified by Splunk?

(1) Should I run a query for the relevant notable events remotely via the Splunk REST API at regular intervals?

(2) Is there a way Splunk can invoke my REST endpoint every time there is a new relevant event?

(3) Splunk alerts + webhook? (This way, I think I would get only the first matching event instead of all of them.)

Thanks a ton in advance

0 Karma
1 Solution

LukeMurphey
Champion

Here are some thoughts:

(1) Should I run a query for the relevant notable events remotely via the Splunk REST API at regular intervals?

You can do this by remotely running a search. I wrote a script showing how this can be done using the Python library that is built into Splunk: https://gist.github.com/LukeMurphey/cbd8a4093e2a9e922038117cd4eceb00. You ought to be able to reuse the get_notables() function to get what you need.

You can also use the Splunk Python SDK to do this.
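For illustration, here is a minimal polling sketch along those lines using the Splunk Python SDK (splunklib). The hostname, credentials, time window, and the search_name filter are placeholders you would adapt to your environment.

import splunklib.client as client
import splunklib.results as results

# Connect to the search head's management port (placeholder credentials).
service = client.connect(
    host="splunk.example.com",
    port=8089,
    username="admin",
    password="changeme",
)

# Poll for recent notable events. The search string and time window are
# assumptions -- keep the window narrow so repeated polls do not
# re-process the same events.
job = service.jobs.create(
    'search `notable` | search search_name="Geographically Improbable Access Detected"'
    ' | table _time, user, src, search_name',
    earliest_time="-15m",
    latest_time="now",
    exec_mode="blocking",  # wait for the search to finish
)

for result in results.ResultsReader(job.results(count=0)):
    if isinstance(result, dict):
        # Hand the user details off to your own processing here.
        print(result.get("user"), result.get("src"))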

(2) Is there a way Splunk can invoke my REST endpoint every time there is a new relevant event?

There are multiple ways you could do this. I think you could do it with a custom search command that runs in a scheduled search over the notable index (via the `notable` macro).

You can see an example of how to write a custom search command here: https://github.com/LukeMurphey/splunk-search-command-example

In the end, you would have a search that looks something like this:

`notable` | mysearchcommand

You would want to make sure that scheduled runs of the search do not overlap, so that the same results are not processed repeatedly.
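To make that concrete, here is a rough sketch of such a streaming command built on splunklib.searchcommands (the same library the example repo above uses). The command name, endpoint URL, and field names are illustrative assumptions, and splunklib would need to be bundled in the app's bin directory.

# bin/mysearchcommand.py -- hypothetical streaming command that forwards
# each notable event to an external REST endpoint.
import json
import sys
import urllib.request

from splunklib.searchcommands import dispatch, StreamingCommand, Configuration, Option

@Configuration()
class MySearchCommand(StreamingCommand):
    """Forward each incoming event to an external REST endpoint."""

    url = Option(require=True, doc="Endpoint to notify, e.g. https://myapp.com/newevent")

    def stream(self, records):
        for record in records:
            # Send only the fields the external service cares about
            # (field names here are assumptions).
            payload = json.dumps({
                "user": record.get("user"),
                "rule_name": record.get("search_name"),
            }).encode("utf-8")
            request = urllib.request.Request(
                self.url,
                data=payload,
                headers={"Content-Type": "application/json"},
            )
            urllib.request.urlopen(request, timeout=10)
            # Pass the record through unchanged so the search results stay intact.
            yield record

dispatch(MySearchCommand, sys.argv, sys.stdin, sys.stdout, __name__)

Once registered in commands.conf, it could be invoked from the scheduled search above as `notable` | mysearchcommand url=https://myapp.com/newevent.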

(3) Splunk alerts + webhook? (This way, I think I would get only the first matching event instead of all of them.)

If it were me, I would likely choose this route, since it avoids having to do any coding. I think you could get it to trigger for each result by changing the alert's trigger setting to "For each result".
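As a rough sketch (key names per savedsearches.conf, but verify against your Splunk version), a scheduled alert along these lines might look like the following; the stanza name, schedule, and URL are placeholders:

[Notify on Geographically Improbable Access]
search = `notable` | search search_name="Geographically Improbable Access Detected"
enableSched = 1
cron_schedule = */15 * * * *
dispatch.earliest_time = -15m
dispatch.latest_time = now
counttype = number of events
relation = greater than
quantity = 0
alert.digest_mode = 0
action.webhook = 1
action.webhook.param.url = https://myapp.com/newevent

With alert.digest_mode = 0 (the "For each result" trigger), I believe the webhook action would POST a JSON payload for each matching notable event rather than once per search run.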


alpsholic
Explorer

In (2), I did not get how Splunk can send the results to my REST endpoint by itself. Do you mean something like telling Splunk, "Send every new 'Geographically Improbable Access Detected' event to https://myapp.com/newevent"?

0 Karma

LukeMurphey
Champion

For number 2, I was thinking you would write a custom search command in Python, so it would not be possible without coding.

For this reason, I would try to get number three to work first.

0 Karma