How to insert ESCU detections via REST API into Splunk ESS?

koshyk
Super Champion

We have automation that inserts saved searches via the /saved/searches endpoint, and all is good. We also currently have quite a lot of custom Splunk Enterprise Security (ESS) event-based detections handcrafted via the GUI in Splunk Cloud (so we can't directly put them into savedsearches.conf).

We have to automate these as they are not pure 'savedsearches'. We are following the ESCU standards and use contentctl validate. All good up to this stage.

But how do we insert the ESCU detections into Splunk ESS? Which app should they be inserted into? (SplunkEnterpriseSecuritySuite, a DA-ESS-* type app, or can they go into our own custom app itself?)

Any API-based automation into Splunk ESS is deeply appreciated.

thanks in advance


koshyk
Super Champion

thanks @livehybrid. Upvoted.

I almost figured it out, but in a slightly different manner. I've got an Ansible setup for URL interaction and automation. The 'contentctl build' will produce an artefact similar to a Splunk app, with `savedsearches.conf` and other things like `analyticstories.conf`.

contentctl build --path content --app.title MY_DETECT --app.appid DA-ESS-MY_DETECT --app.prefix MY --app.label MY

Then I'm using the Ansible automation, which interacts with saved/searches and other endpoints, to insert it back.
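
For the non-savedsearches content (for example the analyticstories.conf stanzas), the generic configs/conf-<file> REST endpoints look like the way to go. A rough sketch of what I mean, only lightly tested; the app id, stanza name and settings below are placeholders matching the build command above:

import requests
from urllib.parse import quote

SPLUNK_HOST = 'https://yoursplunkhost:8089'
SPLUNK_TOKEN = 'your-token-here'
APP = 'DA-ESS-MY_DETECT'  # the app id produced by the contentctl build above

headers = {'Authorization': f'Splunk {SPLUNK_TOKEN}'}
base_url = f"{SPLUNK_HOST}/servicesNS/nobody/{APP}/configs/conf-analyticstories"

# Placeholder stanza/settings - lift the real ones from the built analyticstories.conf
stanza = 'analytic_story://My Example Story'
settings = {'description': 'Placeholder description', 'version': '1'}

stanza_url = f"{base_url}/{quote(stanza, safe='')}"
check = requests.get(stanza_url, headers=headers, verify=True)
if check.status_code == 200:
    # Stanza already exists - update its settings in place
    r = requests.post(stanza_url, data=settings, headers=headers, verify=True)
else:
    # Stanza missing - create it (the 'name' field becomes the stanza header)
    r = requests.post(base_url, data={'name': stanza, **settings}, headers=headers, verify=True)
r.raise_for_status()

The same pattern should work for any other conf file the build produces by swapping conf-analyticstories for conf-<filename>.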

A few things I'm still figuring out:

  • it is slow once there are 50+ savedsearches, as they are inserted one by one
  • contentctl new: this option doesn't accept ALL parameters (such as search and name), which means user input is still required
  • whether the automation can detect that a savedsearch has changed and only insert it then (see the sketch after this list)
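
On the change-detection point, this is roughly the shape of what I'm thinking: pull the existing entry as JSON, diff the settings, and only POST the ones that differ. The helper below is just a sketch (the comparison is naive, e.g. boolean fields come back as true/false from the API and may need normalising):

import requests
from urllib.parse import quote

def upsert_if_changed(base_url, name, desired, headers, verify=True):
    """Only POST a saved search when its settings differ from what Splunk already has.
    `desired` is a dict of savedsearches settings using the API field names."""
    url = f"{base_url}/{quote(name, safe='')}"
    resp = requests.get(url, params={'output_mode': 'json'}, headers=headers, verify=verify)

    if resp.status_code == 200:
        current = resp.json()['entry'][0]['content']
        # Keep only the settings that actually differ (compare as strings, since
        # the REST API stores everything stringified)
        changed = {k: v for k, v in desired.items() if str(current.get(k)) != str(v)}
        if not changed:
            print(f"Unchanged, skipping: {name}")
            return
        print(f"Updating {name}: {sorted(changed)}")
        requests.post(url, data=changed, headers=headers, verify=verify).raise_for_status()
    else:
        print(f"Creating: {name}")
        requests.post(base_url, data={'name': name, **desired}, headers=headers, verify=verify).raise_for_status()

Reusing a single requests.Session() across the loop should also claw back some of the per-search overhead for the 50+ search case.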

 

Update: I was able to insert into the system after contentctl using the REST API "saved/searches". Though the type is specified as 'ebd' (event-based detection), when it is inserted into Splunk it becomes a 'saved search' type!

any solutions/recommendations for this?


livehybrid
Super Champion

Hi @koshyk 

I did a PoC with a customer around this a few months ago, and we had great success using the regular saved/searches endpoint to create searches from a custom app created with contentctl.

We used a custom app which we uploaded the searches into; this way we follow the same pattern as creating a custom app from contentctl which would be uploaded.

ES rules are essentially scheduled searches with modular alerts; they just show in ES because of some of the settings on them. From memory these might be the action.correlationsearch.* and action.notable.param.* settings.
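
As a rough illustration only (the exact keys vary between ES versions, so verify against your environment rather than treating this as definitive), the extra settings posted alongside the search might look something like this:

# Illustrative settings that make a scheduled search appear as an ES
# correlation search and raise notables - verify the keys for your ES version.
es_settings = {
    'search': '| tstats count from datamodel=Authentication by Authentication.user',
    'cron_schedule': '*/30 * * * *',
    'is_scheduled': '1',
    'action.correlationsearch.enabled': '1',
    'action.correlationsearch.label': 'My Example Correlation Search',
    'action.notable': '1',
    'action.notable.param.rule_title': 'Example notable title',
    'action.notable.param.security_domain': 'access',
    'action.notable.param.severity': 'medium',
}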

I think the main gotcha that caught us out is that in savedsearches.conf we use "enableSched = 1" to schedule a search, however the API expects "is_scheduled = 1" (WHY?!?).

Check my response below to the following post which has a sample python code to upload a savedsearch if it helps - https://community.splunk.com/t5/All-Apps-and-Add-ons/Custom-app-in-cloud/m-p/745079


@livehybrid wrote:

Hi @hazardoom 

If you aren't wanting to go down the app route then another thing you could look at is using the REST API.

I've used this with clients in the past; here is an example to get you started if this is something you wanted to explore:

import configparser
import requests
from urllib.parse import quote

# ======= USER CONFIGURATION =========
SPLUNK_HOST = 'https://yoursplunkhost:8089'  # e.g., https://localhost:8089
SPLUNK_TOKEN = 'your-token-here'  # just the token string, no 'Splunk ' prefix
APP = 'search'  # target app for the saved searches
CONF_FILE = 'savedsearches.conf'  # path to your savedsearches.conf file
VERIFY_SSL = True  # set to False if using self-signed certs
USERNAME = 'yourUsername'  # API requires this as a path component
# ====================================

# Map conf fields to REST API fields
def convert_field_name(field, value):
    """Map .conf fields to API fields and perform value translations."""
    if field == "enableSched":
        return "is_scheduled", "1" if value.strip().lower() in ("1", "true", "yes", "on") else "0"
    return field, value

def load_savedsearches(conf_path):
    # interpolation=None so '%' characters in searches are not treated as substitutions
    cp = configparser.ConfigParser(strict=False, delimiters=['='], interpolation=None)
    cp.optionxform = str  # preserve key case (don't lowercase option names)
    cp.read(conf_path)
    return cp

def upload_savedsearches(cp):
    headers = {'Authorization': f'Splunk {SPLUNK_TOKEN}'}
    base_url = f"{SPLUNK_HOST}/servicesNS/{USERNAME}/{APP}/saved/searches"

    for savedsearch_name in cp.sections():
        data = {}
        for field, value in cp[savedsearch_name].items():
            api_field, api_value = convert_field_name(field, value)
            data[api_field] = api_value

        # safe='' so characters such as '/' in the search name are also encoded
        search_url = f"{base_url}/{quote(savedsearch_name, safe='')}"

        # Check if the saved search already exists (GET request)
        check = requests.get(search_url, headers=headers, verify=VERIFY_SSL)
        if check.status_code == 200:
            print(f"Updating existing savedsearch: {savedsearch_name}")
            # 'name' is only accepted on create, so it is not included in the update
            r = requests.post(search_url, data=data, headers=headers, verify=VERIFY_SSL)
        else:
            print(f"Creating new savedsearch: {savedsearch_name}")
            r = requests.post(base_url, data={'name': savedsearch_name, **data},
                              headers=headers, verify=VERIFY_SSL)

        if r.status_code not in (200, 201):
            print(f"Failed for {savedsearch_name}: {r.status_code} {r.text}")
        else:
            print(f"Success: {savedsearch_name}")

def main():
    cp = load_savedsearches(CONF_FILE)
    upload_savedsearches(cp)

if __name__ == "__main__":
    main()

We use this approach to upload files directly from Git pipelines, which is especially useful if you aren't an admin on the platform and so cannot upload an app - however it may also work well for your use case. Note: you could use the Splunk Python SDK too, which basically does the same thing.
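
If you do go the SDK route, a minimal sketch might look like the following (connection details, the saved search name and the SPL are placeholders, not anything specific to your environment):

import splunklib.client as client  # pip install splunk-sdk

# Placeholder connection details - adjust for your environment
service = client.connect(host='yoursplunkhost', port=8089,
                         username='yourUsername', password='yourPassword',
                         app='search')  # target app namespace

name = 'Example - Suspicious Process Detection'  # hypothetical saved search name
kwargs = {
    'search': 'index=main sourcetype=example | stats count',  # placeholder SPL
    'cron_schedule': '*/15 * * * *',
    'is_scheduled': '1',
}

try:
    saved = service.saved_searches[name]   # raises KeyError if it doesn't exist yet
    saved.update(**kwargs).refresh()       # update the existing saved search
    print(f"Updated: {name}")
except KeyError:
    service.saved_searches.create(name, kwargs.pop('search'), **kwargs)
    print(f"Created: {name}")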
