Hi @koshyk

I did a PoC with a customer around this a few months ago and we had great success using the regular saved/searches endpoint to create searches from a custom app created with contentctl. We uploaded the searches into a custom app, so we were following the same pattern as creating a custom app from contentctl, which would then be uploaded.

ES rules are essentially scheduled searches with modular alerts; they only show in ES because of some of the settings in them. From memory these might be the action.correlationsearch.* and action.notable.param.* settings (there is an illustrative stanza below).

I think the main gotcha that caught us out is that in savedsearches.conf you use "enableSched = 1" to schedule a search, whereas the API expects "is_scheduled = 1" (WHY?!?).

Check my response below to the following post, which has sample Python code to upload a saved search, if it helps: https://community.splunk.com/t5/All-Apps-and-Add-ons/Custom-app-in-cloud/m-p/745079
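For illustration, a savedsearches.conf stanza for an ES-style rule might look something like the one below. The action.* settings here are from memory and intended only as a sketch (names and values vary by ES version), so check a real correlation search in your environment before relying on them:

[My Custom Correlation Rule]
search = index=main sourcetype=example | stats count by user
cron_schedule = */5 * * * *
enableSched = 1
action.correlationsearch.enabled = 1
action.correlationsearch.label = My Custom Correlation Rule
action.notable = 1
action.notable.param.rule_title = Suspicious activity by $user$
action.notable.param.severity = medium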
@livehybrid wrote:

Hi @hazardoom

If you aren't wanting to go down the app route then another thing you could look at is using the REST API. I've used this with clients in the past; here is an example to get you started if this is something you wanted to explore:

import configparser
import requests
from urllib.parse import quote

# ======= USER CONFIGURATION =========
SPLUNK_HOST = 'https://yoursplunkhost:8089'  # e.g. https://localhost:8089
SPLUNK_TOKEN = 'your-token-here'             # just the token string, no 'Splunk ' prefix
APP = 'search'                               # target app for the saved searches
CONF_FILE = 'savedsearches.conf'             # path to your savedsearches.conf file
VERIFY_SSL = True                            # set to False if using self-signed certs
USERNAME = 'yourUsername'                    # API requires this as a path component
# ====================================

def convert_field_name(field, value):
    """Map .conf fields to API fields and perform value translations."""
    if field == "enableSched":
        return "is_scheduled", "1" if value.strip().lower() in ("1", "true", "yes", "on") else "0"
    return field, value

def load_savedsearches(conf_path):
    cp = configparser.ConfigParser(strict=False, delimiters=['='])
    cp.optionxform = str  # preserve the case of option names
    cp.read(conf_path)
    return cp

def upload_savedsearches(cp):
    headers = {'Authorization': f'Splunk {SPLUNK_TOKEN}'}
    base_url = f"{SPLUNK_HOST}/servicesNS/{USERNAME}/{APP}/saved/searches"
    for savedsearch_name in cp.sections():
        data = {}
        for field, value in cp[savedsearch_name].items():
            api_field, api_value = convert_field_name(field, value)
            data[api_field] = api_value
        # safe='' also percent-encodes any '/' in the search name
        search_url = f"{base_url}/{quote(savedsearch_name, safe='')}"
        # Check if the saved search already exists (GET request)
        check = requests.get(search_url, headers=headers, verify=VERIFY_SSL)
        if check.status_code == 200:
            # Updating an existing entity: do not send 'name' again
            print(f"Updating existing savedsearch: {savedsearch_name}")
            r = requests.post(search_url, data=data, headers=headers, verify=VERIFY_SSL)
        else:
            # Creating a new entity: 'name' is required on the collection URL
            print(f"Creating new savedsearch: {savedsearch_name}")
            r = requests.post(base_url, data={'name': savedsearch_name, **data}, headers=headers, verify=VERIFY_SSL)
        if r.status_code not in (200, 201):
            print(f"Failed for {savedsearch_name}: {r.status_code} {r.text}")
        else:
            print(f"Success: {savedsearch_name}")

def main():
    cp = load_savedsearches(CONF_FILE)
    upload_savedsearches(cp)

if __name__ == "__main__":
    main()
We use this approach to upload files directly from Git pipelines, which is especially useful if you aren't an admin on the platform and so cannot upload an app, but it may also work well for your use case.

Note: you could use the Splunk Python SDK too, which basically does the same thing (there is a rough sketch at the end of this post).

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing
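For completeness, here is the Splunk Python SDK sketch mentioned above. It is a minimal, untested outline: it assumes splunklib is installed (pip install splunk-sdk), uses placeholder connection details and a hypothetical search name, and passes the same REST fields (note is_scheduled again) as keyword arguments, since the SDK's saved_searches collection talks to the same saved/searches endpoint:

import splunklib.client as client

# Placeholder connection details; adjust for your environment
service = client.connect(
    host="yoursplunkhost",
    port=8089,
    username="yourUsername",
    password="yourPassword",
    app="search",
)

name = "My Custom Correlation Rule"  # hypothetical saved search name
spl = "index=main sourcetype=example | stats count by user"

if name in service.saved_searches:
    # Update posts to the entity URL, so no 'name' argument here
    service.saved_searches[name].update(
        search=spl,
        is_scheduled="1",
        cron_schedule="*/5 * * * *",
    )
else:
    # create() takes name and search, plus any other REST fields as kwargs
    service.saved_searches.create(
        name,
        spl,
        is_scheduled="1",
        cron_schedule="*/5 * * * *",
    )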