Getting Data In

How to POST Saved Search XML/JSON Definition to REST API


I can GET the definition of a saved search (report) from our dev server with a call like

curl -k -u me:word https://splunk-for-dev:8089/servicesNS/me/my-app/saved/searches/my-report

How do I use the resulting XML/JSON to POST to our prod server? The closest that I've found is something like

curl -k -u me:word https://splunk-for-prod:8089/servicesNS/me/my-app/saved/searches \
    -d name=my-report -d search=...

But that means going through the XML/JSON, working out which values are non-default, and doing a whole lot of text munging. Surely there is a way that I can just POST the XML/JSON that I've already got?
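For reference, the GET half of this is straightforward in Python: asking the endpoint for `output_mode=json` returns the definition as JSON, which is easier to work with programmatically than the default Atom XML. A minimal sketch — the `saved_search_url`/`get_saved_search` helper names and the hostnames are illustrative, not part of any Splunk SDK:

```python
import requests

def saved_search_url(host, owner, app, name):
    # Build the servicesNS endpoint for one saved search (report/alert).
    return f"https://{host}:8089/servicesNS/{owner}/{app}/saved/searches/{name}"

def get_saved_search(host, owner, app, name, auth):
    # output_mode=json makes Splunk return JSON instead of Atom XML.
    # verify=False mirrors curl's -k flag for self-signed certs.
    r = requests.get(saved_search_url(host, owner, app, name),
                     params={"output_mode": "json"},
                     auth=auth, verify=False)
    r.raise_for_status()
    # The definition itself lives under entry[0].content.
    return r.json()["entry"][0]["content"]

# Example (placeholder credentials):
# content = get_saved_search("splunk-for-dev", "me", "my-app", "my-report",
#                            auth=("me", "word"))
```

The catch, as the answer below discusses, is that `content` also includes read-only fields that the POST endpoint will not accept, so the payload has to be filtered before re-posting.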


Some time back I did this with a Python script; I'm posting a sample below. Please note that it's fairly primitive code: exception handling is minimal and it isn't modularised. Let me briefly explain what it does:

  1. I have an alert on my local instance called "demo".
  2. First I get the alert's JSON payload using the Python requests module (I was able to convert the cURL GET request to Python here).
  3. The drama started when POSTing it back to Splunk. I could not convert the cURL POST request to Python, and sending the data as a dictionary didn't work either, so I call cURL directly from Python. The second challenge was the payload: the content payload alone is so large that cURL can't handle it if passed in full. So I decided to send only the fields a user can change from the UI — that's where key_list comes into the picture. The list is not exhaustive.
  4. The code pushes the "demo" alert payload to a new alert called "Sid", which is why the name is changed (hardcoded). For the POST request you would give the new server's URL.
  5. I haven't handled the scenario where the alert should be updated if it already exists and created otherwise. I think that should be easy to implement.

    import requests as req
    import json
    import shlex
    import subprocess

    def get_alert_dtl_frm_splunk(requestURL, parameters, auth):
        # verify=False mirrors curl -k for self-signed certs
        response = req.get(requestURL, params=parameters, auth=auth, verify=False)
        if response.status_code != 200:
            print('Status: ', response.status_code, 'Headers: ', response.headers, 'Error Response: ', response.json())
        return json.dumps(response.json())

    def main():
        # preparing the data for the GET request
        requestURL = 'https://localhost:8089/servicesNS/admin/tmdb/saved/searches/demo'
        params = (('output_mode', 'json'),)
        auth = ('admin', 'monitor!')
        # get the alert JSON from one Splunk instance
        data = get_alert_dtl_frm_splunk(requestURL, params, auth)
        data = json.loads(data)
        alert_content_json = data["entry"][0]["content"]
        # post to Splunk under a new name
        alert_content_json["name"] = "Sid"
        cmd = 'curl -k -u admin:monitor! https://localhost:8089/servicesNS/admin/tmdb/saved/searches'
        cmd = cmd + " --data-urlencode name=" + alert_content_json["name"]  # the first argument of curl has to be name
        # restrict to UI-editable keys, as the cURL command cannot handle the full payload
        key_list = ["alert.severity", "alert.suppress", "alert.track", "alert_type", "cron_schedule", "is_scheduled", "alert_threshold", "alert_comparator", "search"]
        for key in alert_content_json.keys():
            if key in key_list:
                value = str(alert_content_json[key])
                value = value.replace("\"", "\"\"")
                value = value.replace("\\n", "\\")
                cmd = cmd + " --data-urlencode " + key + '="' + value + "\""
        args = shlex.split(cmd)
        process = subprocess.Popen(args, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
        stdout, stderr = process.communicate()

    if __name__ == '__main__':
        main()
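For comparison, the same copy can usually be done with requests.post directly, without shelling out to cURL, once the payload is filtered down to the writable fields — read-only fields in `entry[0].content` are what the POST endpoint rejects. This is a sketch under that assumption; the `writable_payload`/`copy_saved_search` helper names are illustrative and the key list is the same non-exhaustive one from the script above:

```python
import requests

# UI-editable fields, as in the script above; not exhaustive.
WRITABLE_KEYS = ["alert.severity", "alert.suppress", "alert.track",
                 "alert_type", "cron_schedule", "is_scheduled",
                 "alert_threshold", "alert_comparator", "search"]

def writable_payload(content, new_name, keys=WRITABLE_KEYS):
    # Keep only the fields a user can edit in the UI, plus the new name;
    # read-only fields in the GET response cause the POST to fail.
    payload = {k: content[k] for k in keys if k in content}
    payload["name"] = new_name
    return payload

def copy_saved_search(content, new_name, dest_url, auth):
    # POSTing to .../saved/searches (no name in the path) creates the object.
    # requests form-encodes the dict, equivalent to curl --data-urlencode.
    r = requests.post(dest_url, data=writable_payload(content, new_name),
                      auth=auth, verify=False)
    r.raise_for_status()
    return r.status_code

# Example (placeholder URL and credentials):
# copy_saved_search(alert_content_json, "Sid",
#                   "https://localhost:8089/servicesNS/admin/tmdb/saved/searches",
#                   auth=("admin", "monitor!"))
```

Updating an existing alert instead of creating one would mean POSTing to the named endpoint (`.../saved/searches/<name>`) without the `name` field in the body.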


@Rikh Did you get this problem solved? If yes, can you please post the solution here?


Do you know how to write an API call to get the result in the UI?
