Getting Data In

How to return job errors when searching via API?

nhaynie_tmo
Engager

When I call the Splunk API via the Python SDK, I get results fine. However, when I run the same query via the UI, errors sometimes show up in the job inspector, for example a warning that results may be partial because of an issue with an indexer. When querying the API, I would like to get the search results AND any errors related to that job, just as I do in the UI. Does anyone know if this is possible, and does anyone have suggestions on how to retrieve them? Without the errors, we cannot fully trust the results, since the job may have had problems we are not aware of.

Here is the code I am using to pull results:

import json
import sys
from time import sleep

import splunklib.client as client
import splunklib.results as results

def run_export(splunk_host, splunk_port, splunk_username, splunk_password, query, job_name, earliest_time, latest_time, directory, chunk):
    # Initialize service
    service = client.connect(host=splunk_host, port=splunk_port, username=splunk_username, password=splunk_password)

    kwargs_params = {"exec_mode": "normal",
                     "earliest_time": earliest_time,
                     "latest_time": latest_time,
                     "count": 0}
    job = service.jobs.create(query, **kwargs_params)

    # A normal search returns the job's SID right away, so we need to poll for completion
    while True:
        while not job.is_ready():
            sleep(0.5)  # each is_ready() call hits the server, so don't busy-wait
        stats = {"isDone": job["isDone"],
                 "doneProgress": float(job["doneProgress"]) * 100,
                 "scanCount": int(job["scanCount"]),
                 "eventCount": int(job["eventCount"]),
                 "resultCount": int(job["resultCount"])}

        status = ("\r%(doneProgress)03.1f%%   %(scanCount)d scanned   "
                  "%(eventCount)d matched   %(resultCount)d results") % stats

        sys.stdout.write(status + "\n")
        sys.stdout.flush()
        if stats["isDone"] == "1":
            sys.stdout.write("\n\nDone!\n\n")
            break
        sleep(2)

    # Get the results and write each one to its own JSON file
    i = 0
    for result in results.ResultsReader(job.results(count=0)):
        if isinstance(result, results.Message):
            # ResultsReader also yields server messages; json.dumps would
            # fail on them, so print them separately
            sys.stdout.write("%s: %s\n" % (result.type, result.message))
            continue
        i += 1
        print(json.dumps(result))
        # Use a context manager so each file handle is closed after writing
        with open(directory + '/' + job_name + '/' + chunk + '_' + str(i) + '_' + job_name + '.json', 'w') as f:
            f.write(json.dumps(result))

    job.cancel()
    sys.stdout.write('\n')
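
Note that `results.ResultsReader` yields `splunklib.results.Message` objects interleaved with the result dicts; these carry server-side notices (DEBUG/INFO/WARN/ERROR) for the job and are one place partial-results warnings can surface. A minimal sketch of separating the two, with a hypothetical `split_results_and_messages` helper (duck-typed on the dict check so it doesn't need splunklib itself):

```python
def split_results_and_messages(items):
    """Separate result rows (dicts) from server messages.

    splunklib's ResultsReader yields plain dicts (OrderedDicts) for
    results and Message objects (with .type and .message attributes)
    for server notices, so an isinstance check on dict is enough.
    """
    rows, messages = [], []
    for item in items:
        if isinstance(item, dict):
            rows.append(item)
        else:
            messages.append((item.type, item.message))
    return rows, messages
```

Any `("ERROR", ...)` or `("WARN", ...)` pairs returned here are a signal that the corresponding results may be incomplete.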

pmeyerson
Path Finder

I think you are asking for this:

http://docs.splunk.com/Documentation/Splunk/latest/RESTREF/RESTsearch#GET_search.2Fjobs.2F.7Bsearch_...

see: search/jobs/{search_id}/search.log
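
Building on that endpoint: the SDK's generic `Service.get` can fetch search.log for a job without hand-rolling the REST call. A sketch, assuming `service` and `job` are the authenticated `splunklib.client.Service` and completed `Job` from the question; `fetch_search_log` and `error_lines` are hypothetical helpers, and the exact log-line format may vary by Splunk version:

```python
def error_lines(log_text):
    """Keep only ERROR and WARN lines from search.log text."""
    return [line for line in log_text.splitlines()
            if " ERROR " in line or " WARN " in line]

def fetch_search_log(service, sid):
    """GET search/jobs/{sid}/search.log and return it as text."""
    response = service.get("search/jobs/%s/search.log" % sid)
    return response.body.read().decode("utf-8", errors="replace")

# Usage against a live instance (do this before job.cancel()):
# log_text = fetch_search_log(service, job.sid)
# for line in error_lines(log_text):
#     print(line)
```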
