All Apps and Add-ons

How do I capture dbquery errors in python sdk

yelkey
Explorer

Hi,

I am dispatching my saved search (which uses the dbquery and dboutput commands) via the REST API in a Python script, as shown below, and extracting status properties such as count, isFailed, results, isDone, etc.

  • Python Script

import time

# mysavedsearch is a splunklib.client.SavedSearch object
kwargs_params = {"key1": "value1", "key2": "value2"}
job = mysavedsearch.dispatch(**kwargs_params)

# Poll the dispatched job until it completes
while True:
    job.refresh()
    stats = {"isDone": job["isDone"],
             "isFailed": int(job["isFailed"]),
             "doneProgress": float(job["doneProgress"]) * 100,
             "scanCount": int(job["scanCount"]),
             "eventCount": int(job["eventCount"]),
             "resultCount": int(job["resultCount"])}
    status = ("\r%(doneProgress)03.1f%% %(scanCount)d scanned "
              "%(eventCount)d matched %(resultCount)d results "
              "%(isFailed)d failed") % stats
    if stats["isDone"] == "1":
        break
    time.sleep(2)

  • Actual query

index=ABC | mysearch | table a, b, c, d, e
|dboutput database=XXX type=sql "INSERT INTO xyz
(v,w,x,y,z)
VALUES
($$a$$, $$b$$, $$c$$, $$d$$, $$e$$)"

But I am not able to capture the errors raised by the DB statements. For example, I cannot capture a dboutput error such as "unique constraint violated". When I look at the job log under "Inspect Job" in the Splunk GUI, the error is part of the raw log, not the "search job properties" returned by the REST API endpoints. Is there a way to capture such errors?
Appreciate all the help! Thanks!

1 Solution

yelkey
Explorer

Hi,

Found a fix. In addition to the stats mentioned above (isFailed, isDone, resultCount, etc.), I was able to retrieve another key, "messages":
stats = {"messages": job["messages"]}
which returns the error messages.
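
Building on that, a minimal self-contained sketch of filtering error-severity entries out of that property. The "messages" search job property typically maps a severity level ("error", "warn", "info") to a list of message strings; the sample dict below is a hypothetical stand-in for what job["messages"] might return after a failed dboutput, not output captured from a real job:

```python
def extract_errors(messages):
    """Return only the error-severity entries from a job's "messages" property."""
    return messages.get("error", [])

# Hypothetical stand-in for job["messages"] after a failed dboutput:
sample = {"error": ["dboutput: unique constraint violated"],
          "info": ["Search auto-finalized"]}

print(extract_errors(sample))  # ['dboutput: unique constraint violated']
```

With a live job, you would pass job["messages"] (after job.refresh()) instead of the sample dict.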

Thanks all!

