Dashboards & Visualizations

Why does loadjob savedsearch return artifacts from a failed job?

bhooker_axcient
Engager

I have a dashboard which loads the results of a saved search to speed up the load times.

The saved search is scheduled to run frequently, and keeps results from the past 7 or 8 runs in its history.

Often, the dashboard gets 0 results back from the loadjob command, and when I check the latest jobs, the most recent job's status is Failed. To fix it, I manually delete the failed job and let it roll back to the previous Done job.
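Before deleting anything, it helps to confirm which artifacts are actually in a failed state. A minimal sketch using the search jobs REST endpoint (the saved search name "my_job_name" is a placeholder; adjust the filter to your environment):

| rest /services/search/jobs
| search label="my_job_name" dispatchState="FAILED"
| table sid label dispatchState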

The documentation for loadjob has an ignore_running option, but I don't see a way to ignore failed jobs, which would be nice.
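For reference, ignore_running is used like this (the user:app:search-name value is a placeholder); it skips artifacts of runs that are still in progress, but there is no equivalent switch for failed artifacts:

| loadjob savedsearch="admin:search:my_job_name" ignore_running=true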

As an aside, I have noticed that I can make the saved search fail by repeatedly calling loadjob from a search (hitting search 7 or 8 times without letting it finish) while the job is running. I suspect something similar is happening here: dashboard loads while the job is running are causing it to fail.


bhooker_axcient
Engager

On the dashboard, there were many calls to loadjob with the savedsearch parameter. I found that if a user loaded the page while the saved search had a job in progress, the job would fail.

I corrected the problem by making the call only once in the dashboard:

<search id="cached_results">
  <!-- Load the most recent artifact of the saved search and capture its SID in a token -->
  <query>| loadjob savedsearch="my_job_name"</query>
  <done>
    <set token="tok_cached_result">$job.sid$</set>
  </done>
</search>

Then I can load those results each time I need to reference them:

<search>
  <query>| loadjob "$tok_cached_result$"</query>
  ...
</search>
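For completeness, a minimal panel built around that token might look like the following sketch (the stats command and the host field are placeholders; the depends attribute just keeps the panel from running before the token is set):

<panel depends="$tok_cached_result$">
  <table>
    <search>
      <query>| loadjob "$tok_cached_result$" | stats count by host</query>
    </search>
  </table>
</panel>

A post-process search (<search base="cached_results">) would be another way to fan the same results out to multiple panels without extra loadjob calls.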

I hope this helps someone!
