Reporting

Reports say first scheduled run not completed, but it has run many times on schedule

decoherence
Explorer

I have a scheduled report that sends an email which includes

  • a link to the report
  • a link to the results, and
  • the CSV

The link to the results and the attached CSV are fine and have the expected data. However, when a user clicks the link to the report, the report says "There are no results because the first scheduled run of the report has not completed."

Why might this be? Is it related to the job lifespan? This report runs monthly and I'd like the results to be available until the next run.

Thank you for any insight. I'm using 7.2.0.

decoherence
Explorer

I seem to have found a solution. From https://docs.splunk.com/Documentation/Splunk/latest/Admin/Savedsearchesconf:

  • If an action is triggered, the ttl is changed to the ttl for the action. If multiple actions are triggered, the action with the largest ttl is applied to the artifacts. To set the ttl for an action, refer to the alert_actions.conf.spec file.
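For reference, here is roughly what that looks like in alert_actions.conf; a minimal sketch of the email action's stanza (the value is illustrative, and the default on a given install may differ):

  [email]
  # Time to live, in seconds, for search artifacts when the email action
  # fires. A trailing "p" means scheduled periods rather than seconds,
  # e.g. ttl = 2p keeps the artifacts for two run periods.
  ttl = 86400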

As I mentioned, these reports have a 'Send email' action. This led me to go to Settings > Searches, reports, and alerts > find the report in the list > Edit > Advanced edit, where I found the ttl for the 'Send email' action (action.email.ttl) and adjusted it to what I want. Now when the report runs, the search's expiry is what it should be.
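For anyone who prefers editing configuration files directly, the Advanced edit change corresponds to a per-report override in savedsearches.conf. A minimal sketch, assuming a report named 'My Monthly Report' (the stanza name and value are placeholders):

  [My Monthly Report]
  # Override the email action's ttl for this report only.
  # 1p = one scheduled period, so on a monthly schedule the artifacts
  # should last until roughly the next run; a plain integer is seconds.
  action.email.ttl = 1p

If the 'p' suffix works here the way alert_actions.conf documents it, 1p should keep the results available until the next monthly run.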

I'm surprised I didn't find someone else doing this, as it must be a common issue. It would be great if someone could confirm for me whether this is standard operating procedure for Splunk 7.2.

lespinosas
Explorer

Hi!

Did you ever experience this message from clicking the "View results in Splunk" link included in the email?

[screenshot of the message]

I tried editing dispatch.ttl to make the search job's lifetime a bit longer, but it did not work.

I was wondering whether action.email.ttl is the setting that addresses this issue.
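Here is my current understanding of how the two settings relate, based on the documentation quoted above; a sketch with placeholder stanza name and values (untested):

  [Your Scheduled Report]
  # Lifetime of the job when no alert action fires:
  # 2p = two scheduled periods (the documented default).
  dispatch.ttl = 2p
  # If the email action fires, its ttl takes precedence over
  # dispatch.ttl, so editing dispatch.ttl alone may have no effect.
  action.email.ttl = 2764800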

Regards 🙂

decoherence
Explorer

https://docs.splunk.com/Documentation/Splunk/7.2.0/Search/Extendjoblifetimes

On this page it says:

Default lifetimes for scheduled searches

Scheduled searches launch search jobs on a regular interval. By default, these jobs are retained for the interval of the scheduled search multiplied by two. For example, if the search runs every 6 hours, the resulting jobs expire in 12 hours.

This seems to imply there is something strange going on with my setup. I have a weekly scheduled report that completed on Jan 13, 2020 at 3:00:14 AM, but the expiry for the job is Jan 15, 2020 at 9:16:00 AM. That's roughly 54 hours. Where did THAT value come from? Should it not be two weeks, according to the above?
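In case it helps anyone digging into the same question, btool can show which configuration file supplies each effective setting (the report name below is a placeholder):

  # List ttl-related settings for the report and for the email action,
  # with the source file of each value (--debug)
  splunk btool savedsearches list "My Weekly Report" --debug | grep -i ttl
  splunk btool alert_actions list email --debug | grep -i ttl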
