Alerting

Dynamic absolute time for alert search results

jacobjstewart
New Member

Background
We're currently running a scheduled alert (pushing to Slack) with a simple search query looking for "response=400", running every 5 minutes via cron, with:
- "Earliest" set to -5m
- "Latest" set to Now
In the alert body sent to Slack, we include the results link (using the $results_link$ token) so we can open Splunk and dive in.
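For reference, the relevant parts of the savedsearches.conf stanza behind this alert look roughly like the sketch below (the stanza name, index and search terms are placeholders, and the Slack action settings are omitted since they depend on the Slack add-on in use):

[400 Response Alert]
# runs every 5 minutes over the last 5 minutes
cron_schedule = */5 * * * *
dispatch.earliest_time = -5m
dispatch.latest_time = now
search = index=web response=400
# trigger when number of results > 0
counttype = number of events
relation = greater than
quantity = 0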

Problem
If we open the alert by clicking the results link more than 5 minutes after it was sent, there are no results displayed in Splunk (assuming no errors in the last 5 minutes). So as a workaround, we're currently adding "earliest=-24h" to the Splunk query in the browser / search bar.

How can we preserve the date/time at which the search ran, so that the results are displayed in Splunk regardless of when the user clicks the results link? Is there a way to pass the date/time through the results URL, or some other mechanism/configuration for date/time settings in the search query? I looked at time modifiers and didn't find anything suited to this use case.

We have some issues with Real-Time alerting that we need to solve (internally), so ideally we can stay away from using that for the time being.

Thank you.

elliotproebstel
Champion

You can add the addinfo command to your search to get the time span over which the search was run. It'll look like this:

your current search
| addinfo

The fields you'll pass to the alert are info_min_time (the "earliest" time in the alert search) and info_max_time (the "latest" time in the alert search). These will be epoch times, so you can pass them around without worrying about relative time modifiers that are no longer relevant by the time you review them. Here's some guidance about addinfo:
https://docs.splunk.com/Documentation/Splunk/7.0.3/SearchReference/Addinfo
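For example, if you also want the window to be human-readable, you could do something like this (the search_earliest/search_latest field names are just illustrative):

your current search
| addinfo
| eval search_earliest=strftime(info_min_time, "%Y-%m-%d %H:%M:%S")
| eval search_latest=strftime(info_max_time, "%Y-%m-%d %H:%M:%S")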

jacobjstewart
New Member

Thank you @elliotproebstel and @damien_chillet. I'm a beginner with Splunk, so please bear with me. The URL currently returned does not have the earliest or latest stated (e.g. http://splunkus.instance/app/search/@go?sid=scheduler__username__search__RMD5465ce29d801b0ccd_at_152...)

@elliotproebstel
OK, so no problem appending | addinfo to the search. I'm trying to piece together the second part: passing info_min_time and info_max_time to the alert. Do I append those variables to $results_link$, so something like $results_link.info_min_time.info_max_time$?

elliotproebstel
Champion

As I posted in a comment above, it seems like the correct solution is probably modifying the TTL for the alert. But even then, you'll be specifying some period of time during which the search artifacts will continue to live, and after that, they will be gone.

If you want a bit of a workaround that's not super clean but will give your alert an indefinite lifetime, you could pass the tokens $result.info_min_time$ and $result.info_max_time$ into the alert body. Those will come through as epoch strings. As @damien_chillet mentioned above, this also means your search results will all have four fields they didn't have before we started all this: info_min_time, info_max_time, info_sid (the search ID assigned by Splunk), and info_search_time (the time, in epoch value, at which the search was run).

To use the epoch strings to re-run the search, you can either add them directly into the SPL (earliest=1524052800 latest=1524140780, for example) or paste them into the "Advanced" section of the time picker dropdown.
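For illustration, the Slack alert message body could look something like this sketch (only $results_link$, $result.info_min_time$ and $result.info_max_time$ are tokens; the surrounding wording is just an example):

400 errors detected.
Results: $results_link$
Search window (epoch): $result.info_min_time$ to $result.info_max_time$

Then, to re-run the same window later, you'd drop those epoch values straight into the SPL:

your current search earliest=1524052800 latest=1524140780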

damien_chillet
Builder

Hi Jacob,

The link should not have earliest or latest in it; ideally it should refer to the existing job, something like:

https://splunk.instance/app/an_app/@go?sid=<job_id>

damien_chillet
Builder

Wait, is it a real-time search you are talking about?

jacobjstewart
New Member

@damien_chillet, no, not a real-time search. The alert is scheduled to run every 5 mins.

damien_chillet
Builder

Is the alert condition "Number of results > 0"?
If so, the job results should be saved, and clicking the link should display them without having to run the job again.

jacobjstewart
New Member

Yes, the alert condition is set to "Number of results > 0". The behaviour I'm seeing is: if the link is clicked just after the alert fires (within 1 min, say), the results are displayed in Splunk. However, if the link is clicked >= 5 minutes after the alert fired (assuming no error has been found since), then the results returned in Splunk are empty, because the relative search only covers the last 5 minutes.

damien_chillet
Builder

Let's try with an example.
The search runs at 13:45 for events between 13:40 and 13:45.
It finds 1 error event so an alert link is sent to your slack channel.

Whatever the time is when you click the link, it should load the job that ran at 13:45 for events between 13:40 and 13:45.

The time range would change only if you re-run the job manually.

jacobjstewart
New Member

Valid example, but it seems I only see the results in Splunk if the link is opened just after 13:45, so I must be missing something if the time when you click the link shouldn't matter. Any other thoughts? Perhaps there's some back-end configuration overriding the $results_link$ parameters to re-run the job? Based on what @elliotproebstel was saying, I need to pass 'info_min_time' and 'info_max_time' into the alert.

damien_chillet
Builder

Could you share the alert configuration settings such as expiration time?

damien_chillet
Builder

I could be missing something, but I can't see how addinfo would help here; it would just add fields to your existing search, not change the time range the alerts run on.

elliotproebstel
Champion

My thinking in using addinfo was based on experience with drilldowns where I was able to directly pass the info_min_time and info_max_time through a drilldown to dynamically create a new search running in the same time window. It was meant as a workaround to allow the user to re-run the same search at an arbitrary time in the future, as requested.

That said, I honestly can't get that approach to the finish line. I can pass those time values through an alert but can't seem to assemble a full search URL with the tokens available to an alert.

This post from 2016 seems to suggest the solution will lie in editing the TTL for the alert:
https://answers.splunk.com/answers/440040/saving-alert-artifacts-for-longer-periods-of-time.html

Unfortunately, at this moment, I can't seem to get any pages from Splunk Docs to load, so I can't provide any updated official guidance.

jacobjstewart
New Member

@damien_chillet, @elliotproebstel

So in the default alert_actions.conf, the email action has a default TTL of 86400, and there IS an alert_actions.conf in local, but without a TTL set. Then, in the default savedsearches.conf, there's a dispatch.ttl of "2p".

* If the integer is followed by the letter 'p' Splunk interprets the ttl as a multiple of the scheduled search's execution period (e.g. if the search is scheduled to run hourly and ttl is set to 2p the ttl of the artifacts will be set to 2 hours).

Considering the above 2p, if the alert is scheduled for every 5 min then maybe the TTL is 10 min?
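If overriding the TTL is the way to go, I'm assuming it would look something like this in local savedsearches.conf (stanza name and value are just examples; dispatch.ttl takes a number of seconds, or a number followed by 'p' for multiples of the schedule period):

[400 Response Alert]
# keep search artifacts for 24 hours instead of 2 schedule periods
dispatch.ttl = 86400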

elliotproebstel
Champion

The saved results will expire quickly, since the alert only looks back over 5 mins. I think the expected lifetime of a search artifact like that is twice the interval at which the search is scheduled, so about 10 minutes here.

damien_chillet
Builder

Yeah, it is like that for jobs. But in the case of a triggered alert it should be kept longer; I think the expiration time is 24h by default for a triggered alert.
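If you need triggered alerts (and the linked job artifacts) to stay around longer than that, I believe the setting to look at is alert.expires in savedsearches.conf, along these lines (stanza name and value are just examples):

[400 Response Alert]
# keep triggered alert records for 7 days instead of the 24h default
alert.expires = 7d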
