All Apps and Add-ons

MLTK - Forecast Time Series: Dynamic Holdback Value?

jomulzer
Engager

Hi all,

I was wondering whether it is possible to set the holdback used for forecasting to a dynamic value (e.g. 1/10 of the events)?
Since the number of events used for prediction increases over time, I would like a relative and therefore automatically adapting holdback value instead of manually changing the absolute number of withheld events every once in a while.

Any tips or recommendations on whether and how this could be accomplished would be highly appreciated! 🙂

1 Solution

jomulzer
Engager

Did it! Thanks to @Sukisen1981, and with a little perseverance I found a way to manage it!
If anyone else is facing the same problem, the relevant parts of my dashboard XML are below:

To set the tokens (for some reason, the future timespan includes the holdback, so it has to be dynamic as well):

<search>
  <query>index=... | stats count as x | eval x=round(x*0.25,0) | eval y=x+28</query>
  <earliest>1</earliest>
  <latest>now</latest>
  <done>
    <set token="holdback_tok">$result.x$</set>
    <set token="future_tok">$result.y$</set>
  </done>
</search>
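
Side note: since predict's holdback refers to the number of trailing data points (here: daily timechart buckets) rather than raw events, a variant that counts the buckets and uses the 1/10 ratio from the original question could look roughly like the sketch below. The 28-day forecast horizon is just an example value:

<search>
  <query>index=... | timechart span=1d count | stats count as x | eval x=round(x/10,0) | eval y=x+28</query>
  <earliest>1</earliest>
  <latest>now</latest>
  <done>
    <set token="holdback_tok">$result.x$</set>
    <set token="future_tok">$result.y$</set>
  </done>
</search>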

To use the tokens:

<panel>
  <title>New User Prediction</title>
  <viz type="Splunk_ML_Toolkit.ForecastViz">
    <search>
      <query>index=... | timechart span=1d count as NewUser | predict "NewUser" as prediction algorithm=LLP holdback=$holdback_tok$ future_timespan=$future_tok$ period=7 upper90=upper90 lower90=lower90 | eval prediction=round(prediction,0) | `forecastviz($future_tok$, $holdback_tok$, "NewUser", 90)`</query>
      <earliest>1</earliest>
      <latest>now</latest>
      <sampleRatio>1</sampleRatio>
    </search>
  </viz>
</panel>

Sukisen1981
Champion

Great work @jomulzer! I first used this approach with the cluster command, to pass the threshold value (t) dynamically based on a user selection, and it works... I was sure it would work for your use case as well.
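
A minimal sketch of that pattern (the input choices, token name, and index are just placeholders): a dropdown sets the token, and the search consumes it:

<input type="dropdown" token="t_tok">
  <label>Cluster threshold (t)</label>
  <choice value="0.6">0.6</choice>
  <choice value="0.8">0.8</choice>
  <default>0.8</default>
</input>

<search>
  <query>index=... | cluster t=$t_tok$ showcount=true | table cluster_count _raw</query>
</search>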


Sukisen1981
Champion

Assuming you ultimately want the panel in a dashboard, you can use tokens to pass the holdback value dynamically.
Of course, the token has to be assigned a value based on some condition. Even if the dashboard editor highlights the underlying XML in red, go ahead and try using tokens; it should work.
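
For example, a minimal sketch, assuming a hypothetical ratio_tok dropdown that drives a token-setting search:

<input type="dropdown" token="ratio_tok">
  <label>Holdback ratio</label>
  <choice value="0.1">10%</choice>
  <choice value="0.25">25%</choice>
  <default>0.1</default>
</input>

<search>
  <query>index=... | stats count as x | eval x=round(x*$ratio_tok$,0)</query>
  <done>
    <set token="holdback_tok">$result.x$</set>
  </done>
</search>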
