Hi
I am using MLTK for anomaly detection, so I am benchmarking algorithms. I was wondering if it is possible to optimize the hyperparameters in an automated way using techniques like random search, grid search, Bayesian optimization, etc.
I am also not sure whether this would be possible with real-time data. Has anyone done it before?
Thank you
Optimizing the hyperparameters in an automated way is not supported out of the box, but if your Splunk instance is a separate node and you won't impact production servers, you can write your own optimization via the ML-SPL API by creating a clone of the algorithm and inserting the search technique of your choice. Check out https://docs.splunk.com/Documentation/MLApp/4.2.0/API/Overview or the GitHub repo for more examples.
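As a rough illustration of what that clone could look like: a minimal sketch of a hypothetical GridDBSCAN algorithm that grid-searches eps inside fit(), assuming the BaseAlgo / convert_params interface described in the linked API docs (the class name, the eps grid, and the silhouette-based selection are all my own choices, not anything shipped with MLTK):

```python
# Hypothetical GridDBSCAN: a DBSCAN clone whose eps is picked by grid search.
# Drop into the app's bin/algos/ directory and register it per the ML-SPL docs.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.metrics import silhouette_score

from base import BaseAlgo                 # ML-SPL API base class
from util.param_util import convert_params


class GridDBSCAN(BaseAlgo):
    """DBSCAN with eps chosen by grid search on silhouette score."""

    def __init__(self, options):
        if not options.get('feature_variables'):
            raise RuntimeError('You must supply one or more fields')
        params = convert_params(options.get('params', {}), ints=['min_samples'])
        self.min_samples = params.get('min_samples', 5)

    def fit(self, df, options):
        X = df[options['feature_variables']].values.astype(float)

        best_eps, best_score = None, -1.0
        # Plain grid search over candidate eps values; swap in random
        # search or Bayesian optimization here if you prefer.
        for eps in np.linspace(0.1, 2.0, 20):
            labels = DBSCAN(eps=eps, min_samples=self.min_samples).fit_predict(X)
            n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
            if n_clusters < 2:
                continue  # silhouette score needs at least 2 clusters
            score = silhouette_score(X, labels)
            if score > best_score:
                best_eps, best_score = eps, score

        self.estimator = DBSCAN(eps=best_eps or 0.5,
                                min_samples=self.min_samples).fit(X)

        # Return labels to the search pipeline (-1 = noise/anomaly).
        # No model persistence here; add a register_codecs method
        # (see the API docs) if you want to save the model with `into`.
        output = df.copy()
        output['cluster'] = self.estimator.labels_.astype(str)
        return output
```

You would then call it like any other MLTK algorithm, e.g. `| fit GridDBSCAN min_samples=5 field1 field2`.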
I would not try to do so with an |apply step, which you seem to be hinting at, but partial_fit might be an option.
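For the real-time angle, partial_fit goes through the same API: if the clone defines a partial_fit method and persists its model, fitting with partial_fit=true should update the saved model incrementally instead of retraining from scratch. A minimal sketch, again assuming the ML-SPL interface and using scikit-learn's MiniBatchKMeans (the OnlineKMeans name is hypothetical):

```python
# Hypothetical OnlineKMeans: a MiniBatchKMeans clone supporting
# incremental updates via the ML-SPL partial_fit hook.
from sklearn.cluster import MiniBatchKMeans

from base import BaseAlgo
from util.param_util import convert_params


class OnlineKMeans(BaseAlgo):
    def __init__(self, options):
        self.feature_variables = options.get('feature_variables', [])
        params = convert_params(options.get('params', {}),
                                ints=['k'], aliases={'k': 'n_clusters'})
        self.estimator = MiniBatchKMeans(**params)

    def fit(self, df, options):
        X = df[self.feature_variables].values.astype(float)
        self.estimator.fit(X)

    def partial_fit(self, df, options):
        # Invoked for incremental fits; updates the existing model
        # with just the new batch of events.
        X = df[self.feature_variables].values.astype(float)
        self.estimator.partial_fit(X)

    def apply(self, df, options):
        X = df[self.feature_variables].values.astype(float)
        output = df.copy()
        output['cluster'] = self.estimator.predict(X).astype(str)
        return output

    # To save the model with `into` (required for partial_fit across
    # searches), you would also need a register_codecs method per the docs.
```

A scheduled search over recent events could then keep the model current without the cost of a full refit each run.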