This is a continuation of mlpack/mlpack#1475, ported to the ensmallen repository because it is the more appropriate place for it. It was originally opened by @FloopCZ with the following text:
Hi everyone, first of all, thank you for the huge amount of work you have done on this project.
After a brief pass through the CMA-ES implementation, I found a few inconsistencies with the documentation:

- The documentation of the `Optimize()` function states that the last parameter (`iterate`) is also used as the starting point of the algorithm. However, this parameter seems to be essentially ignored, and the starting point is generated randomly here.
- When the algorithm does not converge, it suggests using a smaller step size here, yet there is no way to change the step size.
- The `lowerBound` and `upperBound` variables don't seem to be true bounds on the variables. As far as I understand the code, they are only used to deduce the initial sigma value and the random starting point.
Have a nice day!
There are some responses in the original issue about how we should handle each point, but the issue has not yet been resolved.
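
To make the three points concrete, here is a minimal sketch of how the optimizer is typically invoked. It is written against the constructor and `Optimize()` signatures as I understand them around the time this issue was ported; the parameter names, defaults, and the toy objective function are assumptions for illustration, not a definitive description of the current code.

```cpp
// Minimal sketch; assumes the CMAES constructor
// CMAES(lambda, lowerBound, upperBound, batchSize, maxIterations, tolerance)
// that the issue refers to.  Signatures may differ in newer versions.
#include <ensmallen.hpp>
#include <iostream>

// A toy separable objective: f(x) = ||x||^2.
class SphereFunction
{
 public:
  size_t NumFunctions() const { return 1; }

  // Evaluate the objective terms in [begin, begin + batchSize).
  double Evaluate(const arma::mat& coordinates,
                  const size_t /* begin */,
                  const size_t /* batchSize */) const
  {
    return arma::accu(arma::square(coordinates));
  }

  // No-op; present only to satisfy the separable-function interface.
  void Shuffle() { }
};

int main()
{
  SphereFunction f;

  // Point 3: lowerBound / upperBound look like box constraints, but they are
  // only used to derive the initial sigma and a random starting point.
  ens::CMAES<> cmaes(0,     // lambda (0 = default population size)
                     -1,    // lowerBound
                     1,     // upperBound
                     32,    // batchSize
                     1000,  // maxIterations
                     1e-5); // tolerance
  // Point 2: there is no constructor parameter (or setter) for the initial
  // step size, even though the non-convergence warning suggests lowering it.

  // Point 1: `coordinates` is documented as the starting point, but the
  // implementation draws a random start instead, so this value is ignored.
  arma::mat coordinates = arma::ones<arma::mat>(5, 1) * 0.5;
  const double result = cmaes.Optimize(f, coordinates);

  std::cout << "final objective: " << result << std::endl;
}
```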