Dec 4, 2012 at 10:38 PM
Edited Dec 4, 2012 at 10:43 PM

It is not clear from the documentation how FunctionMath.FindMinimum produces its result. Is it using Newton's method? The Levenberg–Marquardt
algorithm? Some other algorithm? Thanks for your help.


Dec 5, 2012 at 10:52 PM
Edited Dec 5, 2012 at 11:04 PM

Currently, the 1D overload uses Brent's method (http://www.amazon.com/dp/0486419983) and the multidimensional overload uses Powell's conjugate-direction method (http://en.wikipedia.org/wiki/Powell%27s_method).
While we don't exactly want to hide these implementation details from you, we don't want you to rely on them either. In other words, FindMinimum promises to be the best minimization algorithm we have had the chance to implement, not any particular algorithm.
In the future, for example, we plan to replace the current multidimensional implementation with one that uses a model-trust approach (http://www.appliedmathematics.net/optimization/optimizationIntro.html).
The Levenberg–Marquardt algorithm, by the way, requires derivatives. Since we wanted to provide a minimization function that does not require the user to supply derivatives, we didn't consider it. (It's possible to internally compute derivatives numerically,
but doing so tends to be slower than using a method that was designed to be derivative-free.)
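For readers unfamiliar with derivative-free 1D minimization, here is a minimal golden-section sketch in Python that illustrates the bracketing idea underlying Brent's method (Brent's actual algorithm adds parabolic interpolation on top of this; none of this is Meta.Numerics code):

```python
import math

def golden_section_minimize(f, a, b, tol=1e-8):
    """Minimize a unimodal function f on (a, b) without derivatives.

    Each iteration shrinks the bracket [a, b] by the golden ratio,
    evaluating only interior points; the endpoints a and b themselves
    are never evaluated.
    """
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi, about 0.618
    x1 = b - invphi * (b - a)
    x2 = a + invphi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:
            # Minimum lies in [a, x2]; reuse x1 as the new right probe.
            b, x2, f2 = x2, x1, f1
            x1 = b - invphi * (b - a)
            f1 = f(x1)
        else:
            # Minimum lies in [x1, b]; reuse x2 as the new left probe.
            a, x1, f1 = x1, x2, f2
            x2 = a + invphi * (b - a)
            f2 = f(x2)
    return (a + b) / 2
```

Golden-section convergence is only linear; Brent's method gets superlinear convergence near the minimum by fitting a parabola through recent points, falling back to golden-section steps when the fit misbehaves.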



Ok, thanks for the info!



So is the minimization algorithm supposed to handle strict barriers well (as in, it tries a value or values outside a certain range and gets double.MaxValue or some such as its cost), as well as more general piecewise functions?



The one-dimensional version of FindMinimum has an overload that takes an Interval. The minimum is found within that interval, and the actual interval endpoints are never evaluated.
The multidimensional version of FindMinimum currently does not accept constraints. It will follow wherever in R^N the algorithm points it, starting from the initial guess it is given. One reason we would like to move to a model-trust algorithm is that such
algorithms can incorporate constraints straightforwardly, whereas the current multidimensional algorithm cannot.
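Until constraints are supported, one common workaround is to reparameterize: map each unconstrained coordinate into the box through a smooth squashing function, so the optimizer can roam all of R^n while the objective only ever sees points strictly inside the bounds. A sketch in Python (the names `boxed` and `squash` are illustrative, not Meta.Numerics API):

```python
import math

def boxed(f, lo, hi):
    """Wrap f so an unconstrained optimizer can search the box [lo, hi]^n.

    Each unconstrained coordinate t in R is mapped into (lo_i, hi_i) by a
    logistic transform, so the bounds themselves are never evaluated.
    """
    def squash(t, a, b):
        # t = 0 maps to the midpoint; t -> +/-inf approaches b / a.
        return a + (b - a) / (1.0 + math.exp(-t))

    def wrapped(ts):
        xs = [squash(t, a, b) for t, (a, b) in zip(ts, zip(lo, hi))]
        return f(xs)

    return wrapped
```

One caveat: the transform distorts the geometry near the bounds (the gradient of the squashed coordinates vanishes there), so an optimizer can converge slowly if the true minimum sits on or near a bound.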


Feb 11, 2013 at 4:57 PM
Edited Feb 19, 2013 at 6:09 PM

Sounds good. Thanks for your response. I'm not sure if it would be helpful, but I have used this version of Powell's method in the recent past to great effect:
https://github.com/cureos/csbobyqa
Perhaps you could suggest to the author of that code that they could merge it in with yours? That would allow for constraints and a lot of other options in the minimization.



Sorry, the link wasn't working properly there. Hope it helps!



This looks extremely promising. Thanks for the pointer (and thanks to cureos himself, who has made helpful suggestions for Meta.Numerics on multiple occasions.)



Ichbin, I did not want to "barge in" to this discussion before you had a chance to look at
csbobyqa, but I have actually made a first attempt at incorporating this solver into my
"portable" (.NET, Windows Store apps, Windows Phone and Silverlight) copy of Meta Numerics. Please have a look at
this changeset. If you find this changeset a useful starting point, please feel free to grab it and incorporate it into your main library. If you need help adapting the code further, do not hesitate to ask :)
Best regards,
Anders @ Cureos



Did anything happen with this? @cureos: It seems like you removed the csbobyqa code from your version of Meta Numerics again?



A version of Powell's BOBYQA algorithm is checked in and will be part of the release currently being prepared. On typical functions, it requires 2x to 5x fewer evaluations than the old algorithm, so we are very pleased with it.
The one aspect of the algorithm that is missing from the current code is the ability to set constraints. We know how to add it, but we weren't willing to push this release back any further than we already had in order to implement it. We do intend to add constraints
in the next release.
We have also added a global optimization algorithm which does support constraints. So a good workaround for a while will be to put the function into the global optimizer with constraints and demand a low-precision solution. Then put that solution as a starting
point into the local optimizer without constraints and demand a high-precision solution.
In the current tree, the relevant methods are MultiFunctionMath.FindLocalMinimum and MultiFunctionMath.FindGlobalMinimum.
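The two-stage workaround described above can be sketched in Python; here a crude random search stands in for the constrained global stage and a simple pattern search for the unconstrained local stage (both function names are illustrative stand-ins, not the Meta.Numerics API):

```python
import random

def crude_global_minimum(f, bounds, samples=2000, seed=0):
    """Low-precision global stage: random search inside the box bounds."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(samples):
        x = [rng.uniform(a, b) for a, b in bounds]
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x

def local_refine(f, x0, step=0.1, tol=1e-7):
    """High-precision local stage: unconstrained coordinate pattern search."""
    x = list(x0)
    fx = f(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step /= 2  # no move helped; tighten the search radius
    return x
```

Usage mirrors the suggested workflow: `x0 = crude_global_minimum(f, bounds)` for a rough constrained answer, then `local_refine(f, x0)` to polish it without constraints.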



@roederja To be as true as possible to the original Meta Numerics library, I removed BOBYQA from my Portable Class Library version of Meta Numerics at
Github. The most recent version of BOBYQA for .NET, along with COBYLA and LINCOA, is available in my own library
csnumerics.
@ichbin Browsing through the source code here on CodePlex, I cannot immediately track down
MultiFunctionMath.FindLocalMinimum. In which source file should this method be located? According to the latest commit, you are using/planning to use UOBYQA, which only solves entirely unconstrained problems, whereas BOBYQA solves bound-constrained
problems. Is the commit message correct, or are you actually incorporating BOBYQA into Meta Numerics? If you are going for UOBYQA, please note that Michael Powell developed a successor to UOBYQA named
NEWUOA, which generally should be a better choice performance-wise.

