
Are there any plans to include Automatic Differentiation to the library?
It would really be nice and help developers prototype many numeric algorithms, such as Gradient Descent.
I already have a library that supports some basic Automatic Differentiation that I developed for a project that I am working on. I would be glad to contribute it to this project.



Yes, there are plans. (See bug #4420.) If you look in FunctionMath_Differentiate.cs, you will see that we have an implementation (at least for functions of a single variable), but it was pulled from the 1.5.0 release at the last moment because the error
estimates it was returning were unreliable. I would love to take a look at your implementation and see whether it can solve our error-estimate problems. Thanks for offering!
(If by automatic differentiation you meant symbolic differentiation rather than numeric differentiation, then I've misunderstood.)



It's something in between symbolic and numeric differentiation. "Symbolic differentiation" usually means: given a symbolic representation of a function, find a symbolic representation of its gradient.
"Automatic differentiation" means: given a symbolic representation of a function and an explicit point, find its gradient at that point. This is what optimization algorithms usually need; it can be done much more easily
and quickly, and it retains most of the numeric accuracy of symbolic differentiation.
Here is an example of what I mean. I have a library that does something similar; I can send you its source, and you may find it useful.
// define three variables
var x = new Variable();
var y = new Variable();
var z = new Variable();
// define a symbolic function. Heavy use of operator overloading.
var symbolicFunc = 2*x + y*SymbMath.Cos(z);
// differentiate the function at x = 1, y = 2, z = 3
Variable[] vars = {x, y, z};
double[] pnt = {1, 2, 3};
double[] gradient = Differentiator.Differentiate(symbolicFunc, vars, pnt);
// do something with the gradient.
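To make the idea concrete, here is a minimal sketch of the same computation using forward-mode automatic differentiation with dual numbers. This is not the library above (that is C#; this sketch is Python, and all names here are illustrative), but it shows the general technique: each value carries its derivative along, and the chain rule is applied operation by operation, so the gradient at a point comes out to machine precision without ever building a symbolic gradient.

```python
import math

# A dual number carries a value and a derivative with respect to one chosen variable.
class Dual:
    def __init__(self, val, der=0.0):
        self.val = val  # function value
        self.der = der  # derivative carried alongside the value

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (u*v)' = u'*v + u*v'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def cos(d):
    d = d if isinstance(d, Dual) else Dual(d)
    # chain rule: (cos u)' = -sin(u) * u'
    return Dual(math.cos(d.val), -math.sin(d.val) * d.der)

# the same function as in the C# example: f(x, y, z) = 2*x + y*cos(z)
def f(x, y, z):
    return 2*x + y*cos(z)

def gradient(func, point):
    # differentiate with respect to each variable in turn
    # by seeding that variable's derivative to 1
    grads = []
    for i in range(len(point)):
        args = [Dual(v, 1.0 if j == i else 0.0)
                for j, v in enumerate(point)]
        grads.append(func(*args).der)
    return grads

# gradient at x = 1, y = 2, z = 3: [2, cos(3), -2*sin(3)]
print(gradient(f, [1.0, 2.0, 3.0]))
```

Note that each pass through the function yields one partial derivative, so an n-variable gradient costs n evaluations; reverse-mode AD gets the whole gradient in one backward pass, which matters for large n.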

