Non-Quadratic Conjugate Gradient (NQCG)
The solver provided by Cressel Anderson implements Zangwill's algorithm and uses a Davies-Swann-Campey line search step [3]. Zangwill's algorithm, which builds on Powell's algorithm, finds a set of conjugate directions through a series of line searches. Unlike Powell's method, however, Zangwill's checks the conjugate directions it collects for linear independence. A major advantage of this class of algorithms is that neither gradient nor Hessian information is required.
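As a rough illustration of this class of method, the sketch below runs Powell/Zangwill-style cycles of derivative-free line searches. It is a simplified stand-in, not the provided NQCG code: scipy.optimize.minimize_scalar replaces the Davies-Swann-Campey step, and a simple rank check approximates Zangwill's linear-independence test.

import numpy as np
from scipy.optimize import minimize_scalar

def conjugate_direction_minimize(f, x0, tol=1e-8, max_cycles=50):
    # Powell/Zangwill-style conjugate-direction search built from
    # derivative-free one-dimensional line searches.
    x = np.asarray(x0, dtype=float)
    n = x.size
    directions = [np.eye(n)[i] for i in range(n)]   # initial coordinate directions
    for _ in range(max_cycles):
        x_start = x.copy()
        for d in directions:
            # line search along d; stands in for the Davies-Swann-Campey step
            alpha = minimize_scalar(lambda a: f(x + a * d)).x
            x = x + alpha * d
        new_dir = x - x_start                       # composite direction of the cycle
        if np.linalg.norm(new_dir) < tol:
            break
        # crude stand-in for Zangwill's independence test: adopt the new
        # direction only if the updated set remains linearly independent
        candidate = directions[1:] + [new_dir / np.linalg.norm(new_dir)]
        if np.linalg.matrix_rank(np.column_stack(candidate)) == n:
            directions = candidate
    return x

# usage on a small test function (Rosenbrock)
rosen = lambda v: (1 - v[0])**2 + 100 * (v[1] - v[0]**2)**2
print(conjugate_direction_minimize(rosen, [-1.2, 1.0]))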
SciPy's Levenberg-Marquardt Least Squares
In this project, I have decided to use the SciPy package's built-in Levenberg-Marquardt algorithm to solve the previously described optimization problem. Its main benefit in this case is a substantial reduction in execution time together with higher accuracy. This conclusion was reached by running the NQCG algorithm to convergence and comparing it against the SciPy optimizer. Using 1000 data points on a 3-link manipulator, the provided NQCG implementation took 2933 seconds, while Levenberg-Marquardt took 231 seconds. The final objective function value, i.e. the residual error, was 7.15 for NQCG and 3.8E-13 for Levenberg-Marquardt.
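A minimal example of the SciPy call in question is given below. The forward-kinematics model here is a hypothetical planar 2-link toy problem, not the project's actual 3-link residual; only the scipy.optimize.least_squares interface with method='lm' is the point of the example.

import numpy as np
from scipy.optimize import least_squares

def fk_planar_2link(link_lengths, joint_angles):
    # Toy forward kinematics for a planar 2-link arm: end-effector (x, y)
    # for each joint configuration (one row per data point).
    l1, l2 = link_lengths
    t1, t2 = joint_angles[:, 0], joint_angles[:, 1]
    x = l1 * np.cos(t1) + l2 * np.cos(t1 + t2)
    y = l1 * np.sin(t1) + l2 * np.sin(t1 + t2)
    return np.column_stack([x, y])

def residuals(params, joint_angles, measured_xy):
    # Flattened residual vector: predicted minus measured end-effector positions.
    return (fk_planar_2link(params, joint_angles) - measured_xy).ravel()

rng = np.random.default_rng(0)
true_lengths = np.array([1.0, 0.7])
angles = rng.uniform(-np.pi, np.pi, size=(1000, 2))
measured = fk_planar_2link(true_lengths, angles)

# method='lm' selects the MINPACK Levenberg-Marquardt implementation
result = least_squares(residuals, x0=np.array([0.5, 0.5]),
                       args=(angles, measured), method='lm')
print(result.x, result.cost)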
SQP Solver
For the project I have also implemented an SQP solver that handles both inequality and equality constraints, based on Algorithm 15.4 in [3]. To solve the associated quadratic programming subproblem, I implemented the nonfeasible-initialization primal-dual path-following method (Algorithm 13.3), which in turn relies on an implementation of golden-section line search (a sketch is given below). Because Algorithm 13.3 does not natively handle inequality constraints, the SQP subproblem involving both inequality and equality constraints was reformulated as a purely equality-constrained problem by introducing slack variables.
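The slack-variable step is the standard transformation: each inequality a_i^T x <= b_i is rewritten as a_i^T x + s_i = b_i with s_i >= 0, so that only equality constraints (plus nonnegativity on the slacks) remain. For the line search, the following is a minimal golden-section sketch; the interface (function, bracketing interval, tolerance) is my own assumption and not taken from the project code.

import math

def golden_section_search(f, a, b, tol=1e-8, max_iter=200):
    # Minimize a unimodal scalar function f on the interval [a, b].
    inv_phi = (math.sqrt(5) - 1) / 2         # 1/phi, about 0.618
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    fc, fd = f(c), f(d)
    for _ in range(max_iter):
        if abs(b - a) < tol:
            break
        if fc < fd:                          # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - inv_phi * (b - a)
            fc = f(c)
        else:                                # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + inv_phi * (b - a)
            fd = f(d)
    return 0.5 * (a + b)

# usage: step length for a one-dimensional restriction of the objective
print(golden_section_search(lambda t: (t - 2.0) ** 2, 0.0, 5.0))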
[3] A. Antoniou and W.-S. Lu, Practical Optimization: Algorithms and Engineering Applications. Springer, 2007.