MATLAB Answers

What is the reason that a Hessian should be positive definite? And what is the relation between an ill-conditioned Hessian and convergence?

S.
S. on 23 Mar 2020
Commented: S. on 23 Mar 2020
What is exactly the reason that a Hessian should be positive definite?
Furthermore, if there are polyhedral input and state constraints, the constraints describe a region ''the polyhedron'' in which the algorithm should search for a solution.
If your Hessian is ill-conditioned and the solver therefore converges slowly, the ''interior-point'' algorithm cannot find a solution in this polyhedron and jumps around in the solution space. Why exactly does it jump around?


Answers (1)

John D'Errico
John D'Errico on 23 Mar 2020
Edited: John D'Errico on 23 Mar 2020
(This is not really a MATLAB question, but you are asking about the behavior of MATLAB solvers. Therefore close enough.)
There is no reason that a Hessian MUST be positive definite, except at the solution. In fact, in general, the local Hessian may not be positive definite if you computed it directly. For example, suppose you started the solver near a local maximum: the true Hessian there would in fact be negative definite LOCALLY. But the solver's approximation to the Hessian would still be positive definite.
So at a local minimizer of a function, the Hessian must be positive definite. At other locations, the true Hessian need not be. However, if you are using a method that updates estimates of the Hessian, then the updates will ensure that the sequence of Hessian approximations is in general positive definite. (I seem to recall that.) You can prove for a line search method (again, as I recall from long ago) that, as long as each search direction is a descent direction, quasi-Newton methods like BFGS will always maintain positive definiteness of the Hessian approximation.
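The two claims above can be sketched numerically: at a saddle point the true Hessian is indefinite, yet a BFGS update started from a positive definite matrix stays positive definite as long as the curvature condition s'y > 0 holds (a Wolfe line search enforces this). A minimal sketch, in Python/NumPy rather than MATLAB for self-containedness; the linear algebra is the same:

```python
import numpy as np

# True Hessian of f(x, y) = x^2 - y^2 at the origin: a saddle point,
# so the exact Hessian is indefinite (eigenvalues +2 and -2).
H_true = np.array([[2.0, 0.0], [0.0, -2.0]])
print(np.linalg.eigvalsh(H_true))   # one eigenvalue is negative

# BFGS update of a Hessian approximation B:
#   B+ = B - (B s s^T B) / (s^T B s) + (y y^T) / (y^T s)
# If B is positive definite and the curvature condition y^T s > 0 holds,
# B+ is guaranteed to stay positive definite.
def bfgs_update(B, s, y):
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

rng = np.random.default_rng(0)
B = np.eye(2)                       # start from a positive definite guess
for _ in range(20):
    s = rng.standard_normal(2)      # synthetic step
    y = s + 0.1 * rng.standard_normal(2)   # synthetic gradient change
    if y @ s > 0:                   # curvature condition (Wolfe conditions)
        B = bfgs_update(B, s, y)

print(np.linalg.eigvalsh(B))        # all eigenvalues remain positive
```

The steps and gradient changes here are synthetic random vectors, not taken from a real solver run; the point is only that the update formula preserves positive definiteness whenever the curvature condition is satisfied.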
In the case where the Hessian estimates are nearly singular during the iterations, the solver can jump around, because the system of equations it solves to choose a new search step is not well posed. So you get garbage.
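To see why that system becomes ill-posed: a Newton-type method chooses its step p by solving H p = -g. When H has a near-zero eigenvalue, its condition number is huge, and tiny changes in the gradient produce enormous changes in the step. A small illustration (Python/NumPy for self-containedness; the numbers are made up to exaggerate the effect):

```python
import numpy as np

# Newton-type methods choose the search step p by solving  H p = -g.
# A nearly singular H has a huge condition number, so the solve is
# extremely sensitive to perturbations in g.
H = np.array([[1.0, 0.0], [0.0, 1e-10]])    # nearly singular Hessian
print(np.linalg.cond(H))                     # ~1e10

g1 = np.array([1.0,  1e-6])
g2 = np.array([1.0, -1e-6])                  # almost the same gradient

p1 = np.linalg.solve(H, -g1)
p2 = np.linalg.solve(H, -g2)
print(np.linalg.norm(p1 - p2))               # the two steps differ wildly
```

Two gradients that agree to six decimal places yield steps pointing in nearly opposite directions along the weak eigendirection, which is exactly the "jumping around" behavior described above.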

  1 Comment

S.
S. on 23 Mar 2020
However, why does quadprog then ask for a Hessian that is positive definite? I am using quadprog with the ''interior-point-convex'' algorithm to solve a convex QP problem, so the algorithm is searching for a global minimum. Should the Hessian be positive definite at this global minimum?
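One observation that bears on this question: for a quadratic program q(x) = 0.5 x'Hx + f'x, the Hessian is the constant matrix H, so the problem is convex exactly when H is positive semidefinite everywhere, not just at the minimizer. With an indefinite H the objective is unbounded below along any direction of negative curvature, so no global minimum exists. A sketch (Python/NumPy for self-containedness, with made-up matrices):

```python
import numpy as np

# QP objective  q(x) = 0.5 x^T H x + f^T x;  its Hessian is the constant H,
# so convexity of the whole problem hinges on H being positive semidefinite.
def q(H, f, x):
    return 0.5 * x @ H @ x + f @ x

f = np.array([0.0, 0.0])
H_pd  = np.array([[2.0, 0.0], [0.0,  2.0]])   # positive definite: convex
H_ind = np.array([[2.0, 0.0], [0.0, -2.0]])   # indefinite: not convex

d = np.array([0.0, 1.0])    # direction of negative curvature for H_ind
for t in [1.0, 10.0, 100.0]:
    print(q(H_pd, f, t * d), q(H_ind, f, t * d))
# along d, q grows for H_pd but decreases without bound for H_ind,
# so the indefinite QP has no (unconstrained) minimizer at all
```

This is why a convex-QP solver checks H up front: positive (semi)definiteness of the constant Hessian is what guarantees that the minimum it finds is global.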
"However, if you are using a method that updates estimates of the Hessian, then the updates will ensure that the sequences of Hessian approximations are in general pos def."
Why is that the case?
"In the case where the Hessian estimates are nearly singular during the iterations, then the solver can jump around because the system of equations it solves to choose a new search step are not well posed. So you get garbage."
Which system of equations exactly, and why is it not well posed?
Additional question:
Furthermore, it seems that quadprog's (conventional) interior-point algorithm is not exploiting the sparsity and structure of the sparse QP formulation. Do you have a reason for that?
In MPC, the computational complexity should scale linearly with the prediction horizon N. However, results show that the complexity scales quadratically with N.

