# Solve Nonlinear System Without and Including Jacobian

This example shows the reduction in function evaluations when you provide derivatives for a system of nonlinear equations. As explained in Writing Vector and Matrix Objective Functions, the Jacobian $J\left(x\right)$ of a system of equations $F\left(x\right)$ is ${J}_{ij}\left(x\right)=\frac{\partial {F}_{i}\left(x\right)}{\partial {x}_{j}}$. Provide this derivative as the second output of your objective function.

For example, the multirosenbrock function is an $n$-dimensional generalization of Rosenbrock's function (see Solve a Constrained Nonlinear Problem, Problem-Based) for any positive, even value of $n$:

$\begin{array}{l}F\left(1\right)=1-{x}_{1}\\ F\left(2\right)=10\left({x}_{2}-{x}_{1}^{2}\right)\\ F\left(3\right)=1-{x}_{3}\\ F\left(4\right)=10\left({x}_{4}-{x}_{3}^{2}\right)\\ ⋮\\ F\left(n-1\right)=1-{x}_{n-1}\\ F\left(n\right)=10\left({x}_{n}-{x}_{n-1}^{2}\right).\end{array}$

The solution of the equation system $F\left(x\right)=0$ is the point ${x}_{i}=1$, $i=1\dots n$.

For this objective function, all the Jacobian terms ${J}_{ij}\left(x\right)$ are zero except the terms where $i$ and $j$ differ by at most one. For odd values of $i$, the nonzero terms are

$\begin{array}{l}{J}_{ii}\left(x\right)=-1\\ {J}_{\left(i+1\right)i}\left(x\right)=-20{x}_{i}\\ {J}_{\left(i+1\right)\left(i+1\right)}\left(x\right)=10.\end{array}$
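For instance, with $n=4$ the Jacobian is block diagonal, with one $2×2$ block for each odd-even pair of components:

$J\left(x\right)=\left[\begin{array}{cccc}-1& 0& 0& 0\\ -20{x}_{1}& 10& 0& 0\\ 0& 0& -1& 0\\ 0& 0& -20{x}_{3}& 10\end{array}\right].$

This sparsity is why the helper function below assembles the Jacobian from three sparse matrices.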

The multirosenbrock helper function at the end of this example creates the objective function $F\left(x\right)$ and its Jacobian $J\left(x\right)$.

Solve the system of equations starting from the point ${x}_{i}=-1.9$ for odd values of $i$, and ${x}_{i}=2$ for even values of $i$. Specify $n=64$.

n = 64;
x0(1:n,1) = -1.9;
x0(2:2:n,1) = 2;
[x,F,exitflag,output,JAC] = fsolve(@multirosenbrock,x0);
Equation solved.

fsolve completed because the vector of function values is near zero
as measured by the value of the function tolerance, and
the problem appears regular as measured by the gradient.

Examine the distance of the computed solution x from the true solution, and the number of function evaluations that fsolve takes to compute the solution.

disp(norm(x-ones(size(x))))
0
disp(output.funcCount)
1043

fsolve finds the solution, but takes over 1000 function evaluations to do so.

Solve the system of equations again, this time using the Jacobian. To do so, set the 'SpecifyObjectiveGradient' option to true using optimoptions, and pass the resulting options structure to fsolve.

opts = optimoptions('fsolve','SpecifyObjectiveGradient',true);
[x2,F2,exitflag2,output2,JAC2] = fsolve(@multirosenbrock,x0,opts);
Equation solved.

fsolve completed because the vector of function values is near zero
as measured by the value of the function tolerance, and
the problem appears regular as measured by the gradient.

Again, examine the distance of the computed solution x2 from the true solution, and the number of function evaluations that fsolve takes to compute the solution.

disp(norm(x2-ones(size(x2))))
0
disp(output2.funcCount)
21

fsolve returns the same solution as before, but takes about 20 function evaluations to do so, rather than over 1000. In general, supplying the Jacobian can lower the number of function evaluations and provide increased robustness, although this example does not show improved robustness.
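Before relying on a hand-coded Jacobian, it can be worth checking it against finite-difference estimates. One way to do this (a sketch; the comparison is performed once, at the initial point) is to also set the 'CheckGradients' option:

optsCheck = optimoptions('fsolve', ...
    'SpecifyObjectiveGradient',true, ...
    'CheckGradients',true);
% fsolve compares the supplied Jacobian to a finite-difference
% approximation at x0 and errors if they disagree significantly.
fsolve(@multirosenbrock,x0,optsCheck);

A Jacobian with a sign error or a misplaced index can still let the solver converge on easy problems while silently slowing it down, so this check is cheap insurance.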

### Helper Function

This code creates the multirosenbrock helper function.

function [F,J] = multirosenbrock(x)
% Get the problem size
n = length(x);
if n == 0, error('Input vector, x, is empty.'); end
if mod(n,2) ~= 0
    error('Input vector, x, must have an even number of components.');
end
% Evaluate the vector function
odds  = 1:2:n;
evens = 2:2:n;
F = zeros(n,1);
F(odds,1)  = 1 - x(odds);
F(evens,1) = 10.*(x(evens) - x(odds).^2);
% Evaluate the Jacobian matrix if nargout > 1
if nargout > 1
    c = -ones(n/2,1);    C = sparse(odds,odds,c,n,n);
    d = 10*ones(n/2,1);  D = sparse(evens,evens,d,n,n);
    e = -20.*x(odds);    E = sparse(evens,odds,e,n,n);
    J = C + D + E;
end
end