%%%%%% Nonlinear Equations with Analytic Jacobian %%%%%%%%%%%%%%%%%%%%%%%
%
% This example shows the use of the default medium-scale fsolve algorithm.
% It is intended for problems where
% 1. The system of nonlinear equations is square, i.e., the number of
% equations equals the number of unknowns.
% 2. There exists a solution x such that F(x)=0.
%
% The example uses "fsolve" to obtain the minimum of the Rosenbrock function
% by deriving and then solving an equivalent system of nonlinear equations.
% The Rosenbrock function, which has a minimum at F(x)=0, is a common test
% problem in optimization. It has a high degree of nonlinearity and
% converges extremely slowly if you try to use steepest descent type
% methods. It is given by:
% f(x)=100((x_2)-(x_1)^2)^2+(1-(x_1))^2
% First we generalize this function to an n-dimensional function,
% for any positive, even value of n:
% f(x)=SUM_i [100((x_2i)-(x_2i-1)^2)^2+(1-(x_2i-1))^2]
% where i=1,...,n/2.
%
% Before we can use "fsolve" to find the values of x such that F(x)=0,
% i.e., obtain the minimum of the generalized Rosenbrock function,
% we must rewrite the function as the following equivalent system of
% nonlinear equations:
% F(1)=1-(x_1)
% F(2)=10((x_2)-(x_1)^2)
% F(3)=1-(x_3)
% F(4)=10((x_4)-(x_3)^2)
% :
% :
% F(n-1)=1-(x_n-1)
% F(n)=10((x_n)-(x_n-1)^2)
%
% This system is square, and we can use fsolve to solve it.
% As the example demonstrates, this system has a unique solution
% given by (x_i)=1, i=1,...,n.
%
% See the function file rosenbrockobj.m, with signature [F,J] = rosenbrockobj(x).
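% The file rosenbrockobj.m is referenced but not shown here. A minimal
% sketch consistent with the system F(1)...F(n) above, returning both the
% residual vector and its analytic Jacobian, might look like this:
%
%   function [F,J] = rosenbrockobj(x)
%   % Evaluate the Rosenbrock system F(x) and, optionally, its Jacobian J.
%   n = length(x);
%   F = zeros(n,1);
%   F(1:2:n) = 1 - x(1:2:n);                  % F(2i-1) = 1 - x_(2i-1)
%   F(2:2:n) = 10*(x(2:2:n) - x(1:2:n).^2);   % F(2i) = 10(x_(2i) - x_(2i-1)^2)
%   if nargout > 1
%       % Sparse analytic Jacobian: nonzeros lie in 2-by-2 diagonal blocks.
%       J = sparse(n,n);
%       J(sub2ind([n n],1:2:n,1:2:n)) = -1;            % dF(2i-1)/dx_(2i-1)
%       J(sub2ind([n n],2:2:n,1:2:n)) = -20*x(1:2:n);  % dF(2i)/dx_(2i-1)
%       J(sub2ind([n n],2:2:n,2:2:n)) = 10;            % dF(2i)/dx_(2i)
%   end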
n = 64;
x0(1:n,1) = -1.9;
x0(2:2:n,1) = 2;
options=optimset('Display','iter','Jacobian','on');
[x,F,exitflag,output,JAC] = fsolve(@rosenbrockobj,x0,options);
% The starting point uses x(i)=-1.9 for the odd indices and x(i)=2 for
% the even indices. We use the fsolve default 'off' for the LargeScale
% option, which selects the default medium-scale nonlinear equation
% algorithm 'dogleg'. Setting 'Jacobian' to 'on' tells fsolve to use the
% analytic Jacobian defined in rosenbrockobj.m.
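% After fsolve returns, the result can be checked against the known
% solution x_i = 1 (a sketch; a positive exitflag indicates convergence):
%
%   if exitflag > 0
%       fprintf('max|x - 1| = %g, ||F|| = %g\n', max(abs(x - 1)), norm(F));
%   end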
% Copyright (c) The MathWorks, Inc. MATLAB 7.