ConjGrad:

--------------------------------------------------------------------------
   Solves the least squares problem for

   J     = 0.5*rho'*W*rho + 0.5*(x-xA)'*S0*(x-xA)
   rho   = y - h(x)
   dh/dx = H(x)
   dJ/dx = g = -H'*W*rho + S0*(x-xA)
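
   For concreteness, a minimal sketch of evaluating this cost and
   gradient in MATLAB, assuming rho, H, and W have already been
   returned by the measurement function F:

   % Sketch only: rho, H, W come from the user-supplied function F;
   % S0, x, and xA are as defined above.
   e = x - xA;                        % Deviation from the a priori state
   J = 0.5*rho'*W*rho + 0.5*e'*S0*e;  % Quadratic cost
   g = -H'*W*rho + S0*e;              % Gradient dJ/dx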

   Uses the conjugate gradient method to minimize this cost.

   The next step is

   x(k+1) - x(k) = alpha*d(k)

   where

   d(k) = -g(k) + [g(k)'*(g(k) - g(k-1))/(g(k-1)'*g(k-1))]*d(k-1)

   The step length alpha is found by minimizing J with respect to
   alpha at each step. The bracketed factor is the Polak-Ribiere beta;
   with steepest descent (sd nonzero) it is omitted so that d(k) = -g(k).
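
   A minimal sketch of one iteration, assuming a current gradient g,
   previous gradient gOld, previous direction dOld, and the user cost
   function handle CF described below; the fminbnd bracket [0, 1] is an
   assumption for illustration, not necessarily what ConjGrad uses:

   beta  = g'*(g - gOld)/(gOld'*gOld);  % Polak-Ribiere beta
   d     = -g + beta*dOld;              % New search direction
   alpha = fminbnd( @(a) CF(a, x, d, S0, xA), 0, 1 );
   x     = x + alpha*d;                 % Take the step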

--------------------------------------------------------------------------
   Form:
   [x, k, P, wmr, sr, J, sig, nz] = ConjGrad( F, CF, S0, xA, kX, tol, prog, sd )
--------------------------------------------------------------------------

   ------
   Inputs
   ------
   F                      Measurement function: [rho,H,W,jL] = F(xA)
   CF                     Cost along a direction: [J] = CF( alpha, x0, d, S0, xA )
   S0                     A priori state information matrix (weights x-xA in J)
   xA                     A priori state
   kX                     States to be found
   tol                    Cost tolerance
   prog                   If nonzero, print progress reports
   sd                     If nonzero, use steepest descent instead of
                          conjugate gradient

   -------
   Outputs
   -------
   x                      Matrix of state vectors
   k                      Number of iterations
   P                      Covariance matrix: inv[S0 + H'*W*H]
   wmr                    Weighted mean of the residuals
   sr                     Weighted rms deviation of the residuals
   J                      Loss estimate
   sig                    Uncertainty in the estimates
   nz                     Number of measurements used
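
   Example (all names and values below are illustrative; myMeas and
   myCost stand in for user-supplied F and CF):

   xA        = [1; 0];             % A priori state
   S0        = eye(2);             % A priori weighting on x - xA
   kX        = [1; 2];             % Solve for both states
   [x, k, P] = ConjGrad( @myMeas, @myCost, S0, xA, kX, 1e-6, 1, 0 );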

--------------------------------------------------------------------------
   References: Strang, G., Introduction to Applied Mathematics,
               Wellesley-Cambridge Press, 1986, pp. 378-379.
--------------------------------------------------------------------------

Children:

Common: General/IsVersionAfter