nonlinear least squares of a multivariate function


Posted by Michele Milano on September 24, 1998 at 21:39:00:

Hello,

I'm experiencing some convergence problems with lmder.
If someone has run into this before, I'd like to know
how to tune the parameters correctly, or to hear a
suggestion on how to handle the problem in some other
way, in case I'm misusing this subroutine somehow. The
problem is the following:

I have to do a least-squares fit to this model
(vectors are uppercase, scalars lowercase):

Y(i) = { yj[P,X(i)] }

where:
Y is the model output (a vector with n components)
yj[.] is the jth component of Y
j=1,...,n ranges on the n components of Y
i=1,...,m ranges on the m input/output pairs
P is the parameter vector
X(i) is the ith model input

I use lmder in the following way:
I write the objective function to minimize as:

SUM(i=1..m) SUM(j=1..n) ( dj(i) - yj[P,X(i)] )^2

where dj(i) is the jth component of the ith observed
output vector D(i).

So the total number of nonlinear functions to consider
is n*m, and the Jacobian matrix has n*m rows as well.
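
To make this concrete, here is a minimal sketch of the stacking in
Python, using SciPy's leastsq (which calls MINPACK's lmder when an
analytic Jacobian is passed); the two-output model and the synthetic
data are only stand-ins for my real problem:

    import numpy as np
    from scipy.optimize import leastsq

    # Hypothetical stand-in model with n = 2 output components:
    #   y1 = p0*exp(p1*x),  y2 = p0*x + p2
    def model(p, x):
        return np.column_stack((p[0] * np.exp(p[1] * x),
                                p[0] * x + p[2]))            # shape (m, n)

    # lmder wants one residual per (i, j) pair, so the (m, n) array of
    # differences d - y is flattened into one vector of length m*n.
    def residuals(p, x, d):
        return (d - model(p, x)).ravel()

    # Analytic Jacobian of the residual vector: n*m rows, len(P) columns,
    # with rows interleaved the same way ravel() interleaves the residuals.
    def jac(p, x, d):
        e = np.exp(p[1] * x)
        J = np.empty((2 * x.size, 3))
        J[0::2] = np.column_stack((-e, -p[0] * x * e, np.zeros_like(x)))
        J[1::2] = np.column_stack((-x, np.zeros_like(x), -np.ones_like(x)))
        return J

    # Synthetic stand-ins for the m observed input/output pairs X(i), D(i).
    x_data = np.linspace(0.0, 1.0, 50)
    rng = np.random.default_rng(0)
    d_data = (model(np.array([2.0, -1.0, 0.5]), x_data)
              + 0.01 * rng.standard_normal((50, 2)))

    p_fit, ier = leastsq(residuals, [1.0, -0.5, 0.0],
                         args=(x_data, d_data), Dfun=jac)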

Now, my problem is that with one output component the
subroutine works fine, i.e. it always succeeds in
finding a reasonable minimum; when I increase the
number of outputs it stops very early, reporting
that the tolerance on the parameters (xtol) is too small.
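
The parameters I have been trying to tune are lmder's three stopping
tolerances; through the same SciPy wrapper as in the sketch above they
would be passed like this (the values are purely illustrative):

    # ftol, xtol and gtol map onto lmder's tolerance arguments of the
    # same names; the values below are purely illustrative.
    p_fit, ier = leastsq(residuals, [1.0, -0.5, 0.0],
                         args=(x_data, d_data), Dfun=jac,
                         ftol=1e-10,   # target relative reduction in the sum of squares
                         xtol=1e-10,   # target relative error in the parameters P
                         gtol=1e-12)   # target orthogonality of residuals to Jacobian columns
    # In raw lmder, info = 7 is the "xtol is too small" return: no
    # further improvement in the approximate solution x is possible.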

If I start from this point and continue the minimization
with the (in)famous Numerical Recipes Levenberg-Marquardt
routine on the same function, it converges smoothly to
a reasonable point.

Now, since with one output I verified that lmder
performs much better than the NR routine (by a factor
of 10 or so), what am I doing wrong here?

If there were some error in the computation of the
Jacobian, I would expect the NR routine to fail as
well; in any case I have checked the Jacobian routine
carefully, so I rule out that possibility...
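
For the record, a check of this kind can be made independent of both
codes by a finite-difference comparison along the lines below (MINPACK
itself also provides chkder for this purpose); this continues the
sketch above:

    # Compare the analytic Jacobian against forward differences, column
    # by column; a large discrepancy would point at a coding error.
    def fd_jacobian(p, x, d, h=1e-7):
        r0 = residuals(p, x, d)
        J = np.empty((r0.size, p.size))
        for k in range(p.size):
            dp = np.zeros_like(p)
            dp[k] = h
            J[:, k] = (residuals(p + dp, x, d) - r0) / h
        return J

    p_test = np.array([1.3, -0.7, 0.2])
    err = np.max(np.abs(jac(p_test, x_data, d_data)
                        - fd_jacobian(p_test, x_data, d_data)))
    print("max |J_analytic - J_fd| =", err)   # roughly of order h if correct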

Thanks in advance for any help,

Michele
