Posted by Scott Betts on September 03, 1997 at 10:55:46:
In Reply to: analytical eigenvalue problem posted by Patrick van der Smagt on September 03, 1997 at 06:12:05:
: I keep hoping that someone knows some paper, book,
: or other reference which may help me further here.
: I have a matrix
: Q_ij = sum_p ( x_i x_j + x_i y_j + y_i x_j + y_i y_j )
: where each x and y also has an index p (not shown).
: Furthermore, for all k, E{x_k} = E{y_k} = 0.
: Therefore, Q consists of two correlation matrices
: which are each other's transposes, added to two
: covariance matrices. By summing three symmetric
: pos.def. matrices, Q itself is s.p.d.
: With the Courant-Fischer minimax theorem I can find
: a lower bound for the eigenvalues of Q, namely:
: they are (assuming that |x_k| = 1) of order 1
: (what happens, however, if |x_k| <= 1, or unbounded??).
: But..... how do I find an upper bound???
: Basically I want to know if Q is better conditioned
: through the addition of the xx, xy, and yx terms.
: Second, related problem: if I take a matrix
: Q'_ij = sum_p y_i y_j
: where E{y_k} != 0 (summed over p), this matrix appears
: to be very badly conditioned (in my case). Does the
: condition improve when I center the y's? Does the
: condition improve by adding the terms xy, yx, and xx?
Patrick,
I couldn't find any software that would attack
a problem as specific as that. If you are looking to
solve just that one problem though, you might want
to post the question to the Usenet newsgroup
sci.math.num-analysis. There may be someone there
who could point you in the right direction.
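In the meantime, if it's any use, here is a rough numerical
experiment you could adapt for your first question. It's only a
sketch in Python/NumPy, with random zero-mean Gaussian vectors
standing in for your actual x_p and y_p (the sizes P and n are
made up too), but it lets you look at the extreme eigenvalues and
the condition number of Q next to the yy-only matrix:

import numpy as np

# Sketch only: random zero-mean stand-ins for the actual x_p, y_p.
rng = np.random.default_rng(0)
P, n = 500, 10                      # number of samples p, dimension of Q
x = rng.standard_normal((P, n))     # E{x_k} = 0
y = rng.standard_normal((P, n))     # E{y_k} = 0

# Q_ij = sum_p (x_i x_j + x_i y_j + y_i x_j + y_i y_j)
Q   = x.T @ x + x.T @ y + y.T @ x + y.T @ y
# The yy term alone, for comparing the conditioning
Qyy = y.T @ y

for name, M in [("Q (all four terms)", Q), ("yy term only", Qyy)]:
    w = np.linalg.eigvalsh(M)       # both matrices are symmetric
    print(f"{name}: lambda_min = {w[0]:.4g}, "
          f"lambda_max = {w[-1]:.4g}, cond = {w[-1] / w[0]:.4g}")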
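And for your second question, the same kind of throwaway experiment
can compare the condition number of Q' before and after centering
the y's, and after adding the xx, xy, and yx terms (again, the
nonzero mean of 3.0 and the sizes are just placeholders, not your
data):

import numpy as np

# Sketch only: y given a nonzero mean by construction.
rng = np.random.default_rng(1)
P, n = 500, 10
y = rng.standard_normal((P, n)) + 3.0   # E{y_k} != 0
x = rng.standard_normal((P, n))         # E{x_k} = 0

Qprime  = y.T @ y                       # Q'_ij = sum_p y_i y_j
yc      = y - y.mean(axis=0)            # centered y's
Qcenter = yc.T @ yc
Qfull   = x.T @ x + x.T @ y + y.T @ x + y.T @ y

for name, M in [("Q' (uncentered y)", Qprime),
                ("Q' (centered y)", Qcenter),
                ("Q' + xx + xy + yx terms", Qfull)]:
    w = np.linalg.eigvalsh(M)
    print(f"{name}: cond = {w[-1] / w[0]:.4g}")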
Hope this helps!