In the direct iteration variant,
we multiply with $A$ and solve systems with $M$;
this corresponds to $C = M^{-1}A$. It computes a
basis $V_j = [v_1, v_2, \dots, v_j]$, where the matrix pencil is represented by a
real symmetric tridiagonal matrix,
$$
T_j = \begin{pmatrix}
\alpha_1 & \beta_1 & & \\
\beta_1 & \alpha_2 & \ddots & \\
 & \ddots & \ddots & \beta_{j-1} \\
 & & \beta_{j-1} & \alpha_j
\end{pmatrix}, \qquad (5.8)
$$
satisfying the basic recursion,
$$
M^{-1} A V_j = V_j T_j + \beta_j v_{j+1} e_j^{\ast}. \qquad (5.9)
$$
Compare this to the standard Hermitian case (4.10).
Now the basis is $M$-orthogonal,
$$
V_j^{\ast} M V_j = I_j, \qquad (5.10)
$$
and the matrix $T_j$ is congruent to a section of $M^{-1}A$,
$$
T_j = V_j^{\ast} A V_j = V_j^{\ast} M (M^{-1}A) V_j. \qquad (5.11)
$$
We simplify the description of the $M$-orthogonalization
by introducing an auxiliary basis,
$$
W_j = M V_j, \qquad (5.12)
$$
which is $M^{-1}$-orthogonal,
$$
W_j^{\ast} M^{-1} W_j = I_j, \qquad (5.13)
$$
and for which $W_j^{\ast} V_j = I_j$.
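As a small numerical check (not part of the original text; NumPy, with an ad hoc Gram-Schmidt in the $M$-inner product used only to build a test basis), both relations for $W_j = M V_j$ follow directly once $V_j^{\ast} M V_j = I_j$ holds:

```python
import numpy as np

rng = np.random.default_rng(1)
n, j = 6, 3
C = rng.standard_normal((n, n))
M = C @ C.T + n * np.eye(n)          # a symmetric positive definite M

# build an M-orthonormal basis V via Gram-Schmidt in the M-inner product
V = rng.standard_normal((n, j))
for k in range(j):
    for i in range(k):
        V[:, k] -= V[:, i] * (V[:, i] @ (M @ V[:, k]))
    V[:, k] /= np.sqrt(V[:, k] @ (M @ V[:, k]))

W = M @ V                            # auxiliary basis W = M V
assert np.allclose(V.T @ M @ V, np.eye(j))                 # V is M-orthogonal
assert np.allclose(W.T @ np.linalg.solve(M, W), np.eye(j)) # W is M^{-1}-orthogonal
assert np.allclose(W.T @ V, np.eye(j))                     # W^* V = I
```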
Precisely as in the standard case,
compute an eigensolution of $T_j$,
$T_j s_i^{(j)} = s_i^{(j)} \theta_i^{(j)}$,
and get a
Ritz value $\theta_i^{(j)}$
and a Ritz vector,
$$
x_i^{(j)} = V_j s_i^{(j)}, \qquad (5.14)
$$
approximating an eigenpair of the pencil (5.1).
Its residual,
$r_i^{(j)} = A x_i^{(j)} - \theta_i^{(j)} M x_i^{(j)}$,
is $M^{-1}$-orthogonal to the Krylov space spanned by $W_j$.
We may estimate the norm of the residual as we did in
the standard Hermitian case, (4.13), but now
this is better done using the $M^{-1}$-norm, getting
$$
\|r_i^{(j)}\|_{M^{-1}} = \beta_j |s_{j,i}^{(j)}|, \qquad (5.15)
$$
using the fact that
$r_i^{(j)} = \beta_j M v_{j+1} s_{j,i}^{(j)}$.
It is natural to use the $M^{-1}$-norm when measuring convergence;
see [353, Chap. 15].
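In practice the $M^{-1}$-norm is computed with a solve against the positive definite $M$, not by forming $M^{-1}$. A minimal sketch (the helper name `m_inv_norm` is illustrative, not from the book):

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def m_inv_norm(r, M):
    """||r||_{M^{-1}} = sqrt(r^* M^{-1} r), computed via a Cholesky solve
    with the positive definite M rather than by forming M^{-1}."""
    z = cho_solve(cho_factor(M), r)   # z = M^{-1} r
    return np.sqrt(r @ z)
```

For $M = I$ this reduces to the Euclidean norm.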
As in the standard case we need to
monitor the subdiagonal elements $\beta_j$ of $T_j$,
and the last elements $s_{j,i}^{(j)}$ of its eigenvectors. As soon as their
product $\beta_j |s_{j,i}^{(j)}|$ is small, we may flag an eigenvalue as converged,
without actually performing the matrix-vector
multiplication (5.14). We save this
operation until the step when the estimate (5.15)
indicates convergence.
We get the following algorithm.
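The algorithm listing itself does not survive in this copy. As a rough sketch of the direct iteration variant described above (Python with NumPy/SciPy; the function name, step numbering in comments, and details such as the breakdown tolerance are illustrative, not the book's verbatim algorithm):

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve, eigh_tridiagonal

def lanczos_ghep(A, M, v0, j_max):
    """Sketch of direct iteration Lanczos for the pencil (A, M):
    multiply with A, solve systems with M, keep V M-orthonormal
    (V^T M V = I).  Returns alpha, beta, and the basis V."""
    Mfac = cho_factor(M)                      # M is positive definite
    n = v0.shape[0]
    V = np.zeros((n, j_max))
    alpha = np.zeros(j_max)
    beta = np.zeros(j_max)
    v = v0 / np.sqrt(v0 @ (M @ v0))           # normalize start vector in the M-norm
    v_prev, beta_prev = np.zeros(n), 0.0
    for j in range(j_max):
        V[:, j] = v
        r = A @ v                             # step (4): multiply with the large matrix A
        alpha[j] = v @ r                      # alpha_j = v_j^* A v_j, a Rayleigh quotient
        p = cho_solve(Mfac, r)                # step (9): solve M p = A v_j
        p -= alpha[j] * v + beta_prev * v_prev        # three-term recursion (5.9)
        p -= V[:, :j + 1] @ (V[:, :j + 1].T @ (M @ p))  # step (8): full M-reorthogonalization
        beta[j] = np.sqrt(p @ (M @ p))        # beta_j = ||p||_M
        if beta[j] < 1e-10:                   # breakdown: invariant subspace found
            return alpha[:j + 1], beta[:j + 1], V[:, :j + 1]
        v_prev, v, beta_prev = v, p / beta[j], beta[j]
    return alpha, beta, V

# step (11): Ritz values of the pencil from the symmetric tridiagonal T_j, e.g.
#   theta = eigh_tridiagonal(alpha, beta[:-1], eigvals_only=True)
# with residual estimates beta_j * |s_{j,i}|, cf. (5.15)
```

Run to completion on a small dense pencil, the Ritz values agree with the eigenvalues of $(A, M)$; in realistic use $A$ and $M$ are large and sparse and only a few steps are taken.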
Let us comment on this algorithm step by step:
- (1)
- If a good guess for the wanted eigenvector is available, use it as the
starting vector $v_1$. In other cases choose a random direction, for instance,
one consisting of normally distributed random numbers. Notice that
$\alpha_1$ is the Rayleigh quotient of the
starting vector and that $\beta_1$ measures the $M^{-1}$-norm of its
residual (5.15).
- (4)
- This is where the large matrix $A$ comes in.
Any routine
that performs a matrix-vector multiplication with $A$ can be used.
- (8)
- It is sufficient to reorthogonalize just one of the bases $V_j$ or $W_j$,
not both of them.
The choices are the same as in the standard case:
- Full:
Now we want to make the basis vectors $M$-orthogonal,
computing
$$
r = r - V_j (V_j^{\ast} M r),
$$
and repeating until the vector $r$ is orthogonal to the basis $V_j$.
We have to apply one matrix-vector multiplication by $M$
for each reorthogonalization, and we have to use the classical
variant of the Gram-Schmidt process.
We may avoid these extra multiplications with $M$
if we save both bases $V_j$ and $W_j$ and subtract multiples of the
columns of $V_j$,
$$
r = r - V_j (W_j^{\ast} r),
$$
until orthogonality is obtained, almost always just once.
Now we may use a modified Gram-Schmidt orthogonalization.
- Selective: Reorthogonalize only when necessary;
one has to monitor the orthogonality as described in
§4.4.4
for the standard case. Note that now the symmetric matrix
$M$ is involved and some of the vectors $v_k$
have to be replaced by the corresponding vectors $w_k = M v_k$.
- Local: Used for huge matrices, when it is difficult to store
the whole basis $V_j$. Advisable only when one or two extreme eigenvalues
are sought. We make sure that $r$ is orthogonal to $v_j$ and $v_{j-1}$
by subtracting
$$
r = r - v_j (w_j^{\ast} r) - v_{j-1} (w_{j-1}^{\ast} r)
$$
once
in this step.
- (9)
- Here we need to solve a system with the positive
definite matrix $M$. This was not needed in the standard case
(4.1).
- (11)
- For each step $j$, or at appropriate intervals,
compute the eigenvalues $\theta_i^{(j)}$
and eigenvectors $s_i^{(j)}$
of the symmetric tridiagonal matrix $T_j$
(5.8). Same procedure as for the standard case.
- (12)
- The algorithm is stopped when a sufficiently
large basis has been found, so that the eigenvalues of
the tridiagonal matrix $T_j$ (5.8)
give good approximations to all
the eigenvalues of the pencil (5.1) sought.
The estimate (5.15) for the residual
may be too optimistic if the basis $V_j$
is not fully $M$-orthogonal.
Then the Ritz vector (5.14) may have its $M$-norm
smaller than $1$, and we have to replace the estimate by
$$
\|r_i^{(j)}\|_{M^{-1}} = \beta_j |s_{j,i}^{(j)}| \,/\, \|V_j s_i^{(j)}\|_M.
$$
- (14)
- The eigenvectors of the original matrix pencil (5.1) are computed only
when the test in step (12) has indicated that they
have converged. Then the basis $V_j$ is used in a matrix-vector
multiplication to get the
eigenvector (5.14), $x_i^{(j)} = V_j s_i^{(j)}$,
for each $i$ that is flagged as converged.
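The trade-off described under the full reorthogonalization option in step (8), one extra multiplication with $M$ per classical Gram-Schmidt pass versus storing both bases and using $W_j$, can be sketched as follows (function names are illustrative, not from the book):

```python
import numpy as np

def m_reorth_classical(p, V, M):
    """Classical Gram-Schmidt M-orthogonalization of p against the columns
    of V; costs one extra multiplication with M per pass (to form M p)."""
    return p - V @ (V.T @ (M @ p))

def m_reorth_saved(p, V, W):
    """Modified Gram-Schmidt using the saved auxiliary basis W = M V;
    no extra multiplication with M is needed."""
    p = p.copy()
    for k in range(V.shape[1]):
        p -= V[:, k] * (W[:, k] @ p)
    return p
```

Both leave the result $M$-orthogonal to the columns of $V$; the second trades memory (two bases) for arithmetic (no products with $M$).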
Susan Blackford
2000-11-20