Next: Overview of Available Algorithms.
Up: Singular Value Decomposition
Previous: Singular Value Decomposition
  Contents
  Index
Introduction
In this chapter we consider the singular value decomposition (SVD) of the $m$ by $n$ matrix $A$. We assume without loss of generality that $m \geq n$; if $m < n$, consider $A^*$.
As described in §2.4, this decomposition may be written

$A = U \Sigma V^*$   (108)
where $U = [u_1, \ldots, u_m]$ is an $m$ by $m$ unitary matrix, $V = [v_1, \ldots, v_n]$ is an $n$ by $n$ unitary matrix, and $\Sigma$ is an $m$ by $n$ diagonal matrix with entries $\Sigma_{ii} = \sigma_i$ for $i = 1, \ldots, n$. The $u_i$ are the left singular vectors, the $v_i$ are the right singular vectors, and the $\sigma_i$ are the singular values. The singular values are nonnegative and sorted so that $\sigma_1 \geq \sigma_2 \geq \cdots \geq \sigma_n \geq 0$. The number $r$ of nonzero singular values is the rank of $A$. If $A$ is real, $U$ and $V$ are real and orthogonal.
Equation (108) may also be written $AV = U\Sigma$, or $A v_i = \sigma_i u_i$ for $i = 1, \ldots, n$. It may also be written $A^* U = V \Sigma^*$, or $A^* u_i = \sigma_i v_i$ for $i = 1, \ldots, n$ and $A^* u_i = 0$ for $i = n+1, \ldots, m$.
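As a quick numerical check (an illustrative sketch of my own using NumPy, not part of the original text), the defining relations above can be verified directly:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 3                        # m >= n, as assumed in the text
A = rng.standard_normal((m, n))

# Full SVD: U is m-by-m, Vh = V^* is n-by-n, s holds sigma_1 >= ... >= sigma_n
U, s, Vh = np.linalg.svd(A, full_matrices=True)
Sigma = np.zeros((m, n))
Sigma[:n, :n] = np.diag(s)

assert np.allclose(A, U @ Sigma @ Vh)             # A = U Sigma V^*
assert np.all(s >= 0) and np.all(s[:-1] >= s[1:]) # nonnegative, sorted

V = Vh.T                                          # real case: V^* = V^T
for i in range(n):
    assert np.allclose(A @ V[:, i], s[i] * U[:, i])    # A v_i = sigma_i u_i
    assert np.allclose(A.T @ U[:, i], s[i] * V[:, i])  # A^* u_i = sigma_i v_i
for i in range(n, m):
    assert np.allclose(A.T @ U[:, i], 0)               # A^* u_i = 0 for i > n
```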
There are several ``smaller'' versions of the SVD that are
often computed.
Let $U_j = [u_1, \ldots, u_j]$ be an $m$ by $j$ matrix of the first $j$ left singular vectors, $V_j = [v_1, \ldots, v_j]$ be an $n$ by $j$ matrix of the first $j$ right singular vectors, and $\Sigma_j = \mathrm{diag}(\sigma_1, \ldots, \sigma_j)$ be a $j$ by $j$ matrix of the first $j$ singular values. Then we have the following SVD types.
- Thin SVD. $A = U_n \Sigma_n V_n^*$ is the thin (or economy-sized) SVD of $A$. The thin SVD is much smaller to store and faster to compute than the full SVD when $m \gg n$.
- Compact SVD. $A = U_r \Sigma_r V_r^*$ is a compact SVD of $A$. The compact SVD is much smaller to store and faster to compute than the thin SVD when $r \ll n$.
- Truncated SVD. $A_j = U_j \Sigma_j V_j^*$ is the rank-$j$ truncated (or partial) SVD of $A$, where $j < r$. Among all rank-$j$ matrices $B$, $A_j$ is the unique minimizer of $\|A - B\|_F$ and also minimizes (perhaps not uniquely) $\|A - B\|_2$. The truncated SVD is much smaller to store and cheaper to compute than the compact SVD when $j \ll r$, and is the most common form of the SVD computed in applications.
The thin SVD may also be written $A = \sum_{i=1}^{n} \sigma_i u_i v_i^*$. Each $(\sigma_i, u_i, v_i)$ is called a singular triplet. The compact and truncated SVDs may be written similarly (the sum going from $1$ to $r$, or $1$ to $j$, respectively).
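To make the distinctions concrete, here is a small sketch (my own NumPy illustration, not from the text) that computes a thin SVD, rebuilds $A$ from its singular triplets, and forms a rank-$k$ truncation; the error norms of the truncation follow the minimization property stated above (the Eckart–Young characterization):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, k = 8, 4, 2
A = rng.standard_normal((m, n))

# Thin SVD: U_n is m-by-n, so A = U_n Sigma_n V_n^* = sum_i sigma_i u_i v_i^*
Un, s, Vnh = np.linalg.svd(A, full_matrices=False)
assert np.allclose(A, Un @ np.diag(s) @ Vnh)
assert np.allclose(A, sum(s[i] * np.outer(Un[:, i], Vnh[i, :]) for i in range(n)))

# Truncated SVD: keep only the k largest singular triplets
Ak = Un[:, :k] @ np.diag(s[:k]) @ Vnh[:k, :]

# Best rank-k approximation: the 2-norm error is sigma_{k+1}, and the
# Frobenius error is sqrt(sigma_{k+1}^2 + ... + sigma_n^2)
assert np.isclose(np.linalg.norm(A - Ak, 2), s[k])
assert np.isclose(np.linalg.norm(A - Ak, 'fro'), np.sqrt(np.sum(s[k:]**2)))
```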
The SVD of $A$ is closely related to the eigendecompositions of three related Hermitian matrices, $A^* A$, $A A^*$, and $H(A) = \begin{bmatrix} 0 & A^* \\ A & 0 \end{bmatrix}$, which were described in §2.4.7. Most iterative algorithms for the SVD amount to applying an algorithm from Chapter 4 to one of these Hermitian matrices, so we review and expand that material here. The choice of $A^* A$, $A A^*$, or $H(A)$ depends on which singular values and vectors one is interested in computing. The cost of some algorithms, like shift-and-invert (see below), may differ significantly for $A^* A$, $A A^*$, and $H(A)$.
- Consider $A^* A$, which is an $n$ by $n$ Hermitian matrix. Then the eigendecomposition of $A^* A$ is $A^* A = V \Lambda V^*$, where $A = U \Sigma V^*$ is the SVD of $A$. Note that $\Lambda = \mathrm{diag}(\sigma_1^2, \ldots, \sigma_n^2)$. In other words, the eigenvectors of $A^* A$ are the right singular vectors of $A$, and the eigenvalues of $A^* A$ are the squares of the singular values of $A$. Thus, if we find the eigenvalues $\lambda_i$ and eigenvectors $v_i$ of $A^* A$, then we can recover the compact SVD of $A$ by taking the singular values $\sigma_i = \lambda_i^{1/2}$ for $i = 1, \ldots, r$, the right singular vectors as $v_i$ for $i = 1, \ldots, r$, and the first $r$ left singular vectors by $u_i = A v_i / \sigma_i$. The left singular vectors $u_{r+1}$ through $u_m$ are not determined directly, but may be taken as any orthonormal vectors also orthogonal to $u_1$ through $u_r$. When $\sigma_i$ is very small, i.e., $\sigma_i \ll \sigma_1$, then $u_i = A v_i / \sigma_i$ will not be determined very accurately.
- Consider $A A^*$, which is an $m$ by $m$ Hermitian matrix. Then the eigendecomposition of $A A^*$ is $A A^* = U \hat{\Lambda} U^*$. Note that $\hat{\Lambda} = \mathrm{diag}(\sigma_1^2, \ldots, \sigma_n^2, 0, \ldots, 0)$, with $m - n$ zeros after $\sigma_n^2$. In other words, the eigenvectors of $A A^*$ are the left singular vectors of $A$, and the eigenvalues of $A A^*$ are the squares of the singular values of $A$, in addition to 0 with additional multiplicity $m - n$. Thus, if we find the eigenvalues $\lambda_i$ and eigenvectors $u_i$ of $A A^*$, then we can recover the compact SVD of $A$ by taking the singular values $\sigma_i = \lambda_i^{1/2}$ for $i = 1, \ldots, r$, the left singular vectors as $u_i$ for $i = 1, \ldots, r$, and the first $r$ right singular vectors as $v_i = A^* u_i / \sigma_i$. The right singular vectors $v_{r+1}$ through $v_n$ are not determined directly, but may be taken as any orthonormal vectors also orthogonal to $v_1$ through $v_r$. When $\sigma_i$ is very small, i.e., $\sigma_i \ll \sigma_1$, then $v_i = A^* u_i / \sigma_i$ will not be determined very accurately.
- Consider $H(A) = \begin{bmatrix} 0 & A^* \\ A & 0 \end{bmatrix}$, which is an $m+n$ by $m+n$ Hermitian matrix. Write $U_n = [u_1, \ldots, u_n]$ and $\tilde{U} = [u_{n+1}, \ldots, u_m]$. Then the eigendecomposition of $H(A)$ is $H(A) = Q \Lambda Q^*$, where $\Lambda = \mathrm{diag}(\sigma_1, \ldots, \sigma_n, -\sigma_1, \ldots, -\sigma_n, 0, \ldots, 0)$ (with $m - n$ zeros) and

$Q = \frac{1}{\sqrt{2}} \begin{bmatrix} V & V & 0 \\ U_n & -U_n & \sqrt{2}\,\tilde{U} \end{bmatrix}$   (109)

In other words, $\pm\sigma_i$ is an eigenvalue with unit eigenvector $\frac{1}{\sqrt{2}} \begin{bmatrix} v_i \\ \pm u_i \end{bmatrix}$ for $i = 1$ to $n$, and $0$ is an eigenvalue with eigenvector $\begin{bmatrix} 0 \\ u_j \end{bmatrix}$ for $j = n+1, \ldots, m$. Thus we can extract the singular values, left singular vectors, and right singular vectors of $A$ directly from the eigenvalues and eigenvectors of $H(A)$. More precisely, we can directly extract the compact SVD from the eigenvalues and eigenvectors of $H(A)$.
But if $0$ is a singular value of $A$, then $0$ will be (at least) a double eigenvalue of $H(A)$, so $Q$ will not necessarily be of the form (109). (For example, if $A = 0$ so that $H(A) = 0$ too, then $Q$ can be an arbitrary orthogonal matrix not necessarily of the form (109).)
In this case, suppose the columns of $\begin{bmatrix} X_1 \\ X_2 \end{bmatrix}$ (where $X_1$ has $n$ rows and $X_2$ has $m$ rows) form an orthonormal basis spanning the eigenspace for the zero eigenvalue of $H(A)$; then the right singular vectors $v_{r+1}, \ldots, v_n$ can be taken as any orthonormal vectors spanning the column space of $X_1$, and the left singular vectors $u_{r+1}, \ldots, u_m$ can be taken as any orthonormal vectors spanning the column space of $X_2$.
We note that $H(A)^2 = \begin{bmatrix} A^* A & 0 \\ 0 & A A^* \end{bmatrix}$, so that two multiplications by $H(A)$ are equivalent to multiplication by both $A^* A$ and $A A^*$.
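The recovery recipes in the first two items can be checked numerically. The following is my own NumPy sketch (not from the text), for a real full-rank $A$ so that $r = n$ and no $\sigma_i$ is zero:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 6, 3
A = rng.standard_normal((m, n))
s_ref = np.linalg.svd(A, compute_uv=False)

# --- Route 1: A^*A (n-by-n). Eigenvalues are sigma_i^2, eigenvectors are v_i.
lam, V = np.linalg.eigh(A.T @ A)          # eigh returns ascending eigenvalues
lam, V = lam[::-1], V[:, ::-1]            # reorder: sigma_1^2 >= ... >= sigma_n^2
sigma = np.sqrt(np.maximum(lam, 0.0))
U1 = (A @ V) / sigma                      # u_i = A v_i / sigma_i, column by column
assert np.allclose(sigma, s_ref)
assert np.allclose(A, U1 @ np.diag(sigma) @ V.T)
assert np.allclose(U1.T @ U1, np.eye(n))  # recovered u_i are orthonormal

# --- Route 2: AA^* (m-by-m). Eigenvalues are sigma_i^2 plus m - n extra zeros.
mu, U = np.linalg.eigh(A @ A.T)
mu, U = mu[::-1], U[:, ::-1]
assert np.allclose(mu[n:], 0.0)           # the m - n zero eigenvalues
sigma2 = np.sqrt(np.maximum(mu[:n], 0.0))
V2 = (A.T @ U[:, :n]) / sigma2            # v_i = A^* u_i / sigma_i
assert np.allclose(sigma2, s_ref)
assert np.allclose(A, U[:, :n] @ np.diag(sigma2) @ V2.T)
```

The divisions by $\sigma_i$ are the numerically delicate steps flagged in the text: for tiny $\sigma_i$ the recovered vectors lose accuracy.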
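The structure of $H(A)$ in the third item, its eigenvalues $\pm\sigma_i$ and $0$, and the $H(A)^2$ identity can likewise be verified numerically (again my own illustrative NumPy sketch, not from the text):

```python
import numpy as np

rng = np.random.default_rng(4)
m, n = 4, 2
A = rng.standard_normal((m, n))
s = np.linalg.svd(A, compute_uv=False)

# H(A) = [[0, A^*], [A, 0]] is (m+n)-by-(m+n) Hermitian
H = np.block([[np.zeros((n, n)), A.T],
              [A, np.zeros((m, m))]])

# Eigenvalues of H(A): +sigma_i and -sigma_i (i = 1..n), plus 0 with multiplicity m - n
eig = np.sort(np.linalg.eigvalsh(H))
expected = np.sort(np.concatenate([s, -s, np.zeros(m - n)]))
assert np.allclose(eig, expected)

# H(A)^2 is block diagonal with blocks A^*A and AA^*
H2 = H @ H
assert np.allclose(H2[:n, :n], A.T @ A)
assert np.allclose(H2[n:, n:], A @ A.T)
assert np.allclose(H2[:n, n:], 0.0)
```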
The correspondence between the SVD of $A$ and the eigendecomposition of $H(A)$ shows that the discussion of perturbation theory for eigenvalues and eigenvectors of Hermitian matrices in §4.1 and §4.8 applies directly to the SVD, as we now describe. Perturbing $A$ to $A + E$ can change the singular values by at most $\|E\|_2$:

$|\sigma_i(A + E) - \sigma_i(A)| \leq \|E\|_2$.   (110)
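Bound (110) is easy to test numerically; the following sketch (my own, not from the text) perturbs a random matrix and checks that each singular value moves by at most $\|E\|_2$:

```python
import numpy as np

rng = np.random.default_rng(5)
m, n = 6, 4
A = rng.standard_normal((m, n))
E = 1e-3 * rng.standard_normal((m, n))

s_A = np.linalg.svd(A, compute_uv=False)
s_AE = np.linalg.svd(A + E, compute_uv=False)

# |sigma_i(A + E) - sigma_i(A)| <= ||E||_2 for every i (small slack for roundoff)
assert np.all(np.abs(s_AE - s_A) <= np.linalg.norm(E, 2) + 1e-12)
```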
Now suppose $(\hat{\sigma}, \hat{u}, \hat{v})$ is an approximation of the singular triplet $(\sigma_i, u_i, v_i)$, where $\hat{u}$ and $\hat{v}$ are unit vectors. The ``best'' $\hat{\sigma}$ corresponding to $\hat{u}$ and $\hat{v}$ is the Rayleigh quotient $\hat{\sigma} = \hat{u}^* A \hat{v}$, so we assume that $\hat{\sigma}$ has this value. Suppose $\hat{\sigma}$ is closer to $\sigma_i$ than any other $\sigma_j$, and let $\delta$ be the gap between $\hat{\sigma}$ and any other singular value: $\delta = \min_{j \neq i} |\hat{\sigma} - \sigma_j|$. Define the residual vector $r$ as

$r = \begin{bmatrix} A \hat{v} - \hat{\sigma} \hat{u} \\ A^* \hat{u} - \hat{\sigma} \hat{v} \end{bmatrix}$.

If $r = 0$ then $(\hat{\sigma}, \hat{u}, \hat{v})$ is an exact singular triplet, and our error bounds will be small when $\|r\|_2$ is small.
The difference between $\hat{\sigma}$ and $\sigma_i$ is bounded by

$|\hat{\sigma} - \sigma_i| \leq \dfrac{\|r\|_2^2}{2\delta}$.

Furthermore, the angles $\theta(\hat{u}, u_i)$ and $\theta(\hat{v}, v_i)$ between the approximate and true singular vectors are bounded by

$\max\{\sin\theta(\hat{u}, u_i), \sin\theta(\hat{v}, v_i)\} \leq \dfrac{\sqrt{2}\,\|r\|_2}{\delta}$.

(These follow by applying the Hermitian bounds of §4.8 to $H(A)$ with the unit vector $\frac{1}{\sqrt{2}} \begin{bmatrix} \hat{v} \\ \hat{u} \end{bmatrix}$, whose residual has norm $\|r\|_2/\sqrt{2}$.)
In other words, the larger the gap $\delta$ between the approximate singular value $\hat{\sigma}$ and all the other $\sigma_j$, and the smaller the residual $r$, the tighter one can bound the error in $\hat{\sigma}$, $\hat{u}$, and $\hat{v}$. $r$ is easy to compute given $A$, $\hat{u}$, and $\hat{v}$, but $\delta$ requires more information about the singular values in order to approximate it. Typically one uses the computed singular values near $\hat{\sigma}$ to approximate $\delta$: $\delta \approx \min(\hat{\sigma} - \sigma_-, \sigma_+ - \hat{\sigma})$, where $\sigma_-$ is the next computed singular value smaller than $\hat{\sigma}$, and $\sigma_+$ is the next computed singular value larger than $\hat{\sigma}$.
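The following sketch (my own NumPy illustration, not from the text) builds an approximate triplet, forms the Rayleigh quotient $\hat{\sigma} = \hat{u}^* A \hat{v}$, the residual $r$, and the gap $\delta$, and checks a residual-based bound on $|\hat{\sigma} - \sigma_i|$; the constant $2$ in the denominator is my reading of the Hermitian theory applied to $H(A)$, not a quotation of the text:

```python
import numpy as np

rng = np.random.default_rng(6)
m, n = 6, 3
A = rng.standard_normal((m, n))
U, s, Vh = np.linalg.svd(A, full_matrices=False)

i = 0
# Perturb the true triplet slightly to get an "approximate" one (unit vectors)
u = U[:, i] + 1e-4 * rng.standard_normal(m); u /= np.linalg.norm(u)
v = Vh[i, :] + 1e-4 * rng.standard_normal(n); v /= np.linalg.norm(v)

sigma_hat = u @ A @ v                     # Rayleigh quotient: the "best" sigma_hat
r = np.concatenate([A @ v - sigma_hat * u, A.T @ u - sigma_hat * v])
delta = np.min(np.abs(sigma_hat - np.delete(s, i)))   # gap to the other sigma_j

# Error in the singular value is bounded by ||r||^2 / (2 delta)
assert abs(sigma_hat - s[i]) <= np.linalg.norm(r)**2 / (2 * delta) + 1e-12
```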
Susan Blackford
2000-11-20