Transfer Residual Error to Backward Error.
It can be proved that there are Hermitian matrices $E$, e.g.,
\begin{displaymath}
E = -r\widetilde x^* - \widetilde x r^*
+ \left(\widetilde x^* A\widetilde x - \widetilde\lambda\,\widetilde x^* B\widetilde x\right)
\widetilde x\widetilde x^*,
\end{displaymath}
(93)
such that $\widetilde\lambda$ and $\widetilde x$ are an exact eigenvalue and its corresponding eigenvector of the pencil $\{A+E,\,B\}$.
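The construction in (93) can be checked numerically. The sketch below assumes $\widetilde x$ is normalized so that $\Vert\widetilde x\Vert_2 = 1$ and uses randomly generated test matrices (both are illustrative choices, not from the text); it builds $E$ for an arbitrary approximate eigenpair and confirms that $E$ is Hermitian and that $(A+E)\widetilde x = \widetilde\lambda B\widetilde x$ up to roundoff:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6

# Hermitian A and Hermitian positive definite B (the setting of the pencil {A, B})
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2
N = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = N @ N.conj().T + n * np.eye(n)

# Any approximate eigenpair (lam, x) with ||x||_2 = 1 works for the identity
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
x /= np.linalg.norm(x)
lam = 0.7  # some real approximate eigenvalue

r = A @ x - lam * (B @ x)                               # residual vector
s = (x.conj() @ A @ x - lam * (x.conj() @ B @ x)).real  # the scalar in (93)
E = -np.outer(r, x.conj()) - np.outer(x, r.conj()) + s * np.outer(x, x.conj())

# E is Hermitian, and (A + E) x = lam * B x up to roundoff
print(np.linalg.norm(E - E.conj().T))
print(np.linalg.norm((A + E) @ x - lam * (B @ x)))
```

The key step is that $E\widetilde x = -r$ whenever $\Vert\widetilde x\Vert_2 = 1$, since $r^*\widetilde x$ equals the scalar multiplying the last term.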
We are interested in such matrices $E$ with the smallest possible norms. It turns out that the best possible $E_2$ for the spectral norm $\Vert\cdot\Vert_2$ and the best possible $E_F$ for the Frobenius norm $\Vert\cdot\Vert_F$ satisfy
\begin{displaymath}
\Vert E_2\Vert_2 = \Vert r\Vert_2, \quad
\Vert E_F\Vert_F = \sqrt{2\Vert r\Vert^2_2
- \left(\widetilde x^* A\widetilde x - \widetilde\lambda\,\widetilde x^* B\widetilde x\right)^2}.
\end{displaymath}
(94)
See [256,431,473].
In fact, $E_F$ is given explicitly by (93).
So if $\Vert r\Vert_2$ is small, the computed
$\widetilde\lambda$ and $\widetilde x$
are an exact eigenpair of a nearby pencil.
Error analysis of this kind is called backward error analysis, and
the matrices $E$ are backward errors.
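The Frobenius-norm identity in (94) can be verified numerically for the $E$ of (93). The sketch below (again with random test matrices and the assumption $\Vert\widetilde x\Vert_2 = 1$) checks that identity, and also that the spectral norm of this particular $E$ is no smaller than the optimal value $\Vert r\Vert_2$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# Hermitian A and Hermitian positive definite B
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2
N = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = N @ N.conj().T + n * np.eye(n)

# An arbitrary approximate eigenpair with ||x||_2 = 1
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
x /= np.linalg.norm(x)
lam = 1.3

r = A @ x - lam * (B @ x)
s = (x.conj() @ A @ x - lam * (x.conj() @ B @ x)).real
E = -np.outer(r, x.conj()) - np.outer(x, r.conj()) + s * np.outer(x, x.conj())

# Frobenius identity from (94): ||E||_F = sqrt(2 ||r||_2^2 - s^2)
lhs = np.linalg.norm(E, 'fro')
rhs = np.sqrt(2 * np.linalg.norm(r)**2 - s**2)
print(abs(lhs - rhs))

# The spectral norm of this E can only exceed the optimal value ||r||_2
print(np.linalg.norm(E, 2) >= np.linalg.norm(r) - 1e-12)
```

Note that (93) gives the Frobenius-optimal backward error, so its spectral norm generally exceeds $\Vert r\Vert_2$; only the optimal $E_2$ attains $\Vert E_2\Vert_2 = \Vert r\Vert_2$.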
We say an algorithm
that delivers an approximate eigenpair $(\widetilde\lambda,\widetilde x)$
is $\eta$-backward stable for the pair with
respect to the norm $\Vert\cdot\Vert$
if $(\widetilde\lambda,\widetilde x)$ is an exact eigenpair for $\{A+E,\,B\}$
with $\Vert E\Vert\le\eta$.
With these definitions in mind,
statements can be made about the backward stability of the algorithm that
computes the eigenpair $(\widetilde\lambda,\widetilde x)$.
By convention, an algorithm is called backward stable
if $\eta = O(\epsilon)$, where $\epsilon$
is the machine precision.
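As an illustration, one can measure the backward error delivered by a standard dense solver. The sketch below uses SciPy's `eigh` for the pencil and compares $\eta = \Vert r\Vert_2$ (the smallest spectral-norm backward error, by (94)) against $\epsilon\,(\Vert A\Vert_2 + \vert\widetilde\lambda\vert\,\Vert B\Vert_2)$; that particular scaling is an assumption made here for the demonstration, not taken from the text:

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(2)
n = 50

# Symmetric A and symmetric positive definite B
M = rng.standard_normal((n, n))
A = (M + M.T) / 2
N = rng.standard_normal((n, n))
B = N @ N.T + n * np.eye(n)

w, V = eigh(A, B)          # generalized eigenproblem A v = lam B v
lam, x = w[0], V[:, 0]
x = x / np.linalg.norm(x)  # the backward-error formulas assume ||x||_2 = 1

r = A @ x - lam * (B @ x)
eta = np.linalg.norm(r)    # ||E_2||_2 = ||r||_2, smallest spectral-norm backward error

eps = np.finfo(float).eps
scale = np.linalg.norm(A, 2) + abs(lam) * np.linalg.norm(B, 2)
print(eta, eps * scale)    # eta is a modest multiple of eps * scale
```

For a well-conditioned $B$, $\eta$ comes out as a small multiple of $\epsilon$ times the problem scale, which is what backward stability in the above sense predicts.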
Susan Blackford
2000-11-20