

Transfer Residual Error to Backward Error.

There are Hermitian matrices $E$ such that $\wtd\lambda$ and $\wtd x$ are an exact eigenvalue and its corresponding eigenvector of $A+E$, i.e.,

\begin{displaymath}
(A+E)\wtd x=\wtd\lambda\wtd x.
\end{displaymath}

One such $E$ is
\begin{displaymath}
E = -r\wtd x^*-\wtd x r^* +\left(\wtd x^*A\wtd x -\wtd\lambda\right)
\wtd x\wtd x^*.
\end{displaymath} (61)

We are interested in such matrices $E$ with the smallest possible norms. It turns out that the best possible $E=E_2$ for the spectral norm $\Vert\cdot\Vert _2$ and the best possible $E=E_{F}$ for the Frobenius norm $\Vert\cdot\Vert _{F}$ satisfy
\begin{displaymath}
\Vert E_2\Vert _2=\Vert r\Vert _2 \quad \mbox{and} \quad
\Vert E_F\Vert _F=\sqrt{2\Vert r\Vert^2_2 - \left(\wtd x^* A\wtd x
-\wtd\lambda\right)^2};
\end{displaymath} (62)

see, e.g., [256,431]. In fact, $E_{F}$ is given explicitly by (61). So if $\Vert r\Vert _2$ is small, the computed $\wtd\lambda$ and $\wtd x$ form an exact eigenpair of a nearby matrix. Error analysis of this kind is called backward error analysis, and the matrices $E$ are called backward errors.
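The two identities above can be checked numerically. The sketch below (a hypothetical example using NumPy; the matrix, the perturbations, and the variable names are illustrative, and $\wtd x$ is assumed normalized so that $\Vert\wtd x\Vert_2=1$) builds $E$ from (61) for a deliberately perturbed eigenpair, confirms that $(A+E)\wtd x=\wtd\lambda\wtd x$ holds to round-off, and confirms that $\Vert E\Vert_F$ matches the square-root expression in (62).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: a random real symmetric (hence Hermitian) matrix
# and a slightly perturbed eigenpair standing in for computed output.
n = 6
B = rng.standard_normal((n, n))
A = (B + B.T) / 2

w, V = np.linalg.eigh(A)
lam = w[0] + 1e-4                      # "computed" eigenvalue, perturbed
x = V[:, 0] + 1e-4 * rng.standard_normal(n)
x /= np.linalg.norm(x)                 # normalize: ||x||_2 = 1

r = A @ x - lam * x                    # residual vector
delta = x @ A @ x - lam                # the quantity  x^* A x - lam

# Backward error E from (61):  E = -r x^* - x r^* + (x^* A x - lam) x x^*
E = -np.outer(r, x) - np.outer(x, r) + delta * np.outer(x, x)

# (A + E) x = lam x holds to round-off
assert np.allclose((A + E) @ x, lam * x)

# ||E||_F matches (62):  sqrt(2 ||r||_2^2 - (x^* A x - lam)^2)
assert np.isclose(np.linalg.norm(E, 'fro'),
                  np.sqrt(2 * np.linalg.norm(r)**2 - delta**2))
```

Note that the $E$ of (61) is the Frobenius-optimal $E_F$; its spectral norm is in general at least $\Vert r\Vert_2$, the value attained by the (different) optimal matrix $E_2$.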

We say that an algorithm delivering an approximate eigenpair $(\wtd\lambda,\wtd x)$ is $\tau $-backward stable for that pair with respect to the norm $\Vert\cdot\Vert$ if $(\wtd\lambda,\wtd x)$ is an exact eigenpair of $A+E$ for some $E$ with $\Vert E\Vert\le\tau$. With this definition in mind, statements can be made about the numerical stability of the algorithm that computes the eigenpair $(\wtd\lambda,\wtd x)$. By convention, an algorithm is called backward stable if $\tau = O(\epsilon_M \Vert A\Vert)$, where $\epsilon_M$ is the machine precision.
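Since by (62) the smallest spectral-norm backward error is exactly $\Vert r\Vert_2$, backward stability of a computed Hermitian eigenpair can be tested by comparing $\Vert r\Vert_2$ against $\epsilon_M\Vert A\Vert_2$. A minimal sketch (the matrix, the tolerance factor of 100, and the use of NumPy's eigensolver are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative Hermitian test matrix (real symmetric).
n = 8
B = rng.standard_normal((n, n))
A = (B + B.T) / 2

# Compute one eigenpair with a (backward stable) LAPACK-based solver.
w, V = np.linalg.eigh(A)
lam, x = w[0], V[:, 0]                 # ||x||_2 = 1 already

r = A @ x - lam * x                    # residual vector
eps_M = np.finfo(float).eps            # machine precision
tau = np.linalg.norm(r)                # smallest 2-norm backward error, by (62)

# Backward stable: tau is a modest multiple of eps_M * ||A||_2.
# The factor 100 is an illustrative slack, not a sharp bound.
assert tau <= 100 * eps_M * np.linalg.norm(A, 2)
```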


Susan Blackford 2000-11-20