

Residual Vectors.

Let $(\wtd\alpha,\wtd\beta)$ denote a computed eigenvalue, and let $\wtd x$ be its corresponding computed eigenvector, i.e.,

\begin{displaymath}
\wtd\beta A \wtd x\approx \wtd\alpha B\wtd x.
\end{displaymath}

Sometimes the corresponding computed left eigenvector $\wtd y$ is also available:

\begin{displaymath}
\wtd\beta \wtd y^{\ast} A\approx \wtd\alpha\wtd y^{\ast} B.
\end{displaymath}

The residual vectors corresponding to the computed eigenvalue $(\wtd\alpha,\wtd\beta)$ are defined as

\begin{displaymath}
r = \wtd\beta A \wtd x - \wtd\alpha B\wtd x \quad \mbox{and} \quad
s^{\ast} = \wtd\beta \wtd y^{\ast} A - \wtd\alpha\wtd y^{\ast} B,
\end{displaymath}

respectively. For simplicity, we normalize the approximate eigenvectors so that $\Vert\wtd x\Vert _2 = 1$ and $\Vert\wtd y\Vert _2 = 1$, and the approximate eigenvalue so that $\vert\wtd\alpha\vert^2+\vert\wtd\beta\vert^2=1$.
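As a concrete illustration, the following sketch computes the two residual vectors for a computed eigenpair under the normalization above. It assumes NumPy and SciPy; the helper name \texttt{residuals} and the use of \texttt{scipy.linalg.eig} to supply approximate right eigenpairs are illustrative choices, not part of the text.

\begin{verbatim}
import numpy as np
from scipy.linalg import eig   # any GEP solver could supply the approximations

def residuals(A, B, alpha, beta, x, y=None):
    # Normalize as in the text: ||x||_2 = ||y||_2 = 1, |alpha|^2 + |beta|^2 = 1.
    scale = np.hypot(abs(alpha), abs(beta))
    alpha, beta = alpha / scale, beta / scale
    x = x / np.linalg.norm(x)
    r = beta * (A @ x) - alpha * (B @ x)                # right residual r
    if y is None:
        return r
    y = y / np.linalg.norm(y)
    s = beta * (y.conj() @ A) - alpha * (y.conj() @ B)  # left residual s^*
    return r, s

# Usage with approximate right eigenpairs of beta*A*x = alpha*B*x:
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))
w, X = eig(A, B)                 # w[i] = alpha_i / beta_i for finite eigenvalues
r = residuals(A, B, w[0], 1.0, X[:, 0])
print(np.linalg.norm(r))         # small for a backward-stable computation
\end{verbatim}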


