
Selective Reorthogonalization.

As the size $j$ of the basis grows, full reorthogonalization requires substantial arithmetic work, so we look for a more economical scheme. It can be proved that most of the desired properties of the Lanczos algorithm are preserved as long as the basis is semiorthogonal, i.e., orthogonal to half the machine precision,
\begin{displaymath}
W_{j}=V^{\ast}_jV_j=I_{j}+E\,, \quad \Vert E\Vert < \sqrt{\eps}\,.
\end{displaymath} (32)

The reader is referred to the investigation by Simon [402] for a detailed discussion of these matters. It is proved there that if the tridiagonal matrix $T_j$ is computed by a Lanczos algorithm with a semiorthogonal basis $V_j$ (4.18), then $T_j$ is a fully accurate projection of $A$ onto the subspace spanned by the computed basis $V_j$,

\begin{displaymath}T_{j}=N_j^{\ast} AN_j+G\,, \quad \Vert G\Vert = O(\eps\Vert A\Vert)\,.\end{displaymath}

Here $N_j$ is an orthonormal basis of the subspace spanned by $V_j$. Even though $N_j$ is not known explicitly, we know that the eigenvalues of $T_j$ are accurate approximations to those of $A$.
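
As an illustration, the semiorthogonality condition (32) is straightforward to test numerically. The following is a minimal sketch in Python with NumPy; the function name and the assumption that the basis is stored column by column in an array V are ours, not part of the algorithm:

import numpy as np

def is_semiorthogonal(V):
    # Test condition (32): || V^* V - I || < sqrt(eps), where the
    # columns of the n-by-j array V are the computed Lanczos vectors.
    j = V.shape[1]
    E = V.conj().T @ V - np.eye(j)
    eps = np.finfo(V.dtype).eps
    return np.linalg.norm(E, 2) < np.sqrt(eps)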

One way of making sure that the computed basis is semiorthogonal is to use a recurrence derived by Paige to monitor how losses of orthogonality in earlier steps propagate to later steps, and to reorthogonalize when this recurrence indicates that the threshold $\sqrt{\eps}$ is exceeded. The recurrence relates successive elements $\omega_{j,k}$ of the product matrix $W_{j}$ (4.18) and is derived from the basic recursion (4.10) in the following way.

First we note that the computed vector $v_{j+1}$ satisfies (4.10):

\begin{displaymath}\beta_jv_{j+1}=Av_j-\alpha_j v_j - \beta_{j-1}v_{j-1}+f_j,\end{displaymath}

where $f_j$ is a vector of rounding errors. To get the elements $\omega_{j,k}$ of the product matrix $W_{j}$ (4.18), we premultiply this by $v_k^{\ast}$ and get

\begin{displaymath}\beta_j\omega_{j+1,k}=v_k^{\ast}Av_j-\alpha_j \omega_{j,k} - \beta_{j-1}\omega_{j-1,k}+v_k^{\ast}f_j\,.\end{displaymath}

The same relation with the indices $j$ and $k$ exchanged gives

\begin{displaymath}\beta_k\omega_{j,k+1}=v_j^{\ast}Av_k-\alpha_k \omega_{j,k} - \beta_{k-1}\omega_{j,k-1}+v_j^{\ast}f_k\,,\end{displaymath}

and subtracting the two gives
\begin{displaymath}
\beta_j\omega_{j+1,k}=\beta_k\omega_{j,k+1}+(\alpha_k-\alpha_j)\omega_{j,k}+
\beta_{k-1}\omega_{j,k-1}-\beta_{j-1}\omega_{j-1,k}-v_j^{\ast}f_k+v_k^{\ast}f_j\,.
\end{displaymath} (33)

Note that the terms involving $A$ cancel out since $v_k^{\ast}Av_j=v_j^{\ast}Av_k$ for a Hermitian matrix $A$.

We use this recursion to fill the lower triangular part of the symmetric matrix $W$ one row at a time. The diagonal elements are $\omega_{k,k}=1$, and the elements immediately above and below the diagonal are $\omega_{k-1,k}=\omega_{k,k-1}=O(\eps)$, at the rounding error level, due to symmetry and the explicit local orthogonalization. This gives starting values for the recursion (4.19). We estimate the absolute value of the sum of the last two terms in (4.19) by $2\eps\Vert A\Vert$, using an appropriate guess for the norm of $A$, and feed these values, which are also at the rounding error level, into the recursion, with the sign chosen so that the absolute value of the new element $\vert\omega_{j+1,k}\vert$ is maximized,

\begin{displaymath}
\tilde{\omega}=\beta_k\omega_{j,k+1}+(\alpha_k-\alpha_j)\omega_{j,k}+
\beta_{k-1}\omega_{j,k-1}-\beta_{j-1}\omega_{j-1,k},\end{displaymath}


\begin{displaymath}\omega_{j+1,k}=(\tilde{\omega}+\mathrm{sign}(\tilde{\omega})2\eps\Vert A\Vert)/\beta_j.\end{displaymath}
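
One step of this estimate update might be coded as in the following sketch (Python with NumPy; the 1-based arrays padded with a zero at index 0, the function name, and the externally supplied estimate eps_normA of $\eps\Vert A\Vert$ are our conventions, not prescribed by the text):

def omega_step(w_j, w_jm1, alpha, beta, j, eps_normA):
    # Estimate row j+1 of W from rows j and j-1 via (33)/(4.19).
    # Arrays are 1-based with a zero pad at index 0:
    #   w_j[k]   ~ omega_{j,k},   k = 1..j   (w_j[j] = 1)
    #   w_jm1[k] ~ omega_{j-1,k}, k = 1..j-1
    #   alpha[1..j], beta[0..j]: Lanczos coefficients (beta[0] = 0)
    #   eps_normA: estimate of eps*||A||
    w_new = np.zeros(j + 2)
    w_new[j + 1] = 1.0            # omega_{j+1,j+1} = 1
    w_new[j] = eps_normA          # kept small by local orthogonalization
    for k in range(1, j):
        t = (beta[k] * w_j[k + 1] + (alpha[k] - alpha[j]) * w_j[k]
             + beta[k - 1] * w_j[k - 1] - beta[j - 1] * w_jm1[k])
        # add 2*eps*||A|| with the sign that maximizes |omega_{j+1,k}|
        w_new[k] = (t + np.sign(t) * 2.0 * eps_normA) / beta[j]
    return w_new

After each Lanczos step the caller discards w_jm1, keeps w_j, and takes the returned row as the new w_j, so only two rows of $W$ are ever stored.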

As soon as one element $\vert\omega_{j+1,k}\vert$ exceeds the threshold $\sqrt{\eps}$, we orthogonalize the two current vectors $v_{j}$ and $v_{j+1}$ against all the previous vectors $v_k$, $k=1,\dots,j$, and reset the corresponding estimates $\omega_{j,k}$ and $\omega_{j+1,k}$ to the rounding error level, as sketched below. We need to orthogonalize two vectors because of the three-term recurrence in the Lanczos method and, as a consequence, in the $\omega$ recursion (4.19). It is not necessary to store more than the two latest rows, $j-1$ and $j$, of the matrix $W$ of $\omega$ estimates.
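
A matching sketch of the reorthogonalization step, again in Python with NumPy and with illustrative names, could look as follows; it uses classical Gram-Schmidt against the stored basis, which is rich in Level 2 BLAS matrix-vector products:

def reorth_pair(V, vj, vnext, w_j, w_new, eps_normA):
    # Orthogonalize the current vectors v_j and v_{j+1} against all
    # previous vectors when some |omega_{j+1,k}| exceeds sqrt(eps).
    # V is n-by-(j-1), holding v_1..v_{j-1} as columns; w_j and
    # w_new are the 1-based omega rows from omega_step above.
    j = V.shape[1] + 1
    vj -= V @ (V.conj().T @ vj)          # matrix-vector products:
    vnext -= V @ (V.conj().T @ vnext)    # Level 2 BLAS (gemv)
    vnext -= vj * (vj.conj() @ vnext)    # v_{j+1} also against v_j
    # a full implementation would renormalize vnext and update beta_j
    w_j[1:j] = eps_normA                 # estimates fall back to the
    w_new[1:j + 1] = eps_normA           # rounding error level
    return vj, vnext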

The monitoring scheme above is one of the techniques described in [402]. Another method applies explicit orthogonalization only to those vectors $v_k$ for which $\omega_{j+1,k}$ exceeds the threshold; this is what Simon [402] called partial reorthogonalization. The variant described here, periodic reorthogonalization, first described by Grcar [204], is simpler to implement and can use Level 2 BLAS operations.

