An unfortunate aspect of the Lanczos process is that there is no way to determine in advance how many steps will be needed to determine the eigenvalues of interest within a specified accuracy. This number is determined by the distribution of the eigenvalues and the choice of starting vector $v_1$. In many cases, convergence will not occur until the number of steps $j$ gets very large. Maintaining the numerical orthogonality of the Lanczos basis $V_j$ quickly becomes intractable for large $j$, at a cost of $O(nj^2)$ floating point operations.
There is another way to maintain orthogonality: one may limit the size of the basis set and use a restarting scheme. Restarting means replacing the starting vector $v_1$ with an ``improved'' starting vector $v_1^{+}$ and computing a new Lanczos factorization with the new vector. The structure of the residual vector $r_m$ serves as a guide: in theory, $r_m$ will vanish if $v_1$ is a linear combination of $m$ eigenvectors of $A$. Our goal is to make this happen iteratively. In the symmetric case, these eigenvectors are orthogonal and hence form an orthonormal basis for an invariant subspace.
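The vanishing of the residual can be checked numerically. The following NumPy sketch (the setup and variable names are ours, purely for illustration) builds a starting vector as a combination of $m$ eigenvectors of a random symmetric matrix and confirms that the Krylov matrix with $m+1$ columns is rank deficient, which forces the $m$-step Lanczos residual to vanish:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 4
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                      # symmetric test matrix
w, X = np.linalg.eigh(A)

# starting vector chosen as a combination of m eigenvectors of A
v1 = X[:, -m:] @ rng.standard_normal(m)

# Krylov matrix with m+1 normalized columns; its numerical rank is only m,
# so the m-step Lanczos residual built from this v1 must vanish
cols = [v1]
for _ in range(m):
    cols.append(A @ cols[-1])
K = np.column_stack([c / np.linalg.norm(c) for c in cols])
s = np.linalg.svd(K, compute_uv=False)
print(s[m] / s[0])                     # at roundoff level
```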
The need for restarting was recognized early on by Karush soon after the appearance of the original algorithm of Lanczos. Subsequently, there were developments by Paige, Cullum and Donath, and Golub and Underwood. All of these schemes are explicit in the sense that a new starting vector is produced by some process and an entirely new Lanczos factorization is constructed.
In this section we will describe implicit restarting. It is a technique that combines the implicitly shifted QR scheme with a $k$-step Lanczos factorization to obtain a truncated form of the implicitly shifted QR iteration. The numerical difficulties and storage problems normally associated with the Lanczos process are avoided. The algorithm is capable of computing a few ($k$) eigenvalues with user-specified features, such as the algebraically largest or smallest eigenvalues or those of largest magnitude, using storage for only a moderate multiple of $k$ vectors. The computed eigenvectors form a basis for the desired $k$-dimensional eigenspace and are numerically orthogonal to working precision.
Implicit restarting provides a means to extract the interesting
information from large Krylov subspaces while avoiding the
storage and numerical difficulties associated with the standard approach.
It does this by continually compressing the interesting information
into a fixed-size $k$-dimensional subspace. This is accomplished
through the implicitly shifted QR mechanism. A Lanczos factorization
of length $m = k + p$,
\[
A V_m = V_m T_m + r_m e_m^{T} ,
\]
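As an illustration, the following NumPy sketch of an $m$-step Lanczos factorization with full reorthogonalization (an illustrative implementation, not the template of Algorithm 4.7) verifies the relation above numerically:

```python
import numpy as np

def lanczos(A, v, m):
    """m-step Lanczos factorization A V = V T + r e_m^T, full reorthogonalization."""
    n = A.shape[0]
    V = np.zeros((n, m))
    T = np.zeros((m, m))
    V[:, 0] = v / np.linalg.norm(v)
    w = A @ V[:, 0]
    T[0, 0] = V[:, 0] @ w
    r = w - T[0, 0] * V[:, 0]
    for j in range(1, m):
        r -= V[:, :j] @ (V[:, :j].T @ r)       # reorthogonalize against all of V
        beta = np.linalg.norm(r)
        T[j, j - 1] = T[j - 1, j] = beta
        V[:, j] = r / beta
        w = A @ V[:, j]
        T[j, j] = V[:, j] @ w
        r = w - T[j, j] * V[:, j] - beta * V[:, j - 1]
    r -= V @ (V.T @ r)
    return V, T, r

rng = np.random.default_rng(1)
n, m = 100, 10
A = rng.standard_normal((n, n))
A = (A + A.T) / 2

V, T, r = lanczos(A, rng.standard_normal(n), m)
e_m = np.zeros(m); e_m[-1] = 1.0
resid = np.linalg.norm(A @ V - V @ T - np.outer(r, e_m))
print(resid)                                    # at roundoff level
```

Note that $T_m$ is symmetric tridiagonal by construction, and $r_m$ is orthogonal to the columns of $V_m$.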
is compressed to a factorization of length $k$ that retains
the eigeninformation of interest. This is accomplished by
applying $p = m - k$ implicitly shifted
QR steps. The first stage of
this shift process results in
\[
A V_m^{+} = V_m^{+} T_m^{+} + r_m e_m^{T} Q ,
\tag{4.21}
\]
where $V_m^{+} = V_m Q$, $T_m^{+} = Q^{T} T_m Q$, and
$Q = Q_1 Q_2 \cdots Q_p$. Each $Q_j$ is the orthogonal matrix
associated with the shift $\mu_j$ used during the shifted QR algorithm.
Because of the Hessenberg structure of
the matrices $Q_j$, it turns out that the first
$k-1$ entries of the vector $e_m^{T} Q$ are zero.
This implies that the leading $k$ columns in equation (4.21)
remain in a Lanczos relation. Equating the first $k$
columns on both sides of (4.21) provides an
updated $k$-step Lanczos factorization
\[
A V_k^{+} = V_k^{+} T_k^{+} + r_k^{+} e_k^{T} ,
\]
with an updated residual of the form
$r_k^{+} = V_m^{+} e_{k+1} \hat{\beta}_k + r_m \sigma$,
where $\hat{\beta}_k$ is the $(k+1,k)$ entry of $T_m^{+}$
and $\sigma$ is the $k$th entry of $e_m^{T} Q$.
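The contraction just described can be traced in a few lines of NumPy. The sketch below (which uses explicit rather than implicit shifted QR steps on $T_m$ for clarity, and whose names are ours) applies $p$ exact shifts, checks that the first $k-1$ entries of $e_m^{T}Q$ vanish, and verifies the truncated $k$-step factorization with the updated residual:

```python
import numpy as np

def lanczos(A, v, m):
    # m-step Lanczos factorization A V = V T + r e_m^T (full reorthogonalization)
    n = A.shape[0]
    V = np.zeros((n, m)); T = np.zeros((m, m))
    V[:, 0] = v / np.linalg.norm(v)
    w = A @ V[:, 0]; T[0, 0] = V[:, 0] @ w; r = w - T[0, 0] * V[:, 0]
    for j in range(1, m):
        r -= V[:, :j] @ (V[:, :j].T @ r)
        beta = np.linalg.norm(r)
        T[j, j - 1] = T[j - 1, j] = beta
        V[:, j] = r / beta
        w = A @ V[:, j]
        T[j, j] = V[:, j] @ w
        r = w - T[j, j] * V[:, j] - beta * V[:, j - 1]
    r -= V @ (V.T @ r)
    return V, T, r

rng = np.random.default_rng(2)
n, m, k = 200, 12, 4
p = m - k
A = rng.standard_normal((n, n)); A = (A + A.T) / 2
V, T, r = lanczos(A, rng.standard_normal(n), m)

# exact shifts: the p unwanted (smallest) Ritz values, aiming at the largest ones
shifts = np.linalg.eigvalsh(T)[:p]
Q = np.eye(m)
Tp = T.copy()
for mu in shifts:                       # p shifted QR steps on the tridiagonal T
    Qi, _ = np.linalg.qr(Tp - mu * np.eye(m))
    Tp = Qi.T @ Tp @ Qi
    Q = Q @ Qi

q = Q[-1, :]                            # e_m^T Q: first k-1 entries vanish
Vp = V @ Q
beta_hat = Tp[k, k - 1]                 # (k+1,k) entry of T_m^+
sigma = q[k - 1]                        # k-th entry of e_m^T Q
r_plus = Vp[:, k] * beta_hat + r * sigma  # updated residual r_k^+

e_k = np.zeros(k); e_k[-1] = 1.0
resid = np.linalg.norm(A @ Vp[:, :k] - Vp[:, :k] @ Tp[:k, :k] - np.outer(r_plus, e_k))
print(np.max(np.abs(q[:k - 1])), resid)   # both at roundoff level
```

Since each $Q_j$ is upper Hessenberg, the product $Q$ has lower bandwidth $p$, which is exactly why the last row of $Q$ has its first $k-1 = m-p-1$ entries equal to zero.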
A template for the implicitly restarted Lanczos method (IRLM) is given in Algorithm 4.7.
We will now describe some implementation details, referring to the respective phases in Algorithm 4.7.
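Since Algorithm 4.7 itself is not reproduced here, the following self-contained NumPy sketch of the overall IRLM iteration may help fix ideas: it repeatedly extends a $k$-step factorization to $m$ steps, applies $p$ exact shifts (the unwanted Ritz values), and contracts back to $k$ steps until the wanted Ritz pairs converge. All names, the test matrix, and the convergence test are ours; a production implementation such as ARPACK (exposed in SciPy as scipy.sparse.linalg.eigsh) handles deflation and stability issues that this sketch omits.

```python
import numpy as np

def extend(A, V, T, r, j, m):
    """Grow a j-step Lanczos factorization A V_j = V_j T_j + r e_j^T to m steps."""
    for i in range(j, m):
        beta = np.linalg.norm(r)
        V[:, i] = r / beta
        if i > 0:
            T[i, i - 1] = T[i - 1, i] = beta
        w = A @ V[:, i]
        T[i, i] = V[:, i] @ w
        r = w - V[:, :i + 1] @ (V[:, :i + 1].T @ w)  # full reorthogonalization
    return r

rng = np.random.default_rng(0)
n, k, m = 100, 4, 12
p = m - k

# diagonal test matrix: bulk spectrum in [0,1], four separated wanted eigenvalues
A = np.diag(np.concatenate([np.linspace(0.0, 1.0, n - 4), [1.5, 2.0, 2.5, 3.0]]))

V = np.zeros((n, m))
T = np.zeros((m, m))
r = rng.standard_normal(n)
j = 0
for cycle in range(100):
    r = extend(A, V, T, r, j, m)
    theta, W = np.linalg.eigh(T)
    # residual estimates ||r_m|| |e_m^T y| for the k wanted (largest) Ritz pairs
    bounds = np.linalg.norm(r) * np.abs(W[-1, -k:])
    if np.all(bounds < 1e-9 * np.max(np.abs(theta))):
        break
    # p shifted QR steps with the unwanted Ritz values as exact shifts
    Q = np.eye(m)
    for mu in theta[:p]:
        Qi, _ = np.linalg.qr(T - mu * np.eye(m))
        T = Qi.T @ T @ Qi
        Q = Q @ Qi
    # contract to a k-step factorization: r_k^+ = v_{k+1}^+ beta_hat + r sigma
    r = (V @ Q[:, k]) * T[k, k - 1] + r * Q[-1, k - 1]
    V[:, :k] = V @ Q[:, :k]
    Tk = np.zeros((m, m))
    Tk[:k, :k] = T[:k, :k]
    T = Tk
    j = k

print(cycle, np.sort(theta)[-k:])   # converged Ritz values ~ [1.5, 2.0, 2.5, 3.0]
```

Only $m$ basis vectors are ever stored, and each restart costs $O(m^3)$ work on the small tridiagonal matrix plus $O(nmk)$ work to update the basis.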