

Using the Decision Tree to Choose a Template

Eigenvalue problems and their associated numerical algorithms may be classified according to the following properties:

  1. Mathematical properties of the problem: Is it a Hermitian (real symmetric, self-adjoint) or a non-Hermitian eigenproblem? Is it a standard problem involving only one matrix or a generalized problem with two matrices? Based on these and other questions, Chapter 2 identifies six mathematical types of eigenvalue problems. This is the top of the decision tree. Each of Chapters 4 through 9 treats one of the six types.

  2. Desired spectral information: Do we need just the smallest eigenvalue, a few eigenvalues at either end of the spectrum, a subset of eigenvalues ``inside'' the spectrum, or most eigenvalues? Do we want associated eigenvectors, invariant subspaces, or other quantities? How much accuracy is desired?

  3. Available operations and their costs: Can we store the full matrix as an array and perform a similarity transformation on it? Can we solve a linear system with the matrix (or shifted matrix) by a direct factorization routine, or perhaps an iterative method? Or can we only multiply a vector by the matrix, and perhaps by its transpose? If several of these operations are possible, what are their relative costs?
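The first axis of this classification can be illustrated programmatically. The sketch below is ours, not the book's: a hypothetical helper, `classify_eigenproblem`, that tests numerically whether a standard or generalized problem is Hermitian, which is the first question asked at the top of the decision tree.

```python
import numpy as np

def classify_eigenproblem(A, B=None, tol=1e-12):
    """Classify along the first axis of the decision tree:
    Hermitian vs. non-Hermitian, standard vs. generalized.
    (Illustrative helper; the name and interface are ours.)"""
    kind = "generalized" if B is not None else "standard"
    hermitian = np.allclose(A, A.conj().T, atol=tol)
    if B is not None:
        # A generalized problem counts as Hermitian only if B is
        # Hermitian too (and, typically, positive definite).
        hermitian = hermitian and np.allclose(B, B.conj().T, atol=tol)
    return ("Hermitian " if hermitian else "non-Hermitian ") + kind

A = np.array([[2.0, 1.0], [1.0, 3.0]])    # real symmetric
C = np.array([[0.0, 1.0], [-1.0, 0.0]])   # skew-symmetric
print(classify_eigenproblem(A))           # Hermitian standard
print(classify_eigenproblem(C))           # non-Hermitian standard
```

In practice the user usually knows the mathematical type from the origin of the problem; a numerical test like this is only a sanity check, since rounding in the matrix entries can mask exact symmetry.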

Based on the desired spectral information and available operations and their costs, the first sections of Chapters 4 through 9 make recommendations on choosing one of the algorithms they present. These recommendations are summarized in tabular form when appropriate; see Tables 4.1, 5.1, and 7.1.

The state of the art is such that we do not have efficient algorithms for all eigenvalue problems that the user might pose. For example, suppose the user wants all the eigenvalues nearest the imaginary axis of an $n$-by-$n$ matrix $A$ with eigenvalues throughout the complex plane, but where the only available operation is to multiply $A$ by a vector. Then we do not have an algorithm that is simultaneously efficient (i.e., performs many fewer than $n$ matrix-vector multiplications) and reliable (i.e., guaranteed or very likely to find all eigenvalues within a user-specified distance of the imaginary axis). These limitations are discussed in the text.
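A small sketch (ours, with a made-up diagonal test matrix) makes the limitation concrete: when the only available operation is multiplying $A$ by a vector, power iteration cheaply reveals the eigenvalue of largest magnitude, but it says nothing about eigenvalues elsewhere in the spectrum, such as those nearest the imaginary axis.

```python
import numpy as np

n = 100
A = np.diag(np.linspace(0.5, 10.0, n))   # spectrum in [0.5, 10]

def matvec(v):
    return A @ v                         # the only operation we assume

# Power iteration: a few hundred matvecs approximate the dominant
# eigenvalue (here 10) ...
v = np.ones(n)
for _ in range(300):
    w = matvec(v)
    v = w / np.linalg.norm(w)
lam = v @ matvec(v)                      # Rayleigh quotient
print(lam)

# ... but the eigenvalue nearest the imaginary axis (here 0.5) is
# invisible to this iteration; reaching it reliably would require
# solving linear systems with a shifted matrix (shift-and-invert),
# an operation we assumed was unavailable.
```

This is exactly the gap described above: cheap matrix-vector products favor the extremes of the spectrum, while interior or axis-adjacent eigenvalues generally require factorizations or solves that may be too expensive or impossible for the problem at hand.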


Susan Blackford 2000-11-20