

Related Singular Value Problems

  1. If $B = A^*$, and the SVD of $A = U \Sigma V^*$, then the SVD of $B = V \Sigma^* U^*$.
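     As an illustrative numerical check (a NumPy sketch, not part of the original text), the relation between the SVDs of $A$ and $A^*$ can be verified directly:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3)) + 1j * rng.standard_normal((5, 3))

# Compact SVD of A = U Sigma V^*; here Sigma is the 3x3 real diagonal,
# so Sigma^* = Sigma.
U, s, Vh = np.linalg.svd(A, full_matrices=False)

# The SVD of B = A^* is V Sigma^* U^*: same singular values, with the
# roles of left and right singular vectors exchanged.
B = A.conj().T
B_rebuilt = Vh.conj().T @ np.diag(s) @ U.conj().T
err = np.max(np.abs(B - B_rebuilt))
```

Here `err` is on the order of machine roundoff, confirming $B = V \Sigma^* U^*$.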

  2. Suppose $B = A^*A$, where $A$ is $m$ by $n$ with $m \geq n$. Then $B$ is an $n$ by $n$ Hermitian matrix. Let $A = U \Sigma V^*$ be the SVD of $A$. Then the eigendecomposition of $B = V (\Sigma^*\Sigma) V^*$. Note that $\Sigma^*\Sigma = {\rm diag}(\sigma_1^2 , \ldots , \sigma_n^2)$. In other words, the eigenvectors of $B$ are the right singular vectors of $A$, and the eigenvalues of $B$ are the squares of the singular values of $A$. If $B = AA^*$, where $A$ is $m$ by $n$ with $m \geq n$, then $B$ is $m$ by $m$ with eigendecomposition $B = U (\Sigma \Sigma^*) U^*$. In other words, the eigenvectors of $B$ are the left singular vectors of $A$, and the eigenvalues of $B$ are the squares of the singular values of $A$, together with the eigenvalue 0 with multiplicity $m-n$.
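     This correspondence between eigenvalues of $A^*A$ and squared singular values of $A$ can be checked numerically (an illustrative NumPy sketch, not from the original text):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 4))

# Singular values of A, in descending order.
s = np.linalg.svd(A, compute_uv=False)

# Eigenvalues of B = A^*A are the squares of the singular values of A
# (eigvalsh returns them in ascending order).
evals = np.linalg.eigvalsh(A.T @ A)
err = np.max(np.abs(np.sort(s**2) - evals))
```

After sorting, the two spectra agree to roundoff. Note that forming $A^*A$ explicitly squares the condition number, which is why SVD-based algorithms avoid it in practice.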

  3. Suppose $B = [{0^{m \times m} \atop A^*} \; {A \atop 0^{n \times n}}]$, where $A$ is $m$ by $n$ with $m \geq n$. Then $B$ is an $(m+n)$ by $(m+n)$ Hermitian matrix. Let $A = U \Sigma V^*$ be the SVD of $A$. Write $U = [U_1^{m \times n}, U_2^{m \times (m-n)}]$ and $\Sigma = [{\hat{\Sigma}^{n \times n} \atop 0^{(m-n) \times n}}]$. Then the eigendecomposition of $B = Q \Lambda Q^*$, where $\Lambda = {\rm diag}( \hat{\Sigma} , -\hat{\Sigma} , O^{(m-n) \times (m-n)})$ and

    \begin{displaymath}
Q = \bmat{ccc}
\frac{1}{\sqrt{2}} U_1 & - \frac{1}{\sqrt{2}} U_1 & U_2 \\
\frac{1}{\sqrt{2}} V & + \frac{1}{\sqrt{2}} V & O^{n \times (m-n)}
\emat.
\end{displaymath}

    In other words, $\pm \sigma_i$ is an eigenvalue with unit eigenvector $\frac{1}{\sqrt{2}} [{\pm u_i \atop v_i}]$ for $i=1$ to $n$, and 0 is an eigenvalue with eigenvector $[{u_i \atop 0^{n \times 1}}]$ for $i>n$.
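     The spectrum of this block matrix (sometimes called the Jordan-Wielandt matrix) can be confirmed numerically; the following NumPy sketch is an illustrative addition, not part of the original text:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 5, 3
A = rng.standard_normal((m, n))
s = np.linalg.svd(A, compute_uv=False)

# B = [[0, A], [A^*, 0]] is Hermitian with eigenvalues
# +sigma_i, -sigma_i (i = 1..n), and 0 with multiplicity m - n.
B = np.block([[np.zeros((m, m)), A],
              [A.T, np.zeros((n, n))]])
evals = np.sort(np.linalg.eigvalsh(B))
expected = np.sort(np.concatenate([s, -s, np.zeros(m - n)]))
err = np.max(np.abs(evals - expected))
```

This reduction is how some SVD algorithms obtain singular values via a Hermitian eigensolver without squaring the condition number.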

  4. The QSVD or generalized SVD (GSVD) of $A$ and $B$ is defined as follows. Suppose $A$ is $m$ by $n$ and $B$ is $n$ by $n$ and nonsingular. Let $r = \min(m,n)$. Let $AB^{-1} = U \Sigma V^*$ be the SVD of $AB^{-1}$. We may also write two equivalent decompositions of $A$ and $B$ as $A = U \Sigma_A X$ and $B = V \Sigma_B X$, where $X$ is $n$ by $n$ and nonsingular, $\Sigma_A$ is $m$ by $n$ and contains ${\rm diag}( \sigma_{A,1},\ldots,\sigma_{A,r} )$ in its leading $r$ rows and columns and zeros elsewhere, and $\Sigma_B$ is $n$ by $n$ and contains ${\rm diag}( \sigma_{B,1},\ldots,\sigma_{B,n} )$.
    $\Sigma_A$ and $\Sigma_B$ can be chosen so that $\Sigma_A^T \Sigma_A + \Sigma_B ^T \Sigma_B = I^{n \times n}$. Thus $\Sigma = \Sigma_A \Sigma_B^{-1}$. The diagonal entries $\sigma_i = \sigma_{A,i} / \sigma_{B,i}$ of $\Sigma$ are called the generalized singular values of $A$ and $B$.

    This decomposition generalizes to the cases where $B$ is singular or $p$ by $n$, to a decomposition equivalent to the SVD of $AB$ (the product SVD), and indeed to decompositions of arbitrary products of the form $A_1^{\pm 1} A_2^{\pm 1} \cdots A_k^{\pm 1}$ [108].
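     For the square nonsingular case, one concrete construction satisfying the normalization above is $\sigma_{A,i} = \sigma_i/\sqrt{1+\sigma_i^2}$, $\sigma_{B,i} = 1/\sqrt{1+\sigma_i^2}$, and $X = \Sigma_B^{-1} V^* B$. The following NumPy sketch of this construction is an illustrative addition (not the LAPACK GSVD algorithm, which avoids forming $B^{-1}$):

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 5, 3
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, n)) + 3.0 * np.eye(n)   # safely nonsingular

# SVD of A B^{-1} gives the generalized singular values sigma_i.
U, sig, Vh = np.linalg.svd(A @ np.linalg.inv(B))    # U is m x m, Vh is n x n

# Choose sigma_{A,i} = sigma_i / sqrt(1 + sigma_i^2),
#        sigma_{B,i} = 1 / sqrt(1 + sigma_i^2),  X = Sigma_B^{-1} V^* B.
Sigma_A = np.zeros((m, n))
np.fill_diagonal(Sigma_A, sig / np.sqrt(1.0 + sig**2))
Sigma_B = np.diag(1.0 / np.sqrt(1.0 + sig**2))
X = np.linalg.inv(Sigma_B) @ Vh @ B

errA = np.max(np.abs(A - U @ Sigma_A @ X))          # A = U Sigma_A X
errB = np.max(np.abs(B - Vh.T @ Sigma_B @ X))       # B = V Sigma_B X
errI = np.max(np.abs(Sigma_A.T @ Sigma_A + Sigma_B.T @ Sigma_B - np.eye(n)))
```

Both factorizations hold to roundoff, and `errI` confirms the normalization $\Sigma_A^T \Sigma_A + \Sigma_B^T \Sigma_B = I$.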

  5. The CS (cosine/sine) decomposition of $A$ and $B$ is defined as follows. Suppose the columns of $Z = [{A^{m \times n} \atop B^{p \times n}}]$ are orthonormal. The QSVD of $A$ and $B$ is then equivalent to the decompositions $A = U \Sigma_A X$ and $B = V \Sigma_B X$, where $X$ is unitary. The diagonal entries of $\Sigma_A$ ($\Sigma_B$) are the cosines (sines) of the principal angles between the subspace spanned by the columns of $Z$ and the subspace spanned by the first $m$ columns of $I^{(m+p) \times (m+p)}$ [424,25]. This is used to compute the ``distance'' (or angles) between subspaces.
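     When the columns of $Z$ are orthonormal, the singular values of the blocks $A$ and $B$ are the cosines and sines of the same set of principal angles, so they satisfy $\cos^2\theta_i + \sin^2\theta_i = 1$. An illustrative NumPy check (not part of the original text):

```python
import numpy as np

rng = np.random.default_rng(4)
m, p, n = 4, 3, 2

# Z with orthonormal columns, split into A (top m rows) and B (bottom p rows).
Z, _ = np.linalg.qr(rng.standard_normal((m + p, n)))
A, B = Z[:m], Z[m:]

# svd returns values in descending order; the largest cosine pairs with the
# smallest sine (both correspond to the smallest principal angle).
cos_th = np.linalg.svd(A, compute_uv=False)            # descending
sin_th = np.sort(np.linalg.svd(B, compute_uv=False))   # ascending
err = np.max(np.abs(cos_th**2 + sin_th**2 - 1.0))
```

Working with both sines and cosines is what makes the CS decomposition numerically reliable for angles near 0 or $\pi/2$.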

  6. Suppose $A$ is $m$ by $n$, and $B$ is $p$ by $n$ with full column rank; then the generalized Hermitian eigenproblem (GHEP) $A^*A - \lambda B^*B$ can be solved using the QSVD as follows. Write the QSVD as $A = U \Sigma_A X$ and $B = V \Sigma_B X$. The eigendecomposition of $A^*A - \lambda B^*B$ may be written $X^{-*}A^*AX^{-1} = \Sigma_A^* \Sigma_A$ and $X^{-*}B^*BX^{-1} = \Sigma_B^* \Sigma_B$, so the eigenvalues are $\lambda_i = \sigma_i^2$, the squares of the generalized singular values. This generalizes to the case of $B$ without full column rank.
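     For square nonsingular $B$, the claim that the pencil's eigenvalues are the squared generalized singular values can be verified numerically; this NumPy sketch is an illustrative addition, not a recommended solution method (it forms $A^*A$ and $B^{-1}$ explicitly):

```python
import numpy as np

rng = np.random.default_rng(5)
m, n = 6, 3
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, n)) + 3.0 * np.eye(n)   # full column rank

# Generalized singular values of (A, B): singular values of A B^{-1}.
sigma = np.linalg.svd(A @ np.linalg.inv(B), compute_uv=False)

# Eigenvalues of the pencil A^*A - lambda B^*B, i.e. of (B^*B)^{-1} (A^*A).
lam = np.linalg.eigvals(np.linalg.solve(B.T @ B, A.T @ A))
err = np.max(np.abs(np.sort(lam.real) - np.sort(sigma**2)))
```

After sorting, the eigenvalues agree with $\sigma_i^2$ to roundoff.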

  7. Finding $x$ that minimizes $\Vert Ax - b \Vert _2$ is the linear least squares problem. Suppose $A$ is $m$ by $n$ with $m>n$; then we say that the least squares problem is overdetermined. Let $A = \tilde{U} \tilde{\Sigma} V^*$ be the compact SVD of $A$. If $A$ has full rank, the unique solution of the least squares problem is $x= V {\tilde{\Sigma}}^{-1} {\tilde{U}}^* b$, although solutions based on the QR decomposition or the normal equations are cheaper and more commonly used [50,12]. If the rank of $A$ is less than $n$, then the SVD is often used to solve the least squares problem. In this case the solution is not unique, but the unique minimum norm solution is

    \begin{displaymath}
x= V {\tilde{\Sigma}}^{\dag } {\tilde{U}}^* b, \quad
\mbox{where } {\tilde{\Sigma}}^{\dag } = {\rm diag}( \sigma_1^{\dag}, \ldots, \sigma_n^{\dag} ), \quad
\sigma_i^{\dag} = \left\{ \begin{array}{ll}
\sigma_i^{-1} & {\rm if\ } \sigma_i > 0, \\
0 & {\rm if\ } \sigma_i = 0. \end{array} \right.
\end{displaymath}

    If $m<n$, we say that the least squares problem is underdetermined. In this case the solution is not unique, but the unique minimum norm solution is $x= \hat{V} {\hat{\Sigma}}^{\dag } {U}^* b$, where $A = U \hat{\Sigma} \hat{V}^*$ is the compact SVD of $A$.
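     For the full-rank overdetermined case, the SVD-based formula above can be checked against a standard least squares solver; the NumPy sketch below is an illustrative addition, not from the original text:

```python
import numpy as np

rng = np.random.default_rng(6)
m, n = 7, 4
A = rng.standard_normal((m, n))     # full rank with probability 1
b = rng.standard_normal(m)

# Compact SVD A = U Sigma V^*, then x = V Sigma^{-1} U^* b
# solves the overdetermined problem min ||Ax - b||_2.
U, s, Vh = np.linalg.svd(A, full_matrices=False)
x = Vh.T @ ((U.T @ b) / s)

# Compare against a reference least squares solver.
x_ref = np.linalg.lstsq(A, b, rcond=None)[0]
err = np.max(np.abs(x - x_ref))
```

Dividing by $\sigma_i$ only when $\sigma_i > 0$ (or above a roundoff-level threshold) is exactly the $\sigma_i^{\dag}$ rule in the rank-deficient formula above.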


Susan Blackford 2000-11-20