Chapter 12: Problem 14
Show how to compute the eigenvalues of an orthogonal matrix \(A \in \mathbf{R}^{n \times n}\) by computing the Schur decompositions of \(\left(A+A^{T}\right) / 2\) and \(\left(A-A^{T}\right) / 2\).
Short Answer
The eigenvalues of \( A \) have the form \( \lambda_k = \cos\theta_k + i\sin\theta_k \): the real parts come from the symmetric part \( (A+A^T)/2 \) and the imaginary parts from the skew-symmetric part \( (A-A^T)/2 \). All eigenvalues of the orthogonal matrix \( A \) have unit modulus.
Step by step solution
01
Understand Orthogonal Matrix Properties
An orthogonal matrix \( A \) is a real square matrix satisfying \( A^T A = A A^T = I \), with \( I \) the identity matrix. Its eigenvalues have modulus 1.
02
Form the Symmetric Matrix
Compute \( B = \frac{A + A^T}{2} \). This matrix is symmetric because \( B^T = \frac{A^T + A}{2} = B \), and symmetric matrices have real eigenvalues.
03
Find the Skew-Symmetric Matrix
Compute \( C = \frac{A - A^T}{2} \). This matrix is skew-symmetric because \( C^T = \frac{A^T - A}{2} = -C \), and the eigenvalues of a skew-symmetric matrix are purely imaginary or zero.
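These two steps can be sketched in NumPy; the rotation matrix used as \( A \) below is an illustrative choice, not part of the exercise:

```python
import numpy as np

# Hypothetical example: a 3x3 rotation about the z-axis (orthogonal)
t = 0.7
A = np.array([[np.cos(t), -np.sin(t), 0.0],
              [np.sin(t),  np.cos(t), 0.0],
              [0.0,        0.0,       1.0]])

B = (A + A.T) / 2  # symmetric part
C = (A - A.T) / 2  # skew-symmetric part

assert np.allclose(B, B.T)    # B is symmetric
assert np.allclose(C, -C.T)   # C is skew-symmetric
assert np.allclose(A, B + C)  # A splits exactly as B + C
```

Note that the split \( A = B + C \) is exact for any square matrix; orthogonality of \( A \) is what later constrains the eigenvalues of the two parts.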
04
Use Schur Decomposition on Symmetric Matrix
Decompose the symmetric matrix \( B \) using the Schur decomposition: \( B = Q_T R_T Q_T^T \), where \( Q_T \) is orthogonal and \( R_T \) is upper triangular. Since \( B \) is symmetric, \( R_T \) will be diagonal with real entries (the eigenvalues of \( B \)).
05
Use Schur Decomposition on Skew-Symmetric Matrix
Decompose the skew-symmetric matrix \( C \) using the real Schur decomposition: \( C = Q_S R_S Q_S^T \) with \( Q_S \) orthogonal. Since \( C \) is skew-symmetric, \( R_S \) is block diagonal with \( 2 \times 2 \) blocks of the form \( \begin{pmatrix} 0 & s_k \\ -s_k & 0 \end{pmatrix} \) (and zero \( 1 \times 1 \) blocks), so the eigenvalues of \( C \) are the purely imaginary pairs \( \pm i s_k \) or zero.
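Both decompositions can be computed with `scipy.linalg.schur`; the orthogonal matrix below (a composition of two elementary rotations) is an illustrative assumption:

```python
import numpy as np
from scipy.linalg import schur

# Illustrative orthogonal A: a composition of two elementary rotations
t1, t2 = 0.7, 0.4
Rz = np.array([[np.cos(t1), -np.sin(t1), 0.0],
               [np.sin(t1),  np.cos(t1), 0.0],
               [0.0,         0.0,        1.0]])
Rx = np.array([[1.0, 0.0,         0.0],
               [0.0, np.cos(t2), -np.sin(t2)],
               [0.0, np.sin(t2),  np.cos(t2)]])
A = Rz @ Rx
B = (A + A.T) / 2
C = (A - A.T) / 2

# Real Schur form of the symmetric part: diagonal with real entries
TB, QB = schur(B, output='real')
assert np.allclose(TB, np.diag(np.diag(TB)))

# Real Schur form of the skew part: block diagonal with 2x2 blocks
# [[0, s], [-s, 0]], so the eigenvalues of C are +/- i*s or zero
TC, QC = schur(C, output='real')
assert np.allclose(TC, -TC.T)
assert np.allclose(np.linalg.eigvals(C).real, 0.0)
```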
06
Combine the Results to Find Eigenvalues of A
Since \( A = B + C \) and \( A^T A = A A^T = I \), the matrices \( B \) and \( C \) commute, so they can be block-diagonalized by a common orthogonal basis. On each shared invariant subspace, a real eigenvalue \( \mu \) of \( B \) pairs with an eigenvalue \( \pm i\nu \) of \( C \), giving an eigenvalue \( \lambda = \mu \pm i\nu \) of \( A \). Orthogonality of \( A \) forces \( \mu^2 + \nu^2 = 1 \), so \( \lambda = \cos\theta \pm i\sin\theta \) and every eigenvalue of \( A \) has modulus 1.
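Putting the steps together, one possible NumPy/SciPy sketch is below. The random orthogonal test matrix and the conjugate-pair comparison are illustrative choices; the skew part fixes the signs of the imaginary parts, but since the eigenvalues of a real matrix come in conjugate pairs, the magnitudes recovered from orthogonality already determine the spectrum:

```python
import numpy as np
from scipy.linalg import schur

# Hypothetical test input: a random orthogonal matrix from a QR factorization
rng = np.random.default_rng(0)
A, _ = np.linalg.qr(rng.standard_normal((5, 5)))

B = (A + A.T) / 2  # symmetric part: eigenvalues cos(theta_k)
C = (A - A.T) / 2  # skew part: eigenvalues +/- i*sin(theta_k)
assert np.allclose(B + C, A)

# Real Schur form of the symmetric part is diagonal with real entries
TB, QB = schur(B, output='real')
mu = np.diag(TB)                      # cos(theta_k)

# Orthogonality forces mu^2 + nu^2 = 1 on each invariant subspace,
# so |sin(theta_k)| follows; conjugate pairing supplies both signs.
nu = np.sqrt(np.clip(1.0 - mu**2, 0.0, None))
lam = mu + 1j * nu                    # eigenvalues of A, up to conjugation

# Compare with a direct computation, identifying conjugate pairs
canon = lambda z: np.sort_complex(z.real + 1j * np.abs(z.imag))
assert np.allclose(canon(lam), canon(np.linalg.eigvals(A)))
assert np.allclose(np.abs(lam), 1.0)  # unit modulus, as expected
```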
Key Concepts
These are the key concepts you need to understand to accurately answer the question.
Schur Decomposition
The Schur decomposition is a fundamental tool in numerical linear algebra: it transforms any square matrix, via an orthogonal (or unitary) change of basis, into a triangular form that exposes the eigenvalues and can be computed in a numerically stable way.
For any square matrix, the Schur decomposition can be represented as:
- Matrix \( A \) is decomposed into \( A = Q T Q^{*} \),
- where \( Q \) is unitary (orthogonal in the real case); its columns are Schur vectors, which coincide with eigenvectors only in special cases such as normal matrices, and
- \( T \) is an upper triangular matrix with the eigenvalues of \( A \) on its diagonal. The real Schur form keeps \( Q \) and \( T \) real by allowing \( 2 \times 2 \) diagonal blocks in \( T \) for complex conjugate eigenvalue pairs.
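A small demonstration with `scipy.linalg.schur` in complex mode (the matrix itself is an arbitrary illustrative choice):

```python
import numpy as np
from scipy.linalg import schur

# Arbitrary illustrative matrix, not from the exercise
A = np.array([[4.0, 1.0, 0.0],
              [2.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# Complex Schur form: A = Q T Q^*, with Q unitary and T upper triangular
T, Q = schur(A, output='complex')
assert np.allclose(Q @ T @ Q.conj().T, A)
assert np.allclose(T, np.triu(T))  # eigenvalues sit on the diagonal of T
assert np.allclose(np.sort_complex(np.diag(T)),
                   np.sort_complex(np.linalg.eigvals(A)))
```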
Symmetric Matrices
Symmetric matrices are a special category of matrices where the matrix is equal to its own transpose. In mathematical terms, a square matrix \( B \) is symmetric if \( B = B^T \).
Some key characteristics of symmetric matrices include:
- Real eigenvalues: The eigenvalues of a symmetric matrix are always real numbers.
- Diagonalizability: Symmetric matrices are always diagonalizable. This means they can be represented in the form \( B = Q \, \Lambda \, Q^T \), where \( Q \) is orthogonal and \( \Lambda \) is a diagonal matrix of eigenvalues.
- Definiteness: The signs of the eigenvalues classify a symmetric matrix as positive (semi-)definite, negative (semi-)definite, or indefinite; in optimization this is how the Hessian determines the type of a critical point.
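The first two properties can be checked directly with `numpy.linalg.eigh`, which exploits symmetry; the matrix below is an illustrative assumption:

```python
import numpy as np

# Illustrative symmetric matrix
B = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
assert np.allclose(B, B.T)

# eigh is designed for symmetric input: real eigenvalues, orthogonal Q
w, Q = np.linalg.eigh(B)
assert np.isrealobj(w)                       # real eigenvalues
assert np.allclose(Q @ np.diag(w) @ Q.T, B)  # B = Q Lambda Q^T
assert np.allclose(Q.T @ Q, np.eye(3))       # Q is orthogonal
```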
Skew-Symmetric Matrices
Skew-symmetric matrices, in contrast to symmetric matrices, satisfy the property \( C = -C^T \). This means that each element is the negative of its transpose counterpart. Some distinctive traits of skew-symmetric matrices include:
- Purely Imaginary Eigenvalues: Skew-symmetric matrices have eigenvalues that are purely imaginary, except for the possibility of zero.
- Odd Dimension Forces Singularity: A skew-symmetric matrix of odd order has determinant zero (since \( \det(C) = \det(-C^T) = (-1)^n \det(C) \)), so it must have at least one zero eigenvalue.
- Characteristic in Rotations: In mechanics and other applied fields, skew-symmetric matrices often define rotational transformations.
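The first two traits can be verified numerically on a small example (the entries of \( C \) below are an arbitrary illustrative choice):

```python
import numpy as np

# Illustrative 3x3 (odd-dimensional) skew-symmetric matrix
C = np.array([[0.0,  0.6, -0.2],
              [-0.6, 0.0,  0.5],
              [0.2, -0.5,  0.0]])
assert np.allclose(C, -C.T)

ev = np.linalg.eigvals(C)
assert np.allclose(ev.real, 0.0)            # purely imaginary or zero
assert np.any(np.isclose(np.abs(ev), 0.0))  # odd order => a zero eigenvalue
```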
Orthogonal Matrices
Orthogonal matrices are special matrices where the rows and columns are orthonormal vectors. An orthogonal matrix \( A \) fulfills the condition \( A^T A = I \), where \( I \) is the identity matrix. This property ensures several interesting features:
- Preservation of Vector Norms: Multiplying a vector by an orthogonal matrix does not change the vector's length.
- Eigenvalues of Unit Magnitude: The eigenvalues of an orthogonal matrix have a modulus of one, i.e. they are of the form \( e^{i\theta} \).
- Stability in Computations: In numerical algorithms, orthogonal matrices help maintain stability and prevent errors from growing.
- Applications in 3D Rotations: Commonly used to represent rotations in three-dimensional space.
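The norm-preservation and unit-modulus properties are easy to confirm on a plane rotation (the angle and test vector are illustrative):

```python
import numpy as np

# Illustrative orthogonal matrix: rotation by t in the plane
t = 1.2
A = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])
assert np.allclose(A.T @ A, np.eye(2))  # orthogonality: A^T A = I

x = np.array([3.0, -4.0])
# Multiplying by A preserves the vector's length
assert np.isclose(np.linalg.norm(A @ x), np.linalg.norm(x))

# The eigenvalues e^{+/- it} lie on the unit circle
ev = np.linalg.eigvals(A)
assert np.allclose(np.abs(ev), 1.0)
```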