Chapter 5: Problem 36
a. Prove that the eigenvalues of an \(n \times n\) real matrix \(A\) are the same as the eigenvalues of \(A^{T}\). b. With reference to part (a), show by a counterexample that an eigenvector of \(A\) need not be an eigenvector of \(A^{T}\).
Short Answer
Eigenvalues of \(A\) and \(A^T\) are the same, but eigenvectors can differ; \(A = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}\) is a counterexample.
Step by step solution
01
Define Eigenvalues
An eigenvalue of a matrix \(A\) is a scalar \(\lambda\) such that there exists a non-zero vector \(\mathbf{x}\) satisfying the equation \(A\mathbf{x} = \lambda \mathbf{x}\).
02
Define Characteristic Polynomial
The eigenvalues of a matrix \(A\) are the roots of its characteristic polynomial, which is given by \(\det(A - \lambda I) = 0\), where \(I\) is the identity matrix.
03
Relation with Transpose
To compare the eigenvalues, recall that a matrix and its transpose have the same determinant, \(\det(M) = \det(M^T)\), and that \((A - \lambda I)^T = A^T - \lambda I\). Therefore \(\det(A - \lambda I) = \det\big((A - \lambda I)^T\big) = \det(A^T - \lambda I)\), so \(A\) and \(A^T\) have the same characteristic polynomial and hence the same eigenvalues.
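This identity is easy to check numerically. The sketch below uses an illustrative \(2 \times 2\) matrix of my own choosing (not one from the exercise) and compares the sorted spectra of \(A\) and \(A^T\):

```python
import numpy as np

# Illustrative real matrix (my own choice, not from the exercise).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Eigenvalues of A and of its transpose, sorted for comparison.
eig_A = np.sort_complex(np.linalg.eigvals(A))
eig_AT = np.sort_complex(np.linalg.eigvals(A.T))

# The two spectra agree (up to floating-point error).
print(np.allclose(eig_A, eig_AT))  # True
```

For this matrix the characteristic polynomial is \(\lambda^2 - 7\lambda + 10\), so both spectra are \(\{2, 5\}\).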
04
Counterexample Setup
To show that an eigenvector of \(A\) need not be an eigenvector of \(A^T\), consider the matrix \(A = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}\).
05
Eigenvector for A
For \(A = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}\), the characteristic polynomial is \(\lambda^2 = 0\), so the only eigenvalue is \(\lambda = 0\). Solving \(A\mathbf{x} = \mathbf{0}\) gives the eigenvector \(\mathbf{x} = \begin{bmatrix} 1 \\ 0 \end{bmatrix}\) (up to scalar multiples).
06
Eigenvector for A^T
Now consider \(A^T = \begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix}\). Its only eigenvalue is again \(\lambda = 0\), and solving \(A^T\mathbf{x} = \mathbf{0}\) forces the first component of \(\mathbf{x}\) to be zero, so the eigenvectors are the nonzero multiples of \(\begin{bmatrix} 0 \\ 1 \end{bmatrix}\). In particular, \(\begin{bmatrix} 1 \\ 0 \end{bmatrix}\) is not an eigenvector of \(A^T\).
07
Conclusion of Counterexample
This counterexample confirms that although \(A\) and \(A^T\) have the same eigenvalues, \(\begin{bmatrix} 1 \\ 0 \end{bmatrix}\) is an eigenvector of \(A\) for \(\lambda = 0\) but not of \(A^T\). Similarly, \(\begin{bmatrix} 0 \\ 1 \end{bmatrix}\) is an eigenvector of \(A^T\) for \(\lambda = 0\) but not of \(A\).
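The counterexample can be verified directly, using the matrix from the solution:

```python
import numpy as np

# The counterexample matrix from the solution.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
x = np.array([1.0, 0.0])

# x is an eigenvector of A for lambda = 0: A x = 0 * x.
print(np.allclose(A @ x, 0 * x))    # True

# But A^T sends x to [0, 1], which is not a scalar multiple of x,
# so x is not an eigenvector of A^T.
print(A.T @ x)                      # [0. 1.]
print(np.allclose(A.T @ x, 0 * x))  # False
```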
Key Concepts
These are the key concepts you need to understand to accurately answer the question.
Eigenvectors
An eigenvector is a special vector associated with a matrix. It is non-zero and when the matrix acts on it, the vector is simply scaled by a factor called the eigenvalue. This is mathematically represented as:
- For a matrix \( A \) and scalar \( \lambda \), a vector \( \mathbf{x} \) is an eigenvector if \( A\mathbf{x} = \lambda \mathbf{x} \).
- The eigenvector does not change direction under the transformation described by the matrix.
- Eigenvectors are only defined up to a scalar multiple; multiplying an eigenvector by any non-zero scalar yields another eigenvector.
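These defining properties can be checked numerically. The sketch below uses an illustrative symmetric matrix of my own choosing and NumPy's `np.linalg.eig`:

```python
import numpy as np

# Illustrative symmetric matrix (my own choice, not from the exercise).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

vals, vecs = np.linalg.eig(A)
lam, x = vals[0], vecs[:, 0]

# Defining equation: A x = lambda x.
assert np.allclose(A @ x, lam * x)

# Any nonzero scalar multiple of x is again an eigenvector
# for the same eigenvalue.
assert np.allclose(A @ (5 * x), lam * (5 * x))
```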
Real Matrix
A real matrix is a matrix whose entries are all real numbers. It plays an important role in various fields such as physics, engineering, and statistics.
- A matrix is defined by its dimensions, indicated as \( m \times n \), where \( m \) and \( n \) are the number of rows and columns, respectively.
- In the context of eigenvalues and eigenvectors, a real matrix has a characteristic polynomial with real coefficients, so any non-real eigenvalues occur in complex-conjugate pairs.
- The behavior of real matrices helps predict dynamic systems, solve differential equations, and transform geometric spaces, making them fundamental in both theoretical and applied mathematics.
Transpose of a Matrix
The transpose of a matrix \( A \), denoted as \( A^T \), is formed by flipping \( A \) over its diagonal. This means that the element at the \(i, j\)-th position in \( A \) becomes the element at the \(j, i\)-th position in \( A^T \).
- This operation is useful in various contexts, particularly in solving systems of linear equations, as it often helps in simplifying the equations involved.
- An important property of the transpose is that for any two matrices \( A \) and \( B \), the transpose of the product \( (AB)^T \) is equal to \( B^T A^T \).
- When considering eigenvalues, the transpose of a real square matrix has the same eigenvalues as the original matrix, due to having the same characteristic polynomial.
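The reversed-order product rule \((AB)^T = B^T A^T\) from the list above is easy to confirm; the sketch below uses randomly generated matrices of my own choosing:

```python
import numpy as np

# Random real matrices of compatible shapes (illustrative sizes).
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 4))

# (AB)^T equals B^T A^T; note the reversed order of the factors.
assert np.allclose((A @ B).T, B.T @ A.T)
```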
Characteristic Polynomial
The characteristic polynomial of a matrix \( A \) is an essential tool for finding its eigenvalues. It is a polynomial derived from the matrix by subtracting \( \lambda \) times the identity matrix from \( A \), and then taking the determinant.
- Expressed as \( \det(A - \lambda I) = 0 \), where \( I \) is the identity matrix, this polynomial's roots are the eigenvalues of the matrix.
- For an \( n \times n \) matrix, the characteristic polynomial has degree \( n \), so there are exactly \( n \) eigenvalues when counted with multiplicity over the complex numbers.
- The characteristic polynomial is pivotal in proving the relations between eigenvalues of a matrix and its transpose.
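The characteristic polynomial can be computed numerically with NumPy's `np.poly`, which (for a square matrix argument) returns the coefficients of the monic polynomial \(\det(\lambda I - A)\); for even \(n\) this agrees with \(\det(A - \lambda I)\). The matrix below is an illustrative choice of my own:

```python
import numpy as np

# Illustrative matrix (my own choice, not from the exercise).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Coefficients of det(lambda I - A):
# here lambda^2 - 4*lambda + 3, i.e. [1, -4, 3].
coeffs = np.poly(A)
print(coeffs)

# The roots of the characteristic polynomial are the eigenvalues.
print(np.sort(np.roots(coeffs)))      # [1. 3.]
print(np.sort(np.linalg.eigvals(A)))  # [1. 3.]
```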