In this problem, we establish some facts about eigenvalues and eigenvectors of
square matrices. (For a more general treatment, see, for example, Marshall and
Olkin 1979, Chapter 20.)
We use the fact that a scalar \(\lambda\) is an eigenvalue of the \(n \times
n\) symmetric matrix \(A\) if there exists a nonzero \(n \times 1\) vector \(p\), the
corresponding eigenvector, satisfying \(A p=\lambda p\). A symmetric \(A\) has
\(n\) real eigenvalues (counted with multiplicity) with corresponding linearly
independent eigenvectors, which may be taken to be orthonormal.
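As a numerical illustration of the defining relation \(Ap = \lambda p\), the following sketch (using NumPy, with an arbitrary symmetric test matrix built as \(M + M'\)) checks that each eigenpair returned by `np.linalg.eigh` satisfies it:

```python
import numpy as np

rng = np.random.default_rng(0)
# An arbitrary symmetric test matrix: M + M' is symmetric for any square M.
M = rng.standard_normal((4, 4))
A = M + M.T

# np.linalg.eigh returns eigenvalues w (ascending) and eigenvectors
# as the columns of V.
w, V = np.linalg.eigh(A)

# Each column p_i is an eigenvector: A p_i = lambda_i p_i.
for i in range(4):
    assert np.allclose(A @ V[:, i], w[i] * V[:, i])
```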
(a) Show that \(A=P^{\prime} D_{\lambda} P\), where \(D_{\lambda}\) is a diagonal
matrix of the eigenvalues of \(A\) and \(P\) is an \(n \times n\) matrix whose rows are
the corresponding eigenvectors and which satisfies \(P^{\prime} P=P P^{\prime}=I\),
the identity matrix.
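The decomposition in (a) can be checked numerically. A sketch (assuming an arbitrary symmetric test matrix; note that `np.linalg.eigh` stores eigenvectors as columns, so the matrix \(P\) with eigenvectors as rows is its transpose):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
A = M + M.T                         # symmetric test matrix

w, V = np.linalg.eigh(A)            # columns of V are orthonormal eigenvectors
P = V.T                             # rows of P are eigenvectors, as in (a)
D_lam = np.diag(w)                  # D_lambda

assert np.allclose(P.T @ D_lam @ P, A)      # A = P' D_lambda P
assert np.allclose(P @ P.T, np.eye(4))      # P P' = I
assert np.allclose(P.T @ P, np.eye(4))      # P' P = I
```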
(b) Show that \(\max _{x \neq 0} \frac{x^{\prime} A x}{x^{\prime} x}=\) the largest
eigenvalue of \(A\).
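The claim in (b) says the Rayleigh quotient \(x'Ax / x'x\) is maximized at the largest eigenvalue, attained by the corresponding eigenvector. A numerical sketch (arbitrary symmetric test matrix):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((5, 5))
A = M + M.T
w, V = np.linalg.eigh(A)            # eigenvalues in ascending order

lam_max = w[-1]
x_star = V[:, -1]                   # eigenvector for the largest eigenvalue

# The maximum is attained at x_star ...
assert np.isclose(x_star @ A @ x_star / (x_star @ x_star), lam_max)

# ... and random directions never exceed it.
for _ in range(200):
    x = rng.standard_normal(5)
    assert x @ A @ x / (x @ x) <= lam_max + 1e-10
```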
(c) If \(B\) is a positive definite symmetric matrix with eigenvector-eigenvalue
representation \(B=Q^{\prime} D_{\beta} Q\), show that \(\max _{x \neq 0} \frac{x^{\prime} A
x}{x^{\prime} B x}=\) the largest eigenvalue of \(A^{*}\), where \(A^{*}=
D_{\beta}^{-1 / 2} Q A Q^{\prime} D_{\beta}^{-1 / 2}\) and \(D_{\beta}^{-1 /
2}\) is a diagonal matrix whose elements are the reciprocals of the square
roots of the eigenvalues of \(B\). (Positive definiteness of \(B\) ensures these
square roots are real.)
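The substitution behind (c) is \(y = D_{\beta}^{1/2} Q x\), which turns the generalized ratio into an ordinary Rayleigh quotient for \(A^{*}\). A numerical sketch (arbitrary symmetric \(A\) and an arbitrary positive definite \(B\) built as \(NN' + nI\)):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
M = rng.standard_normal((n, n))
A = M + M.T
N = rng.standard_normal((n, n))
B = N @ N.T + n * np.eye(n)         # positive definite by construction

beta, Qc = np.linalg.eigh(B)        # columns of Qc are eigenvectors of B
Q = Qc.T                            # rows are eigenvectors, so B = Q' D_beta Q
D_inv_sqrt = np.diag(1.0 / np.sqrt(beta))
A_star = D_inv_sqrt @ Q @ A @ Q.T @ D_inv_sqrt

wv, Y = np.linalg.eigh(A_star)
lam_max = wv[-1]

# Mapping the top eigenvector of A* back via x = Q' D_beta^{-1/2} y
# attains the maximum of x'Ax / x'Bx.
x_star = Q.T @ D_inv_sqrt @ Y[:, -1]
assert np.isclose((x_star @ A @ x_star) / (x_star @ B @ x_star), lam_max)

# Random directions never exceed it.
for x in rng.standard_normal((500, n)):
    assert (x @ A @ x) / (x @ B @ x) <= lam_max + 1e-10
```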
(d) For any square matrices \(C\) and \(D\) of the same order, show that the eigenvalues of the
matrix \(C D\) are the same as the eigenvalues of the matrix \(D C\), and hence
that \(\max _{x \neq 0} \frac{x^{\prime} A x}{x^{\prime} B x}=\) the largest eigenvalue of
\(A B^{-1}\).
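Both halves of (d) can be checked numerically: the spectra of \(CD\) and \(DC\) coincide, and the largest eigenvalue of \(AB^{-1}\) (real here, since \(AB^{-1}\) is similar to the symmetric matrix \(B^{-1/2}AB^{-1/2}\)) bounds the ratio. A sketch with arbitrary test matrices:

```python
import numpy as np

rng = np.random.default_rng(4)
C = rng.standard_normal((4, 4))
D = rng.standard_normal((4, 4))

# Eigenvalues of CD and DC agree (possibly complex; sort for comparison).
ev_CD = np.sort_complex(np.linalg.eigvals(C @ D))
ev_DC = np.sort_complex(np.linalg.eigvals(D @ C))
assert np.allclose(ev_CD, ev_DC)

# Largest eigenvalue of A B^{-1} bounds x'Ax / x'Bx for B positive definite.
M = rng.standard_normal((4, 4))
A = M + M.T
N = rng.standard_normal((4, 4))
B = N @ N.T + 4 * np.eye(4)
lam = np.linalg.eigvals(A @ np.linalg.inv(B))
lam_max = np.max(lam.real)          # spectrum is real for this A, B

for x in rng.standard_normal((500, 4)):
    assert (x @ A @ x) / (x @ B @ x) <= lam_max + 1e-10
```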
(e) If \(A=a a^{\prime}\), where \(a\) is an \(n \times 1\) vector (\(A\) is thus a
rank-one matrix), show that \(\max _{x \neq 0} \frac{x^{\prime} a a^{\prime} x}{x^{\prime}
B x}=a^{\prime} B^{-1} a\).
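In the rank-one case the maximum \(a'B^{-1}a\) is attained at \(x = B^{-1}a\), which follows from the Cauchy-Schwarz inequality in the inner product defined by \(B\). A numerical sketch (arbitrary \(a\) and positive definite \(B\)):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 4
a = rng.standard_normal(n)
N = rng.standard_normal((n, n))
B = N @ N.T + n * np.eye(n)         # positive definite by construction

bound = a @ np.linalg.inv(B) @ a    # claimed maximum a' B^{-1} a

# The maximizer is x = B^{-1} a; note x' a a' x = (a'x)^2.
x_star = np.linalg.solve(B, a)
val = (x_star @ a) ** 2 / (x_star @ B @ x_star)
assert np.isclose(val, bound)

# Random directions never exceed the bound.
for x in rng.standard_normal((500, n)):
    assert (x @ a) ** 2 / (x @ B @ x) <= bound + 1e-10
```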