Chapter 13: Problem 51
Suppose \(T_{1}\) and \(T_{2}\) are self-adjoint. Show that \(T_{1} T_{2}\) is self-adjoint if and only if \(T_{1}\) and \(T_{2}\) commute; that is, \(T_{1} T_{2}=T_{2} T_{1}\).
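A brief sketch of the argument (an outline, not the book's worked solution): because \(T_{1}\) and \(T_{2}\) are self-adjoint, the adjoint of the product is

$$
\left(T_{1} T_{2}\right)^{*}=T_{2}^{*} T_{1}^{*}=T_{2} T_{1}
$$

so \(T_{1} T_{2}\) is self-adjoint, i.e. \(\left(T_{1} T_{2}\right)^{*}=T_{1} T_{2}\), precisely when \(T_{1} T_{2}=T_{2} T_{1}\).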
Let \(V\) be an inner product space. Recall that each \(u \in V\) determines a linear functional \(\hat{u}\) in the dual space \(V^{*}\) by the definition \(\hat{u}(v)=\langle v, u\rangle\) for every \(v \in V\). (See the text immediately preceding Theorem 13.3.) Show that the map \(u \mapsto \hat{u}\) is linear and nonsingular, and hence an isomorphism from \(V\) onto \(V^{*}\).
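One possible outline of the computation (assuming a real inner product space, so that no conjugation appears): for \(u_{1}, u_{2}, u \in V\), a scalar \(k\), and every \(v \in V\),

$$
\widehat{u_{1}+u_{2}}(v)=\left\langle v, u_{1}+u_{2}\right\rangle=\hat{u}_{1}(v)+\hat{u}_{2}(v), \qquad \widehat{k u}(v)=\langle v, k u\rangle=k \hat{u}(v)
$$

which gives linearity; and if \(\hat{u}=0\), then \(\langle u, u\rangle=\hat{u}(u)=0\), so \(u=0\). Hence the map is nonsingular, and since \(\operatorname{dim} V=\operatorname{dim} V^{*}\) for finite-dimensional \(V\), it is an isomorphism onto \(V^{*}\).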
Prove that the products and inverses of orthogonal matrices are orthogonal. (Thus, the orthogonal matrices form a group under multiplication, called the orthogonal group.)
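A quick sketch using the characterization \(P^{T} P=I\) of an orthogonal matrix (not the book's full solution): for orthogonal \(P\) and \(Q\),

$$
(P Q)^{T}(P Q)=Q^{T} P^{T} P Q=Q^{T} Q=I, \qquad \left(P^{-1}\right)^{T} P^{-1}=\left(P^{T}\right)^{T} P^{T}=P P^{T}=I
$$

so both \(P Q\) and \(P^{-1}=P^{T}\) are orthogonal.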
Let \(T\) be a symmetric operator. Show that (a) the characteristic polynomial \(\Delta(t)\) of \(T\) is a product of linear polynomials (over \(\mathbf{R}\)); (b) \(T\) has a nonzero eigenvector.

(a) Let \(A\) be a matrix representing \(T\) relative to an orthonormal basis of \(V\); then \(A=A^{T}\). Let \(\Delta(t)\) be the characteristic polynomial of \(A\). Viewing \(A\) as a complex self-adjoint operator, \(A\) has only real eigenvalues by Theorem 13.4. Thus, $$ \Delta(t)=\left(t-\lambda_{1}\right)\left(t-\lambda_{2}\right) \cdots\left(t-\lambda_{n}\right) $$ where the \(\lambda_{i}\) are all real. In other words, \(\Delta(t)\) is a product of linear polynomials over \(\mathbf{R}\).

(b) By (a), \(T\) has at least one (real) eigenvalue. Hence, \(T\) has a nonzero eigenvector.
Determine which of the following matrices are positive (positive definite): (i) \(\left[\begin{array}{ll}1 & 1 \\ 1 & 1\end{array}\right]\), (ii) \(\left[\begin{array}{rr}0 & i \\ -i & 0\end{array}\right]\), (iii) \(\left[\begin{array}{rr}0 & 1 \\ -1 & 0\end{array}\right]\), (iv) \(\left[\begin{array}{ll}1 & 1 \\ 0 & 1\end{array}\right]\), (v) \(\left[\begin{array}{ll}1 & 1 \\ 1 & 2\end{array}\right]\), (vi) \(\left[\begin{array}{ll}1 & 2 \\ 2 & 1\end{array}\right]\).
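One standard test worth recalling here (a hint, not the book's worked solution): a Hermitian \(2 \times 2\) matrix \(\left[\begin{array}{ll}a & b \\ \bar{b} & c\end{array}\right]\) is positive definite exactly when \(a>0\) and \(a c-|b|^{2}>0\). For matrix (vi), for example,

$$
\operatorname{det}\left[\begin{array}{ll}1 & 2 \\ 2 & 1\end{array}\right]=1 \cdot 1-2 \cdot 2=-3<0
$$

so (vi) is not positive definite; the remaining matrices can be checked the same way once symmetry (Hermitian-ness) has been verified.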
Suppose \(T\) is self-adjoint. Show that \(T^{2}(v)=0\) implies \(T(v)=0\). Use this to prove that \(T^{n}(v)=0\) also implies \(T(v)=0\) for \(n>0\).
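A possible sketch of the first claim (a standard inner-product argument for self-adjoint operators, not necessarily the book's): if \(T^{2}(v)=0\), then

$$
0=\left\langle T^{2}(v), v\right\rangle=\left\langle T(v), T^{*}(v)\right\rangle=\langle T(v), T(v)\rangle=\|T(v)\|^{2}
$$

so \(T(v)=0\). For \(n>1\), the operator \(T^{n-1}\) is also self-adjoint, and \(T^{n}(v)=0\) gives \(\left(T^{n-1}\right)^{2}(v)=T^{2 n-2}(v)=0\); hence \(T^{n-1}(v)=0\), and repeating the argument eventually yields \(T(v)=0\).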