Chapter 10: Problem 64
Prove that two \(3 \times 3\) matrices with the same minimal and characteristic polynomials are similar.
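As a numerical sanity check on one direction of the statement (a sketch, not the proof): similar matrices always share the same characteristic polynomial. The helper `char_poly_3x3`, the sample matrix `A`, and the permutation matrix `P` below are illustrative choices, not part of the original problem; for a \(3 \times 3\) matrix the characteristic polynomial is \(\lambda^3 - (\operatorname{tr} A)\lambda^2 + m\lambda - \det A\), where \(m\) is the sum of the principal \(2 \times 2\) minors.

```python
from fractions import Fraction

def char_poly_3x3(A):
    """Coefficients (1, c2, c1, c0) of det(tI - A) for a 3x3 matrix A."""
    tr = A[0][0] + A[1][1] + A[2][2]
    # sum of the three principal 2x2 minors
    m = (A[1][1]*A[2][2] - A[1][2]*A[2][1]
         + A[0][0]*A[2][2] - A[0][2]*A[2][0]
         + A[0][0]*A[1][1] - A[0][1]*A[1][0])
    det = (A[0][0]*(A[1][1]*A[2][2] - A[1][2]*A[2][1])
           - A[0][1]*(A[1][0]*A[2][2] - A[1][2]*A[2][0])
           + A[0][2]*(A[1][0]*A[2][1] - A[1][1]*A[2][0]))
    return (1, -tr, m, -det)

def matmul(A, B):
    return [[sum(A[i][k]*B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

A = [[Fraction(x) for x in row] for row in [[2, 1, 0], [0, 2, 0], [0, 0, 3]]]
# P is a permutation matrix, so P^{-1} = P^T; B = P A P^{-1} is similar to A
P  = [[0, 1, 0], [0, 0, 1], [1, 0, 0]]
PT = [[0, 0, 1], [1, 0, 0], [0, 1, 0]]
B = matmul(matmul(P, A), PT)

print(char_poly_3x3(A) == char_poly_3x3(B))  # True: similar => same char poly
```

Exact `Fraction` arithmetic avoids floating-point noise, so the two coefficient tuples can be compared with `==`. The theorem's actual content is the converse for the \(3 \times 3\) case, which requires the minimal polynomial as well.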
Let \(V\) be a vector space of odd dimension (greater than 1) over the real field \(\mathbf{R}\). Show that any linear operator on \(V\) has an invariant subspace other than \(V\) or \(\{0\}\).
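The heart of this problem is that the characteristic polynomial of the operator has odd degree and real coefficients, hence a real root \(\lambda\); an eigenvector for \(\lambda\) then spans a one-dimensional invariant subspace. The sketch below (using an arbitrarily chosen cubic, not tied to any particular operator) illustrates the root-existence step numerically: an odd-degree real polynomial changes sign for large \(|x|\), so bisection locates a real root.

```python
def p(x):
    # an arbitrary cubic with real coefficients (illustrative example)
    return x**3 - 2*x**2 - 5*x + 1

# Odd degree: p(x) -> -inf as x -> -inf and +inf as x -> +inf,
# so a sign change (hence a real root) is guaranteed.
lo, hi = -10.0, 10.0
assert p(lo) < 0 < p(hi)
for _ in range(60):          # bisection to machine precision
    mid = (lo + hi) / 2
    if p(mid) < 0:
        lo = mid
    else:
        hi = mid
root = (lo + hi) / 2
print(abs(p(root)) < 1e-9)  # True
```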
Suppose \(\dim V=n\). Show that \(T: V \rightarrow V\) has a triangular matrix representation if and only if there exist \(T\)-invariant subspaces \(W_{1} \subset W_{2} \subset \cdots \subset W_{n}=V\) for which \(\dim W_{k}=k\), \(k=1, \ldots, n\).
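One direction of this equivalence can be seen concretely: if the matrix of \(T\) in a basis \(e_1, \ldots, e_n\) is upper triangular, then \(W_k = \operatorname{span}(e_1, \ldots, e_k)\) is \(T\)-invariant, because column \(j\) of the matrix has zeros below row \(j\). A minimal sketch (the matrix `A` is an arbitrary illustrative choice):

```python
# an arbitrary 4x4 upper triangular matrix (illustrative)
A = [[1, 2, 3, 4],
     [0, 5, 6, 7],
     [0, 0, 8, 9],
     [0, 0, 0, 10]]
n = 4

def apply(A, v):
    return [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]

# W_k = span(e_1, ..., e_k): vectors whose entries vanish from index k on.
# T-invariance of each W_k follows because A e_j has zeros below entry j.
for j in range(n):
    e_j = [1 if i == j else 0 for i in range(n)]
    image = apply(A, e_j)
    print(all(image[i] == 0 for i in range(j + 1, n)))  # True for every j
```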
Prove the following: The cosets of \(W\) in \(V\) partition \(V\) into mutually disjoint sets. That is, (a) any two cosets \(u+W\) and \(v+W\) are either identical or disjoint; (b) each \(v \in V\) belongs to a coset; in fact, \(v \in v+W\). Furthermore, \(u+W=v+W\) if and only if \(u-v \in W\), and so \((v+w)+W=v+W\) for any \(w \in W\).

Let \(v \in V\). Because \(0 \in W\), we have \(v=v+0 \in v+W\), which proves (b).

Now suppose the cosets \(u+W\) and \(v+W\) are not disjoint; say, the vector \(x\) belongs to both \(u+W\) and \(v+W\). Then \(u-x \in W\) and \(x-v \in W\). The proof of (a) is complete if we show that \(u+W=v+W\). Let \(u+w_{0}\) be any element in the coset \(u+W\). Because \(u-x\), \(x-v\), and \(w_{0}\) all belong to \(W\),
\[ \left(u+w_{0}\right)-v=(u-x)+(x-v)+w_{0} \in W \]
Thus \(u+w_{0} \in v+W\), and hence the coset \(u+W\) is contained in the coset \(v+W\). Similarly, \(v+W\) is contained in \(u+W\), and so \(u+W=v+W\).

The last statement follows from the fact that \(u+W=v+W\) if and only if \(u \in v+W\), and, by Problem 10.21, this is equivalent to \(u-v \in W\).
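The partition can be checked exhaustively in a small finite vector space. Below, \(V = (\mathbb{Z}_5)^2\) and \(W\) is the line spanned by \((1, 2)\) (both arbitrary illustrative choices): the distinct cosets are pairwise disjoint, cover \(V\), and satisfy the criterion \(u+W = v+W\) if and only if \(u-v \in W\).

```python
from itertools import product

p = 5
# W = span{(1, 2)} inside V = (Z_5)^2
W = {(t % p, (2 * t) % p) for t in range(p)}
V = list(product(range(p), repeat=2))

def coset(v):
    return frozenset(((v[0] + w[0]) % p, (v[1] + w[1]) % p) for w in W)

cosets = {coset(v) for v in V}

# (a)+(b): the distinct cosets partition V
print(len(cosets))                            # 5 cosets, each of size 5
print(sum(len(c) for c in cosets) == len(V))  # True: disjoint and cover V
# the equality criterion: u+W == v+W  iff  u-v in W
u, v = (3, 1), (2, 4)
diff = ((u[0] - v[0]) % p, (u[1] - v[1]) % p)
print((coset(u) == coset(v)) == (diff in W))  # True
```

Because distinct cosets are disjoint, 5 cosets of size 5 summing to 25 elements is exactly the partition statement being proved.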
Suppose \(E: V \rightarrow V\) is linear and \(E^{2}=E\). Show that (a) \(E(u)=u\) for any \(u \in \operatorname{Im} E\) (i.e., the restriction of \(E\) to its image is the identity mapping); (b) \(V\) is the direct sum of the image and kernel of \(E\): \(V=\operatorname{Im} E \oplus \operatorname{Ker} E\); (c) \(E\) is the projection of \(V\) onto its image \(\operatorname{Im} E\). Thus, by the preceding problem, a linear mapping \(T: V \rightarrow V\) is a projection if and only if \(T^{2}=T\); this characterization of a projection is frequently used as its definition.
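A minimal numerical sketch of (a)–(c), using an arbitrarily chosen \(2 \times 2\) idempotent matrix \(E\) (the particular matrix and vector are illustrative): \(E^2 = E\), \(E\) fixes its image, and every \(v\) splits as \(Ev + (v - Ev)\) with \(Ev \in \operatorname{Im} E\) and \(v - Ev \in \operatorname{Ker} E\).

```python
E = [[1, 1],
     [0, 0]]   # idempotent: E^2 = E (an illustrative example)

def apply(E, v):
    return [E[0][0]*v[0] + E[0][1]*v[1],
            E[1][0]*v[0] + E[1][1]*v[1]]

def matmul(A, B):
    return [[sum(A[i][k]*B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

print(matmul(E, E) == E)        # True: E^2 = E

# (a) E fixes its image: for any v, E(Ev) = Ev
v = [3, 4]
Ev = apply(E, v)
print(apply(E, Ev) == Ev)       # True

# (b) v = Ev + (v - Ev), with Ev in Im E and v - Ev in Ker E
k = [v[0] - Ev[0], v[1] - Ev[1]]
print(apply(E, k) == [0, 0])    # True: the remainder lies in Ker E
```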
Prove Theorem 10.4: Suppose \(W_{1}, W_{2}, \ldots, W_{r}\) are subspaces of \(V\) with respective bases
\[ B_{1}=\{w_{11}, w_{12}, \ldots, w_{1 n_{1}}\}, \quad \ldots, \quad B_{r}=\{w_{r 1}, w_{r 2}, \ldots, w_{r n_{r}}\} \]
Then \(V\) is the direct sum of the \(W_{i}\) if and only if the union \(B=\bigcup_{i} B_{i}\) is a basis of \(V\).

Suppose \(B\) is a basis of \(V\). Then, for any \(v \in V\),
\[ v=a_{11} w_{11}+\cdots+a_{1 n_{1}} w_{1 n_{1}}+\cdots+a_{r 1} w_{r 1}+\cdots+a_{r n_{r}} w_{r n_{r}}=w_{1}+w_{2}+\cdots+w_{r} \]
where \(w_{i}=a_{i 1} w_{i 1}+\cdots+a_{i n_{i}} w_{i n_{i}} \in W_{i}\). We next show that such a sum is unique. Suppose
\[ v=w_{1}^{\prime}+w_{2}^{\prime}+\cdots+w_{r}^{\prime}, \quad \text{where} \quad w_{i}^{\prime} \in W_{i} \]
Because \(\{w_{i 1}, \ldots, w_{i n_{i}}\}\) is a basis of \(W_{i}\), \(w_{i}^{\prime}=b_{i 1} w_{i 1}+\cdots+b_{i n_{i}} w_{i n_{i}}\), and so
\[ v=b_{11} w_{11}+\cdots+b_{1 n_{1}} w_{1 n_{1}}+\cdots+b_{r 1} w_{r 1}+\cdots+b_{r n_{r}} w_{r n_{r}} \]
Because \(B\) is a basis of \(V\), \(a_{i j}=b_{i j}\) for each \(i\) and each \(j\). Hence \(w_{i}=w_{i}^{\prime}\), and so the sum for \(v\) is unique. Accordingly, \(V\) is the direct sum of the \(W_{i}\).

Conversely, suppose \(V\) is the direct sum of the \(W_{i}\). Then for any \(v \in V\), \(v=w_{1}+\cdots+w_{r}\), where \(w_{i} \in W_{i}\). Because \(\{w_{i j}\}\) is a basis of \(W_{i}\), each \(w_{i}\) is a linear combination of the \(w_{i j}\), and so \(v\) is a linear combination of the elements of \(B\). Thus, \(B\) spans \(V\). We now show that \(B\) is linearly independent. Suppose
\[ a_{11} w_{11}+\cdots+a_{1 n_{1}} w_{1 n_{1}}+\cdots+a_{r 1} w_{r 1}+\cdots+a_{r n_{r}} w_{r n_{r}}=0 \]
Note that \(a_{i 1} w_{i 1}+\cdots+a_{i n_{i}} w_{i n_{i}} \in W_{i}\). We also have \(0=0+0+\cdots+0\) with each summand \(0 \in W_{i}\). Because such a sum for \(0\) is unique,
\[ a_{i 1} w_{i 1}+\cdots+a_{i n_{i}} w_{i n_{i}}=0 \quad \text{for } i=1, \ldots, r \]
The independence of the bases \(\{w_{i j}\}\) implies that all the \(a\)'s are \(0\). Thus, \(B\) is linearly independent and is a basis of \(V\).
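The criterion can be checked concretely in \(\mathbb{R}^3\). With the arbitrary illustrative choices \(W_1 = \operatorname{span}\{(1,0,0)\}\) and \(W_2 = \operatorname{span}\{(1,1,0), (0,1,1)\}\), the union of the two bases has nonzero determinant, so it is a basis of \(\mathbb{R}^3\) and hence \(\mathbb{R}^3 = W_1 \oplus W_2\):

```python
def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion along row 0."""
    return (M[0][0]*(M[1][1]*M[2][2] - M[1][2]*M[2][1])
            - M[0][1]*(M[1][0]*M[2][2] - M[1][2]*M[2][0])
            + M[0][2]*(M[1][0]*M[2][1] - M[1][1]*M[2][0]))

B1 = [(1, 0, 0)]             # basis of W_1 (illustrative choice)
B2 = [(1, 1, 0), (0, 1, 1)]  # basis of W_2 (illustrative choice)

# rows of M are the vectors of B = B1 ∪ B2
M = [list(v) for v in B1 + B2]
print(det3(M) != 0)  # True: B is a basis, so R^3 = W_1 (+) W_2
```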