Chapter 10: Problem 48
Suppose \(A\) is a supertriangular matrix (i.e., all entries on and below the main diagonal are 0). Show that \(A\) is nilpotent.
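The claim can be checked numerically for a concrete supertriangular matrix. The sketch below (illustrative only, not a proof; the particular \(4 \times 4\) entries are made up) shows that each multiplication by \(A\) pushes the nonzero band one step further above the diagonal, so \(A^{n}=0\) for an \(n \times n\) supertriangular matrix:

```python
import numpy as np

# A 4x4 supertriangular (strictly upper triangular) matrix:
# every entry on and below the main diagonal is 0.
A = np.array([
    [0, 2, 5, 1],
    [0, 0, 3, 4],
    [0, 0, 0, 7],
    [0, 0, 0, 0],
], dtype=float)

# Each multiplication by A shifts the nonzero band one step
# further above the diagonal, so A^4 = 0 here.
P = np.eye(4)
for k in range(1, 5):
    P = P @ A
    print(f"A^{k} is zero: {np.allclose(P, 0)}")
```

Running this prints `False` for \(A^{1}, A^{2}, A^{3}\) and `True` for \(A^{4}\), matching the general fact that the index of nilpotency of an \(n \times n\) supertriangular matrix is at most \(n\).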
Show that \(V=W_{1} \oplus \cdots \oplus W_{r}\) if and only if (i) \(V=\operatorname{span}\left(W_{1}, \ldots, W_{r}\right)\) and (ii) for \(k=1,2, \ldots, r\), \(W_{k} \cap \operatorname{span}\left(W_{1}, \ldots, W_{k-1}, W_{k+1}, \ldots, W_{r}\right)=\\{0\\}\).
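For concrete subspaces given by spanning vectors, condition (ii) can be checked with a rank computation, since \(W_{k} \cap \operatorname{span}(\text{rest})=\\{0\\}\) exactly when \(\operatorname{dim} W_{k}+\operatorname{dim} \operatorname{span}(\text{rest})=\operatorname{dim}\left(W_{k}+\operatorname{span}(\text{rest})\right)\). A sketch in NumPy (the particular subspaces of \(\mathbb{R}^{3}\) are made up for illustration):

```python
import numpy as np

# Subspaces of R^3, each given by spanning vectors as columns.
W = [
    np.array([[1.0], [0.0], [0.0]]),   # W1 = span(e1)
    np.array([[0.0], [1.0], [0.0]]),   # W2 = span(e2)
    np.array([[0.0], [0.0], [1.0]]),   # W3 = span(e3)
]

def rank(M):
    return np.linalg.matrix_rank(M)

# (i) V = span(W1, ..., Wr): the stacked columns span R^3.
all_cols = np.hstack(W)
print("(i) holds:", rank(all_cols) == 3)

# (ii) W_k intersects span(rest) trivially iff the dimensions add:
# dim W_k + dim span(rest) == dim (W_k + span(rest)).
for k in range(len(W)):
    rest = np.hstack([W[j] for j in range(len(W)) if j != k])
    ok = rank(W[k]) + rank(rest) == rank(np.hstack([W[k], rest]))
    print(f"(ii) holds for k={k + 1}:", ok)
```

Both conditions hold here, so \(\mathbb{R}^{3}=W_{1} \oplus W_{2} \oplus W_{3}\), as expected for the coordinate axes.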
Prove Theorem 10.9: A linear operator \(T: V \rightarrow V\) has a diagonal matrix representation if and only if its minimal polynomial \(m(t)\) is a product of distinct linear polynomials.

Suppose \(m(t)\) is a product of distinct linear polynomials, say, \\[ m(t)=\left(t-\lambda_{1}\right)\left(t-\lambda_{2}\right) \cdots\left(t-\lambda_{r}\right) \\] where the \(\lambda_{i}\) are distinct scalars. By the Primary Decomposition Theorem, \(V\) is the direct sum of subspaces \(W_{1}, \ldots, W_{r}\), where \(W_{i}=\operatorname{Ker}\left(T-\lambda_{i} I\right)\). Thus, if \(v \in W_{i}\), then \(\left(T-\lambda_{i} I\right)(v)=0\), or \(T(v)=\lambda_{i} v\). In other words, every vector in \(W_{i}\) is an eigenvector belonging to the eigenvalue \(\lambda_{i}\). By Theorem 10.4, the union of bases of \(W_{1}, \ldots, W_{r}\) is a basis of \(V\). This basis consists of eigenvectors, and so \(T\) is diagonalizable.

Conversely, suppose \(T\) is diagonalizable (i.e., \(V\) has a basis consisting of eigenvectors of \(T\)). Let \(\lambda_{1}, \ldots, \lambda_{s}\) be the distinct eigenvalues of \(T\). Then the operator \\[ f(T)=\left(T-\lambda_{1} I\right)\left(T-\lambda_{2} I\right) \cdots\left(T-\lambda_{s} I\right) \\] maps each basis vector to \(0\). Thus \(f(T)=\mathbf{0}\), and hence the minimal polynomial \(m(t)\) of \(T\) divides the polynomial \\[ f(t)=\left(t-\lambda_{1}\right)\left(t-\lambda_{2}\right) \cdots\left(t-\lambda_{s}\right) \\] Accordingly, \(m(t)\) is a product of distinct linear polynomials.
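Both directions of the theorem can be seen numerically on small matrices. In the sketch below (illustrative examples chosen for this purpose, not part of the original solution), a \(2 \times 2\) matrix with distinct eigenvalues 2 and 3 is annihilated by the product of distinct linear factors \((t-2)(t-3)\), while a non-diagonalizable Jordan block needs the repeated factor \((t-2)^{2}\):

```python
import numpy as np

I = np.eye(2)

# Diagonalizable: distinct eigenvalues 2 and 3, so the minimal
# polynomial is m(t) = (t - 2)(t - 3), a product of distinct
# linear factors, and m(A) must be the zero matrix.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
print(np.allclose((A - 2 * I) @ (A - 3 * I), 0))   # True

# Not diagonalizable: a Jordan block with eigenvalue 2.  Here
# (t - 2) alone does NOT annihilate J; the minimal polynomial
# is (t - 2)^2, which has a repeated linear factor.
J = np.array([[2.0, 1.0],
              [0.0, 2.0]])
print(np.allclose(J - 2 * I, 0))                   # False
print(np.allclose((J - 2 * I) @ (J - 2 * I), 0))   # True
```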
Let \(W\) be a subspace of \(V\). Suppose the set of cosets \(\left\\{v_{1}+W, \quad v_{2}+W, \ldots, v_{n}+W\right\\}\) in \(V / W\) is linearly independent. Show that the set of vectors \(\left\\{v_{1}, v_{2}, \ldots, v_{n}\right\\}\) in \(V\) is also linearly independent.
Let \(T: V \rightarrow V\) be linear. Let \(W\) be a \(T\)-invariant subspace of \(V\) and \(\bar{T}\) the induced operator on \(V / W\). Prove: (a) The \(T\)-annihilator of \(v \in V\) divides the minimal polynomial of \(T\). (b) The \(\bar{T}\)-annihilator of \(\bar{v} \in V / W\) divides the minimal polynomial of \(T\).

(a) The \(T\)-annihilator of \(v \in V\) is the minimal polynomial of the restriction of \(T\) to \(Z(v, T)\); therefore, by Problem 10.6, it divides the minimal polynomial of \(T\).

(b) The \(\bar{T}\)-annihilator of \(\bar{v} \in V / W\) divides the minimal polynomial of \(\bar{T}\), which divides the minimal polynomial of \(T\) by Theorem 10.16.

Remark: In the case where the minimal polynomial of \(T\) is \(f(t)^{n}\), where \(f(t)\) is a monic irreducible polynomial, the \(T\)-annihilator of \(v \in V\) and the \(\bar{T}\)-annihilator of \(\bar{v} \in V / W\) are of the form \(f(t)^{m}\), where \(m \leq n\).
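The Remark's special case \(f(t)=t\) is easy to see concretely. For a nilpotent Jordan block with minimal polynomial \(t^{3}\), the \(T\)-annihilator of each basis vector is \(t^{m}\) with \(m \leq 3\). A sketch (the helper `annihilator_exponent` is made up for this illustration; it searches for the smallest \(m\) with \(T^{m} v=0\)):

```python
import numpy as np

# Nilpotent 3x3 Jordan block: minimal polynomial is t^3,
# i.e. f(t) = t with n = 3 in the Remark above.
T = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])

def annihilator_exponent(T, v, n_max=10):
    """Smallest m with T^m v = 0; the T-annihilator of v is then t^m."""
    for m in range(n_max + 1):
        if np.allclose(np.linalg.matrix_power(T, m) @ v, 0):
            return m
    raise ValueError("no annihilating power found")

for i, v in enumerate(np.eye(3)):
    m = annihilator_exponent(T, v)
    print(f"T-annihilator of e{i + 1} is t^{m} (divides t^3)")
```

Here \(e_{1}, e_{2}, e_{3}\) have annihilators \(t, t^{2}, t^{3}\) respectively, each dividing the minimal polynomial \(t^{3}\).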
Prove Theorem 10.12: Let \(Z(v, T)\) be a \(T\)-cyclic subspace, \(T_{v}\) the restriction of \(T\) to \(Z(v, T)\), and \(m_{v}(t)=t^{k}+a_{k-1} t^{k-1}+\cdots+a_{0}\) the \(T\)-annihilator of \(v\). Then:

(i) The set \(\left\\{v, T(v), \ldots, T^{k-1}(v)\right\\}\) is a basis of \(Z(v, T)\); hence, \(\operatorname{dim} Z(v, T)=k\).

(ii) The minimal polynomial of \(T_{v}\) is \(m_{v}(t)\).

(iii) The matrix of \(T_{v}\) in the above basis is the companion matrix \(C=C\left(m_{v}\right)\) of \(m_{v}(t)\) [which has 1's below the diagonal, the negatives of the coefficients \(a_{0}, a_{1}, \ldots, a_{k-1}\) of \(m_{v}(t)\) in the last column, and 0's elsewhere].

(i) By definition of \(m_{v}(t)\), \(T^{k}(v)\) is the first vector in the sequence \(v, T(v), T^{2}(v), \ldots\) that is a linear combination of the vectors preceding it in the sequence; hence, the set \(B=\left\\{v, T(v), \ldots, T^{k-1}(v)\right\\}\) is linearly independent. We need only show that \(Z(v, T)=L(B)\), the linear span of \(B\). By the above, \(T^{k}(v) \in L(B)\). We prove by induction that \(T^{n}(v) \in L(B)\) for every \(n\). Suppose \(n>k\) and \(T^{n-1}(v) \in L(B)\); that is, \(T^{n-1}(v)\) is a linear combination of \(v, \ldots, T^{k-1}(v)\). Then \(T^{n}(v)=T\left(T^{n-1}(v)\right)\) is a linear combination of \(T(v), \ldots, T^{k}(v)\). But \(T^{k}(v) \in L(B)\); hence, \(T^{n}(v) \in L(B)\) for every \(n\). Consequently, \(f(T)(v) \in L(B)\) for any polynomial \(f(t)\). Thus, \(Z(v, T)=L(B)\), and so \(B\) is a basis, as claimed.

(ii) Suppose \(m(t)=t^{s}+b_{s-1} t^{s-1}+\cdots+b_{0}\) is the minimal polynomial of \(T_{v}\). Then, because \(v \in Z(v, T)\), \\[ 0=m\left(T_{v}\right)(v)=m(T)(v)=T^{s}(v)+b_{s-1} T^{s-1}(v)+\cdots+b_{0} v \\] Thus, \(T^{s}(v)\) is a linear combination of \(v, T(v), \ldots, T^{s-1}(v)\), and therefore \(k \leq s\).
However, \(m_{v}(T)\) annihilates every vector of \(Z(v, T)\), and so \(m_{v}\left(T_{v}\right)=\mathbf{0}\). Then \(m(t)\) divides \(m_{v}(t)\), and so \(s \leq k\). Accordingly, \(k=s\), and hence \(m_{v}(t)=m(t)\).

(iii) \\[ \begin{array}{ll} T_{v}(v) & =T(v) \\ T_{v}(T(v)) & =T^{2}(v) \\ \cdots\cdots\cdots\cdots & \cdots\cdots\cdots\cdots \\ T_{v}\left(T^{k-2}(v)\right) & =T^{k-1}(v) \\ T_{v}\left(T^{k-1}(v)\right) & =T^{k}(v)=-a_{0} v-a_{1} T(v)-a_{2} T^{2}(v)-\cdots-a_{k-1} T^{k-1}(v) \end{array} \\] By definition, the matrix of \(T_{v}\) in this basis is the transpose of the matrix of coefficients of the above system of equations; hence, it is \(C\), as required.
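The companion-matrix convention of part (iii) can be checked directly: build \(C\left(m_{v}\right)\) for a sample polynomial and verify \(m_{v}(C)=\mathbf{0}\). A sketch (the polynomial \(m(t)=t^{3}-6 t^{2}+11 t-6\) is chosen arbitrarily for illustration):

```python
import numpy as np

def companion(coeffs):
    """Companion matrix C(m) of m(t) = t^k + a_{k-1} t^{k-1} + ... + a_0,
    with 1's below the diagonal and -a_0, ..., -a_{k-1} in the last
    column, the convention used in Theorem 10.12."""
    a = np.asarray(coeffs, dtype=float)   # [a_0, ..., a_{k-1}]
    k = len(a)
    C = np.zeros((k, k))
    C[1:, :-1] = np.eye(k - 1)            # 1's below the diagonal
    C[:, -1] = -a                         # negatives of the coefficients
    return C

# Example: m(t) = t^3 - 6t^2 + 11t - 6, so [a_0, a_1, a_2] = [-6, 11, -6].
a = [-6.0, 11.0, -6.0]
C = companion(a)

# Verify m(C) = C^3 + a_2 C^2 + a_1 C + a_0 I = 0.
mC = sum(c * np.linalg.matrix_power(C, i) for i, c in enumerate(a))
mC = mC + np.linalg.matrix_power(C, len(a))
print(np.allclose(mC, 0))   # True
```

This reflects the general fact that the minimal (and characteristic) polynomial of \(C\left(m_{v}\right)\) is \(m_{v}(t)\) itself.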