Chapter 2: Problem 12
Let \(V\) be a finite-dimensional vector space with the ordered basis \(\beta\). Prove that \(\psi(\beta)=\beta^{* *}\), where \(\psi\) is defined in Theorem \(2.26\).
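A proof sketch in the book's notation, assuming Theorem 2.26 defines \(\psi: V \rightarrow V^{**}\) by \(\psi(x)=\hat{x}\), where \(\hat{x}(f)=f(x)\) for every \(f \in V^{*}\):

```latex
Let $\beta = \{x_1, \dots, x_n\}$, let $\beta^{*} = \{f_1, \dots, f_n\}$ be the
dual basis of $\beta$ (so $f_j(x_i) = \delta_{ij}$), and let
$\beta^{**} = \{g_1, \dots, g_n\}$ be the dual basis of $\beta^{*}$
(so $g_i(f_j) = \delta_{ij}$). For each $i$ and $j$,
\[
  \psi(x_i)(f_j) = \hat{x}_i(f_j) = f_j(x_i) = \delta_{ij} = g_i(f_j).
\]
Since $\psi(x_i)$ and $g_i$ agree on the basis $\beta^{*}$ of $V^{*}$, they are
equal as linear functionals on $V^{*}$. Hence $\psi(x_i) = g_i$ for every $i$,
and therefore $\psi(\beta) = \beta^{**}$.
```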
Related exercises
Suppose \(A\) and \(B\) are orthogonal matrices. Show that \(A^{T}\), \(A^{-1}\), and \(A B\) are also orthogonal.
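The three claims are easy to check numerically. A sketch with NumPy; the rotation and permutation matrices and the `is_orthogonal` helper are illustrative choices, not from the text:

```python
import numpy as np

def is_orthogonal(M: np.ndarray, tol: float = 1e-10) -> bool:
    """A square matrix M is orthogonal when M^T M = I."""
    n = M.shape[0]
    return np.allclose(M.T @ M, np.eye(n), atol=tol)

# Two example orthogonal matrices: a plane rotation and a permutation.
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

assert is_orthogonal(A) and is_orthogonal(B)
assert is_orthogonal(A.T)               # the transpose is orthogonal
assert is_orthogonal(np.linalg.inv(A))  # the inverse is orthogonal
assert is_orthogonal(A @ B)             # the product is orthogonal
print("all checks passed")
```

The checks mirror the algebraic proofs: \(A^{T} A = I\) gives \(A^{-1} = A^{T}\), and \((AB)^{T}(AB) = B^{T}A^{T}AB = B^{T}B = I\).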
Let \(A\) be an \(n \times n\) matrix. (a) Suppose that \(A^{2}=O\). Prove that \(A\) is not invertible. (b) Suppose that \(A B=O\) for some nonzero \(n \times n\) matrix \(B\). Could \(A\) be invertible? Explain.
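A sketch of both arguments, assuming \(n \ge 1\) so that the zero matrix is not invertible:

```latex
(a) Suppose $A$ were invertible. Multiplying $A^2 = O$ on the left by
$A^{-1}$ gives $A = A^{-1}A^2 = A^{-1}O = O$, and the zero matrix is not
invertible; contradiction. Hence $A$ is not invertible.

(b) No. If $A$ were invertible, then from $AB = O$ we would get
$B = A^{-1}(AB) = A^{-1}O = O$, contradicting the assumption $B \neq O$.
```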
Let \(V\) be a finite-dimensional vector space, and let \(T: V \rightarrow V\) be linear. (a) If \(\operatorname{rank}(T)=\operatorname{rank}\left(T^{2}\right)\), prove that \(R(T) \cap N(T)=\{0\}\). Deduce that \(V=R(T) \oplus N(T)\) (see the exercises of Section 1.3). (b) Prove that \(V=R\left(T^{k}\right) \oplus N\left(T^{k}\right)\) for some positive integer \(k\).
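A proof sketch, writing \(n = \dim V\) and using the dimension (rank-nullity) theorem:

```latex
(a) Always $N(T) \subseteq N(T^2)$, and by the dimension theorem
$\dim N(T) = n - \operatorname{rank}(T) = n - \operatorname{rank}(T^2)
= \dim N(T^2)$, so $N(T) = N(T^2)$. Now let $x \in R(T) \cap N(T)$ and
write $x = T(y)$. Then $T^2(y) = T(x) = 0$, so $y \in N(T^2) = N(T)$,
and hence $x = T(y) = 0$. Thus $R(T) \cap N(T) = \{0\}$; since
$\dim R(T) + \dim N(T) = n$, it follows that $V = R(T) \oplus N(T)$.

(b) The chain $\operatorname{rank}(T) \ge \operatorname{rank}(T^2) \ge \cdots$
of nonnegative integers must stabilize: there is a $k$ with
$\operatorname{rank}(T^k) = \operatorname{rank}(T^{k+1})$. Since
$R(T^{m+1}) \subseteq R(T^m)$, equal dimensions force
$R(T^m) = R(T^k)$ for all $m \ge k$; in particular
$\operatorname{rank}(T^{2k}) = \operatorname{rank}(T^k)$.
Applying (a) to $T^k$ gives $V = R(T^k) \oplus N(T^k)$.
```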
Assume the notation in Theorem \(2.13\). (a) Suppose that \(z\) is a (column) vector in \(\mathrm{F}^{p}\). Use Theorem 2.13(b) to prove that \(B z\) is a linear combination of the columns of \(B\). In particular, if \(z=\left(a_{1}, a_{2}, \ldots, a_{p}\right)^{t}\), then show that $$ B z=\sum_{j=1}^{p} a_{j} v_{j} . $$ (b) Extend (a) to prove that column \(j\) of \(A B\) is a linear combination of the columns of \(A\) with the coefficients in the linear combination being the entries of column \(j\) of \(B\). (c) For any row vector \(w \in \mathrm{F}^{m}\), prove that \(w A\) is a linear combination of the rows of \(A\) with the coefficients in the linear combination being the coordinates of \(w\). Hint: Use properties of the transpose operation applied to (a). (d) Prove the analogous result to (b) about rows: Row \(i\) of \(A B\) is a linear combination of the rows of \(B\) with the coefficients in the linear combination being the entries of row \(i\) of \(A\).
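The column and row facts in (a), (b), and (d) can be checked numerically. A sketch with NumPy; the specific matrices and the vector \(z\) are illustrative:

```python
import numpy as np

B = np.array([[5.0, 6.0, 7.0],
              [8.0, 9.0, 10.0]])      # n x p with n = 2, p = 3
z = np.array([2.0, -1.0, 0.5])        # z = (a_1, ..., a_p)^t in F^p

# (a) Bz equals the linear combination sum_j a_j v_j of the columns v_j of B.
combo = sum(z[j] * B[:, j] for j in range(B.shape[1]))
assert np.allclose(B @ z, combo)

# (b) Column j of AB is A times column j of B, i.e. a combination of the
# columns of A with coefficients taken from column j of B.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])            # m x n
AB = A @ B
for j in range(B.shape[1]):
    assert np.allclose(AB[:, j], A @ B[:, j])

# (d) Row i of AB is a combination of the rows of B with coefficients
# taken from row i of A.
for i in range(A.shape[0]):
    assert np.allclose(AB[i, :], A[i, :] @ B)
print("column/row facts verified")
```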
Suppose \(U=\left[U_{i k}\right]\) and \(V=\left[V_{k j}\right]\) are block matrices for which \(U V\) is defined and the number of columns of each block \(U_{i k}\) is equal to the number of rows of each block \(V_{k j}\). Show that \(U V=\left[W_{i j}\right]\), where \(W_{i j}=\sum_{k} U_{i k} V_{k j}\).
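A numerical sketch of the blockwise formula \(W_{i j}=\sum_{k} U_{i k} V_{k j}\), using an illustrative \(2 \times 2\) block partition of random matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
U = rng.standard_normal((4, 6))
V = rng.standard_normal((6, 5))

# Partition U's rows at [0,2,4] and its columns at [0,3,6]; partition V's
# rows the same way as U's columns (the compatibility condition) and V's
# columns at [0,2,5].
row_cuts, mid_cuts, col_cuts = [0, 2, 4], [0, 3, 6], [0, 2, 5]
Ub = [[U[row_cuts[i]:row_cuts[i+1], mid_cuts[k]:mid_cuts[k+1]]
       for k in range(2)] for i in range(2)]
Vb = [[V[mid_cuts[k]:mid_cuts[k+1], col_cuts[j]:col_cuts[j+1]]
       for j in range(2)] for k in range(2)]

# W_ij = sum_k U_ik V_kj, then reassemble the blocks into one matrix.
W = np.block([[sum(Ub[i][k] @ Vb[k][j] for k in range(2))
               for j in range(2)] for i in range(2)])
assert np.allclose(W, U @ V)
print("blockwise product matches U @ V")
```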