Chapter 3: Problem 4
What is the dimension of the vector space \(\mathcal{M}_{m \times n}\) ? Give a basis.
Short Answer
The dimension of \(\mathcal{M}_{m \times n}\) is \(mn\). A basis is given by the \(mn\) matrices \(E_{ij}\), \(1 \le i \le m\), \(1 \le j \le n\), where \(E_{ij}\) has a 1 in the \((i, j)\) entry and 0 in every other entry.
Step by step solution
Every matrix \(A=\left(a_{ij}\right) \in \mathcal{M}_{m \times n}\) can be written as \(A=\sum_{i, j} a_{ij} E_{ij}\), so the matrices \(E_{ij}\) span \(\mathcal{M}_{m \times n}\). If \(\sum_{i, j} c_{ij} E_{ij}=O\), then comparing \((i, j)\) entries gives \(c_{ij}=0\) for all \(i, j\), so the \(E_{ij}\) are linearly independent. Hence \(\left\{E_{ij}\right\}\) is a basis and \(\operatorname{dim} \mathcal{M}_{m \times n}=mn\).
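The count can be checked numerically for a small case: flattening each standard basis matrix \(E_{ij}\) of \(\mathcal{M}_{2 \times 3}\) into a vector and computing the rank confirms that the \(mn = 6\) matrices are linearly independent. This is an illustrative sketch, not part of the textbook's solution.

```python
import numpy as np

m, n = 2, 3
# Build the standard basis: E_ij has a 1 in entry (i, j) and 0 elsewhere.
basis = []
for i in range(m):
    for j in range(n):
        E = np.zeros((m, n))
        E[i, j] = 1.0
        basis.append(E)

# Flatten each matrix to a vector in R^(mn); full rank means independence.
M = np.array([E.ravel() for E in basis])
assert len(basis) == m * n
assert np.linalg.matrix_rank(M) == m * n  # mn independent matrices: a basis
```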
Key Concepts
These are the key concepts you need to understand to accurately answer the question.
The subspace \(\ell^{2} \subset \mathbb{R}^{\omega}\) defined by \(\ell^{2}=\left\{\mathbf{x} \in \mathbb{R}^{\omega}: \sum_{k=1}^{\infty} x_{k}^{2} \text { exists }\right\}\) is an inner product space with inner product defined by \(\langle\mathbf{x}, \mathbf{y}\rangle=\sum_{k=1}^{\infty} x_{k} y_{k}\). (That this sum makes sense follows by "taking the limit" of the Cauchy-Schwarz Inequality.) Let $$ V=\left\{\mathbf{x} \in \ell^{2}: \text { there is an integer } n \text { such that } x_{k}=0 \text { for all } k>n\right\} . $$ Show that \(V\) is a (proper) subspace of \(\ell^{2}\) and that \(V^{\perp}=\{\mathbf{0}\}\). It follows that \(\left(V^{\perp}\right)^{\perp}=\ell^{2} \neq V\), so Proposition 3.6 need not hold in infinite-dimensional spaces.
Let \(A\) be an \(m \times n\) matrix with rank \(r\). Suppose \(A=B U\), where \(U\) is in echelon form. Show that the first \(r\) columns of \(B\) give a basis for \(\mathbf{C}(A)\). (In particular, if \(E A=U\), where \(U\) is the echelon form of \(A\) and \(E\) is the product of elementary matrices by which we reduce \(A\) to \(U\), then the first \(r\) columns of \(E^{-1}\) give a basis for \(\mathbf{C}(A)\).)
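The parenthetical claim can be illustrated with a small computation. One way (assumed here) to obtain an \(E\) with \(EA = U\) is to row-reduce the augmented matrix \([A \mid I]\): its reduced form is \([U \mid E]\), and \(B = E^{-1}\) then gives a basis for \(\mathbf{C}(A)\) in its first \(r\) columns.

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3], [2, 4, 6], [1, 0, 1]])  # a rank-2 example matrix
m = A.rows
# Row-reduce [A | I]; the A-part becomes U = E*A and the I-part records E.
R, _ = A.row_join(sp.eye(m)).rref()
U, E = R[:, :A.cols], R[:, A.cols:]
assert E * A == U                     # E is the product of the row operations

r = A.rank()
B = E.inv()                           # A = B*U with U in echelon form
basis = B[:, :r]                      # claimed basis for C(A)
# Appending these columns to A does not raise the rank, so they lie in C(A)
# and, being r independent columns of an invertible matrix, they span it.
assert A.row_join(basis).rank() == r and basis.rank() == r
```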
Suppose \(\mathbf{v}_{1}, \ldots, \mathbf{v}_{k} \in \mathbb{R}^{n}\) form a linearly dependent set. Prove that either \(\mathbf{v}_{1}=\mathbf{0}\) or \(\mathbf{v}_{i} \in \operatorname{Span}\left(\mathbf{v}_{1}, \ldots, \mathbf{v}_{i-1}\right)\) for some \(i=2,3, \ldots, k\). (Hint: There is a relation \(c_{1} \mathbf{v}_{1}+\) \(c_{2} \mathbf{v}_{2}+\cdots+c_{k} \mathbf{v}_{k}=\mathbf{0}\) with at least one \(c_{j} \neq 0\). Consider the largest such \(j\).)
Continuing Exercise 3.2.10: Let \(A\) be an \(m \times n\) matrix. a. Use Theorem 2.5 to prove that \(\mathbf{N}\left(A^{\top} A\right)=\mathbf{N}(A)\). (Hint: If \(\mathbf{x} \in \mathbf{N}\left(A^{\top} A\right)\), then \(A \mathbf{x} \in \mathbf{C}(A) \cap \mathbf{N}\left(A^{\top}\right)\).) b. Prove that \(\operatorname{rank}(A)=\operatorname{rank}\left(A^{\top} A\right)\). c. Prove that \(\mathbf{C}\left(A^{\top} A\right)=\mathbf{C}\left(A^{\top}\right)\).
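Parts a and b can be spot-checked numerically: build a rank-deficient \(A\), compare \(\operatorname{rank}(A)\) with \(\operatorname{rank}(A^{\top}A)\), and verify that a null vector of \(A^{\top}A\) is also a null vector of \(A\). A numerical check of course only illustrates the identities, it does not prove them.

```python
import numpy as np

rng = np.random.default_rng(0)
# A 6x5 matrix of rank 3, built as a product of thin random factors.
A = rng.standard_normal((6, 3)) @ rng.standard_normal((3, 5))

r = np.linalg.matrix_rank(A)
assert r == 3
# Part b: rank(A) == rank(A^T A)
assert np.linalg.matrix_rank(A.T @ A) == r

# Part a: take x in N(A^T A) -- the right singular vector of A^T A for a
# (numerically) zero singular value -- and check that A x = 0 as well.
_, s, Vt = np.linalg.svd(A.T @ A)
x = Vt[-1]
assert np.allclose(A.T @ A @ x, 0, atol=1e-6)
assert np.allclose(A @ x, 0, atol=1e-6)
```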
Let \(g_{1}(t)=t-1\) and \(g_{2}(t)=t^{2}+t\). Using the inner product on \(\mathcal{P}_{2} \subset \mathcal{C}^{0}([0,1])\) defined in Example 10(c), find the orthogonal complement of \(\operatorname{Span}\left(g_{1}, g_{2}\right)\).
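This one can be solved symbolically. The inner product from Example 10(c) is assumed here to be \(\langle f, g\rangle=\int_{0}^{1} f(t) g(t)\, dt\); with that assumption, the complement of the 2-dimensional span inside \(\mathcal{P}_{2}\) is the 1-dimensional family of polynomials orthogonal to both \(g_1\) and \(g_2\).

```python
import sympy as sp

t, a, b, c = sp.symbols('t a b c')
p = a + b*t + c*t**2                       # general element of P_2
g1, g2 = t - 1, t**2 + t

# Assumed inner product from Example 10(c): <f, g> = integral_0^1 f g dt
ip = lambda f, g: sp.integrate(f * g, (t, 0, 1))

# Impose <p, g1> = <p, g2> = 0 and solve for a, b in terms of the free c.
sol = sp.solve([ip(p, g1), ip(p, g2)], [a, b])
q = p.subs(sol).subs(c, 1)                 # normalize the free parameter

assert sp.simplify(ip(q, g1)) == 0
assert sp.simplify(ip(q, g2)) == 0
```

The orthogonal complement is then \(\operatorname{Span}(q)\) for the resulting polynomial \(q\).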