Free solutions and answers for Linear Algebra and Its Applications, Chapter 6 (Page 9)


Chapter 6: Orthogonality and Least Squares

Q1E

Page 331

In Exercises 1-6, the given set is a basis for a subspace W. Use the Gram-Schmidt process to produce an orthogonal basis for W.

  1. \(\begin{pmatrix} 3 \\ 0 \\ -1 \end{pmatrix}, \begin{pmatrix} 8 \\ 5 \\ -6 \end{pmatrix}\)
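The Gram-Schmidt step for Exercise 1 can be sketched numerically (using numpy, which is an assumption; the text itself only references MATLAB): keep the first vector, then subtract from the second its projection onto the first.

```python
import numpy as np

# Basis for W from Exercise 1
x1 = np.array([3.0, 0.0, -1.0])
x2 = np.array([8.0, 5.0, -6.0])

# Gram-Schmidt: v1 = x1; v2 = x2 minus its projection onto v1
v1 = x1
v2 = x2 - (x2 @ v1) / (v1 @ v1) * v1   # projection coefficient is 30/10 = 3

print(v2)       # v2 = (-1, 5, -3)
print(v1 @ v2)  # dot product is 0, so {v1, v2} is orthogonal
```

Any nonzero scalar multiples of these two vectors are equally valid answers, since scaling preserves orthogonality.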

Q1E

Page 331

In Exercises 1-4, find a least-squares solution of \(A{\bf{x}} = {\bf{b}}\) by (a) constructing the normal equations for \({\bf{\hat x}}\) and (b) solving for \({\bf{\hat x}}\).

1. \(A = \begin{bmatrix} -1 & 2 \\ 2 & -3 \\ -1 & 3 \end{bmatrix}\), \({\bf{b}} = \begin{bmatrix} 4 \\ 1 \\ 2 \end{bmatrix}\)
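Parts (a) and (b) can be sketched in code (a numpy sketch, which is an assumption on my part): form \(A^TA\) and \(A^T{\bf{b}}\), then solve the normal equations \(A^TA\,{\bf{\hat x}} = A^T{\bf{b}}\).

```python
import numpy as np

A = np.array([[-1.0,  2.0],
              [ 2.0, -3.0],
              [-1.0,  3.0]])
b = np.array([4.0, 1.0, 2.0])

# (a) The normal equations: (A^T A) x_hat = A^T b
AtA = A.T @ A   # [[6, -11], [-11, 22]]
Atb = A.T @ b   # [-4, 11]

# (b) Solve the 2x2 system for x_hat
x_hat = np.linalg.solve(AtA, Atb)
print(x_hat)    # x_hat = (3, 2)

# Cross-check against numpy's built-in least-squares routine
print(np.linalg.lstsq(A, b, rcond=None)[0])
```

Both routes give the same \({\bf{\hat x}}\); `lstsq` avoids forming \(A^TA\) explicitly, which matters numerically for ill-conditioned problems.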

Q20E

Page 331

Exercises 19 and 20 involve a design matrix \(X\) with two or more columns and a least-squares solution \(\hat \beta \) of \({\bf{y}} = X\beta \). Consider the following numbers.

(i) \({\left\| {X\hat \beta } \right\|^2}\), the sum of the squares of the "regression" term. Denote this number by \(SS\left( R \right)\).

(ii) \({\left\| {{\bf{y}} - X\hat \beta } \right\|^2}\), the sum of the squares of the error term. Denote this number by \(SS\left( E \right)\).

(iii) \({\left\| {\bf{y}} \right\|^2}\), the "total" sum of the squares of the \(y\)-values. Denote this number by \(SS\left( T \right)\).

Every statistics text that discusses regression and the linear model \(y = X\beta + \epsilon \) introduces these numbers, though terminology and notation vary somewhat. To simplify matters, assume that the mean of the \(y\)-values is zero. In this case, \(SS\left( T \right)\) is proportional to what is called the variance of the set of \(y\)-values.

20. Show that \({\left\| {X\hat \beta } \right\|^2} = {\hat \beta ^T}{X^T}{\bf{y}}\). (Hint: Rewrite the left side and use the fact that \(\hat \beta \) satisfies the normal equations.) This formula for \(SS\left( R \right)\) is used in statistics. From this and from Exercise 19, obtain the standard formula for \(SS\left( E \right)\):

\(SS\left( E \right) = {{\bf{y}}^T}{\bf{y}} - {\hat \beta ^T}{X^T}{\bf{y}}\)
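Both identities are easy to confirm numerically. The sketch below (numpy, on made-up random data; the specific matrix and vector are my own illustration, not from the text) checks \({\left\| {X\hat \beta } \right\|^2} = {\hat \beta ^T}{X^T}{\bf{y}}\) and the Exercise 19 decomposition \(SS(T) = SS(R) + SS(E)\), which holds because \(X\hat\beta\) is orthogonal to the residual \({\bf{y}} - X\hat\beta\).

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))   # made-up design matrix
y = rng.standard_normal(20)        # made-up observation vector

beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]

SS_R = np.linalg.norm(X @ beta_hat) ** 2
SS_E = np.linalg.norm(y - X @ beta_hat) ** 2
SS_T = np.linalg.norm(y) ** 2

# Exercise 20's identity: ||X beta_hat||^2 = beta_hat^T X^T y,
# since beta_hat satisfies the normal equations X^T X beta_hat = X^T y
print(np.isclose(SS_R, beta_hat @ X.T @ y))   # True

# Exercise 19's Pythagorean decomposition: SS(T) = SS(R) + SS(E)
print(np.isclose(SS_T, SS_R + SS_E))          # True
```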

Q20E

Page 331

Suppose \(A = QR\), where \(R\) is an invertible matrix. Show that \(A\) and \(Q\) have the same column space.
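The two inclusions behind the proof (Col \(A \subseteq\) Col \(Q\) since \(A = QR\), and Col \(Q \subseteq\) Col \(A\) since \(Q = AR^{-1}\)) can be illustrated numerically; this is a sketch on a made-up random matrix using numpy, not part of the exercise itself. If the two column spaces coincide, stacking the columns of \(A\) and \(Q\) side by side adds no new directions, so the rank does not grow.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))   # full column rank with probability 1
Q, R = np.linalg.qr(A)            # reduced QR: Q is 5x3, R is 3x3 invertible

rank_A = np.linalg.matrix_rank(A)
rank_Q = np.linalg.matrix_rank(Q)
# If Col A = Col Q, the combined matrix [A | Q] has the same rank
rank_both = np.linalg.matrix_rank(np.hstack([A, Q]))
print(rank_A, rank_Q, rank_both)  # all equal
```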

Q20E

Page 331

In Exercises 19 and 20, all vectors are in \({\mathbb{R}^n}\). Mark each statement True or False. Justify each answer.

  1. \({\rm{u}} \cdot {\rm{v}} - {\rm{v}} \cdot {\rm{u}} = 0\).
  2. For any scalar \(c\), \(\left\| {c{\rm{v}}} \right\| = c\left\| {\rm{v}} \right\|\).
  3. If \({\rm{x}}\) is orthogonal to every vector in a subspace \(W\), then \({\rm{x}}\) is in \({W^ \bot }\).
  4. If \({\left\| {\rm{u}} \right\|^2} + {\left\| {\rm{v}} \right\|^2} = {\left\| {{\rm{u}} + {\rm{v}}} \right\|^2}\), then \({\rm{u}}\) and \({\rm{v}}\) are orthogonal.
  5. For an \(m \times n\) matrix \(A\), vectors in the null space of \(A\) are orthogonal to vectors in the row space of \(A\).
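A quick numeric sanity check for statement 2 above (a numpy sketch of my own, not from the text): the correct rule is \(\left\| {c{\rm{v}}} \right\| = \left| c \right|\left\| {\rm{v}} \right\|\), so the statement as written fails whenever \(c < 0\).

```python
import numpy as np

v = np.array([1.0, 2.0])
c = -3.0

# ||c v|| equals |c| ||v||, not c ||v||; for c = -3 the right side
# of the claimed equality would be negative, which a norm cannot be.
print(np.linalg.norm(c * v))            # |c| * ||v||
print(abs(c) * np.linalg.norm(v))       # same value
print(c * np.linalg.norm(v))            # negative, so the claim fails
```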

Q20E

Page 331

Let \(A\) be an \(m \times n\) matrix such that \({A^T}A\) is invertible. Show that the columns of \(A\) are linearly independent.
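The key step of the argument, sketched on a small concrete matrix of my own choosing (numpy assumed): if \(A{\bf{x}} = {\bf{0}}\), then \({A^T}A{\bf{x}} = {\bf{0}}\), and invertibility of \({A^T}A\) forces \({\bf{x}} = {\bf{0}}\), so the columns of \(A\) are independent.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])   # sample matrix; A^T A is invertible here

AtA = A.T @ A
print(np.linalg.det(AtA))    # nonzero, so A^T A is invertible

# Invertible A^T A means A x = 0 has only the trivial solution,
# i.e. rank A equals the number of columns:
print(np.linalg.matrix_rank(A))   # 2 == number of columns
```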

Q20E

Page 331

Let \({\rm{u}} = \left( \begin{array}{l}a\\b\end{array} \right)\) and \({\rm{v}} = \left( \begin{array}{l}1\\1\end{array} \right)\). Use the Cauchy–Schwarz inequality to show that \({\left( {\frac{{a + b}}{2}} \right)^2} \le \frac{{{a^2} + {b^2}}}{2}\).
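The inequality can also be verified symbolically: the difference of the two sides is a perfect square, hence nonnegative. A sketch using sympy (an assumption; the text does not use it):

```python
import sympy as sp

a, b = sp.symbols('a b', real=True)
lhs = ((a + b) / 2) ** 2
rhs = (a**2 + b**2) / 2

# rhs - lhs simplifies to (a - b)^2 / 4 >= 0, proving the inequality
diff = sp.simplify(rhs - lhs)
print(sp.factor(diff))   # (a - b)**2/4
```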

Q21E

Page 331

Given \(A = QR\) as in Theorem 12, describe how to find an orthogonal \(m \times m\) (square) matrix \({Q_1}\) and an invertible \(n \times n\) upper triangular matrix \(R\) such that

\(A = {Q_1}\begin{bmatrix} R \\ 0 \end{bmatrix}\)

The MATLAB qr command supplies this "full" QR factorization when rank \(A = n\).
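The full factorization described above can be obtained in numpy as well (a sketch on a made-up random matrix; numpy is my assumption, mirroring the MATLAB command the text mentions): `mode='complete'` returns a square \(m \times m\) orthogonal \(Q_1\) and an \(m \times n\) factor whose top \(n \times n\) block is the upper triangular \(R\) and whose bottom block is zero.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 3))   # m = 5 > n = 3, full column rank

Q1, R_full = np.linalg.qr(A, mode='complete')
print(Q1.shape, R_full.shape)        # (5, 5) (5, 3)
print(np.allclose(A, Q1 @ R_full))   # True: A = Q1 [R; 0]
print(np.allclose(R_full[3:], 0))    # True: bottom (m - n) rows are zero
```

Conceptually, \(Q_1\) is built by extending the \(n\) columns of \(Q\) to an orthonormal basis of \({\mathbb{R}^m}\), which is exactly what Exercise 21 asks you to describe.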

Q21E

Page 331

Let \(A\) be an \(m \times n\) matrix whose columns are linearly independent.

  1. Use Exercise 19 to show that \({A^T}A\) is an invertible matrix.
  2. Explain why \(A\) must have at least as many rows as columns.
  3. Determine the rank of \(A\).

Q21E

Page 331

Exercises 21–24 refer to \(V = C\left( {0,1} \right)\) with the inner product given by an integral, as in Example 7.

21. Compute \(\left\langle {f,g} \right\rangle \), where \(f\left( t \right) = 1 - 3{t^2}\) and \(g\left( t \right) = 1 - {t^3}\).
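The computation can be sketched symbolically, assuming (as in Lay's Example 7) that the inner product is \(\left\langle {f,g} \right\rangle = \int_0^1 f\left( t \right)g\left( t \right)\,dt\); sympy is my assumption here, the text does not use it.

```python
import sympy as sp

t = sp.symbols('t')
f = 1 - 3 * t**2
g = 1 - t**3

# Assumed Example 7 inner product: integrate f(t) g(t) over [0, 1]
inner = sp.integrate(f * g, (t, 0, 1))
print(inner)   # 1/4
```

Expanding by hand gives the same result: \(\int_0^1 \left( 1 - 3{t^2} - {t^3} + 3{t^5} \right)dt = 1 - 1 - \frac{1}{4} + \frac{1}{2} = \frac{1}{4}\).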
