
Let \({\bf{u}} = \left( {\begin{aligned}5\\{ - 6}\\7\end{aligned}} \right)\), and let \(W\) be the set of all \({\bf{x}}\) in \({\mathbb{R}^3}\) such that \({\bf{u}} \cdot {\bf{x}} = 0\). What theorem in Chapter 4 can be used to show that \(W\) is a subspace of \({\mathbb{R}^3}\)? Describe \(W\) in geometric language.

Short Answer

Expert verified

Theorem 2 of Chapter 4 can be used. Geometrically, \(W\) is a plane through the origin.

Step by step solution

01

Definition of orthogonal vectors

Two vectors \({\bf{u}}\) and \({\bf{v}}\) are orthogonal if and only if:

\({\bf{u}} \cdot {\bf{v}} = 0\), or equivalently \({\left\| {{\bf{u}} + {\bf{v}}} \right\|^2} = {\left\| {\bf{u}} \right\|^2} + {\left\| {\bf{v}} \right\|^2}\).
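The equivalence above (the Pythagorean characterization of orthogonality) can be checked numerically. This is a minimal sketch with an illustrative pair of vectors, not the vectors from the exercise:

```python
import numpy as np

# Illustrative orthogonal pair (not the exercise's vector u)
u = np.array([1.0, 2.0, 0.0])
v = np.array([-2.0, 1.0, 3.0])

# u . v = 0, so u and v are orthogonal
assert np.isclose(u @ v, 0.0)

# Pythagorean identity: ||u + v||^2 == ||u||^2 + ||v||^2
lhs = np.linalg.norm(u + v) ** 2
rhs = np.linalg.norm(u) ** 2 + np.linalg.norm(v) ** 2
print(np.isclose(lhs, rhs))  # True
```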

02

Check whether \(W\) is a subspace of \({\mathbb{R}^3}\) or not

The given vector is, \({\bf{u}} = \left( {\begin{aligned}{*{20}{c}}5\\{ - 6}\\7\end{aligned}} \right)\) and \(W = \left\{ {x \in {\mathbb{R}^3}|{\bf{u}} \cdot {\bf{x}} = 0} \right\}\).

Here \(W\) is the null space of the \(1 \times 3\) matrix \({{\bf{u}}^T}\).

Theorem 2 of Chapter 4 states that the null space of an \(m \times n\) matrix is a subspace of \({\mathbb{R}^n}\). Since \({\bf{u}} \cdot {\bf{x}} = {{\bf{u}}^T}{\bf{x}}\), the condition \({\bf{u}} \cdot {\bf{x}} = 0\) says exactly that \(W\) is the null space of \({{\bf{u}}^T}\). Hence \(W\) is a subspace of \({\mathbb{R}^3}\).
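The null-space argument can be illustrated numerically: vectors satisfying \({\bf{u}} \cdot {\bf{x}} = 0\) are killed by the \(1 \times 3\) matrix \({{\bf{u}}^T}\), and linear combinations of them stay in \(W\), as a subspace requires. The two sample solutions below are hand-picked for illustration:

```python
import numpy as np

u = np.array([5.0, -6.0, 7.0])
A = u.reshape(1, 3)  # the 1x3 matrix u^T

# Two hand-picked solutions of 5*x1 - 6*x2 + 7*x3 = 0
x1 = np.array([6.0, 5.0, 0.0])   # 5*6 - 6*5 + 7*0 = 0
x2 = np.array([7.0, 0.0, -5.0])  # 5*7 - 6*0 + 7*(-5) = 0
assert np.allclose(A @ x1, 0.0)
assert np.allclose(A @ x2, 0.0)

# A linear combination of vectors in W is again in W
assert np.isclose(u @ (2 * x1 - 3 * x2), 0.0)
```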

03

Describe \(W\) geometrically

\(W\) consists of all vectors perpendicular to \({\bf{u}}\). Write \({\bf{x}} = \left( {\begin{aligned}{*{20}{c}}{{x_1}}\\{{x_2}}\\{{x_3}}\end{aligned}} \right)\) and expand \({\bf{u}} \cdot {\bf{x}} = 0\):

\(\begin{aligned}{c}\left( {\begin{aligned}{*{20}{c}}5\\{ - 6}\\7\end{aligned}} \right) \cdot \left( {\begin{aligned}{*{20}{c}}{{x_1}}\\{{x_2}}\\{{x_3}}\end{aligned}} \right) = 0\\5{x_1} - 6{x_2} + 7{x_3} = 0\end{aligned}\)

So geometrically, the subspace \(W\) is a plane passing through the origin.
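That \(W\) is a plane (a two-dimensional subspace) can be confirmed numerically: the null space of the \(1 \times 3\) matrix \({{\bf{u}}^T}\) has dimension \(3 - 1 = 2\), and the last two right singular vectors of \({{\bf{u}}^T}\) span it. A minimal sketch:

```python
import numpy as np

u = np.array([5.0, -6.0, 7.0])

# The last two right singular vectors of the 1x3 matrix u^T span
# its null space: the plane 5*x1 - 6*x2 + 7*x3 = 0
_, _, Vt = np.linalg.svd(u.reshape(1, 3))
basis = Vt[1:]                       # two orthonormal vectors spanning W
assert basis.shape == (2, 3)         # W is 2-dimensional: a plane
assert np.allclose(basis @ u, 0.0)   # both are perpendicular to u
```

The plane passes through the origin because \({\bf{x}} = {\bf{0}}\) trivially satisfies \({\bf{u}} \cdot {\bf{x}} = 0\).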


