Q7.4-21E

In Exercises 17–24, \(A\) is an \(m \times n\) matrix with a singular value decomposition \(A = U\Sigma {V^T}\), where \(U\) is an \(m \times m\) orthogonal matrix, \({\bf{\Sigma }}\) is an \(m \times n\) “diagonal” matrix with \(r\) positive entries and no negative entries, and \(V\) is an \(n \times n\) orthogonal matrix. Justify each answer.

21. Justify the statement in Example 2 that the second singular value of a matrix \(A\) is the maximum of \(\left\| {A{\bf{x}}} \right\|\) as \({\bf{x}}\) varies over all unit vectors orthogonal to \({{\bf{v}}_{\bf{1}}}\), with \({{\bf{v}}_{\bf{1}}}\) a right singular vector corresponding to the first singular value of \(A\). (Hint: Use Theorem 7 in Section 7.3.)

Short Answer

Expert verified

The second singular value of \(A\) is \({\sigma _2} = \sqrt {{\lambda _2}} \), which equals the maximum of \(\left\| {A{\bf{x}}} \right\|\) over all unit vectors \({\bf{x}}\) orthogonal to \({{\bf{v}}_{\bf{1}}}\), so the statement is verified.

Step by step solution

01

Apply Theorem 7 to unit vectors orthogonal to \({{\bf{v}}_{\bf{1}}}\)

The right singular vector \({{\bf{v}}_{\bf{1}}}\) is an eigenvector of \({A^T}A\) corresponding to its largest eigenvalue \({\lambda _1}\). By Theorem 7 in Section 7.3, the second largest eigenvalue \({\lambda _2}\) is the maximum of \({{\bf{x}}^T}\left( {{A^T}A} \right){\bf{x}}\) over all unit vectors orthogonal to \({{\bf{v}}_{\bf{1}}}\).
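This eigenvalue–singular-value relationship can be checked numerically. The sketch below is not part of the textbook solution; NumPy and the random \(4 \times 3\) matrix are illustrative assumptions. It confirms that the singular values of \(A\) are the square roots of the eigenvalues of \({A^T}A\), and that \({{\bf{v}}_{\bf{1}}}\) is (up to sign) the eigenvector for the largest eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))          # arbitrary 4x3 example matrix (an assumption)

# Eigen-decomposition of the symmetric matrix A^T A
evals, evecs = np.linalg.eigh(A.T @ A)   # eigh returns eigenvalues in ascending order
lam = evals[::-1]                        # descending: lambda_1 >= lambda_2 >= lambda_3
v1 = evecs[:, -1]                        # unit eigenvector for the largest eigenvalue

# Singular values of A are the square roots of these eigenvalues,
# and v1 agrees (up to overall sign) with the first right singular vector.
U, s, Vt = np.linalg.svd(A)
assert np.allclose(s, np.sqrt(lam))
assert np.allclose(np.abs(Vt[0]), np.abs(v1))
```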

02

Express the quadratic form in terms of \(||A{\bf{x}}||\)

Compute \({{\bf{x}}^T}({A^T}A){\bf{x}}\):

\(\begin{array}{c}{{\bf{x}}^T}({A^T}A){\bf{x}} = {{\bf{x}}^T}{A^T}(A{\bf{x}})\\ = {(A{\bf{x}})^T}(A{\bf{x}})\\ = ||A{\bf{x}}|{|^2}\end{array}\)

Thus, the maximum of \(||A{\bf{x}}||\) over all unit vectors orthogonal to \({{\bf{v}}_{\bf{1}}}\) is \(\sqrt {{\lambda _2}} \), which by definition is the second singular value \({\sigma _2}\) of \(A\). This justifies the statement in Example 2.
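The conclusion can also be verified numerically. The following is a minimal sketch, again assuming an arbitrary random example matrix: the maximum of \(||A{\bf{x}}||\) over unit vectors orthogonal to \({{\bf{v}}_{\bf{1}}}\) is attained at the second right singular vector \({{\bf{v}}_2}\), and sampled unit vectors in that orthogonal complement never exceed \({\sigma _2}\):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3))       # arbitrary example matrix (an assumption)

U, s, Vt = np.linalg.svd(A)
v1, v2 = Vt[0], Vt[1]                 # first two right singular vectors
sigma2 = s[1]                         # second singular value

# The maximum of ||Ax|| over unit x orthogonal to v1 is attained at x = v2 ...
assert np.isclose(np.linalg.norm(A @ v2), sigma2)

# ... and no sampled unit vector orthogonal to v1 exceeds sigma_2.
best = 0.0
for _ in range(10000):
    x = rng.standard_normal(3)
    x -= (x @ v1) * v1                # project out the v1 component
    x /= np.linalg.norm(x)            # renormalize to a unit vector
    best = max(best, np.linalg.norm(A @ x))
assert best <= sigma2 + 1e-9
```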


Most popular questions from this chapter

Question: 2. Let \(\left\{ {{{\bf{u}}_1},{{\bf{u}}_2}, \ldots ,{{\bf{u}}_n}} \right\}\) be an orthonormal basis for \({\mathbb{R}^n}\), and let \({\lambda _1}, \ldots ,{\lambda _n}\) be any real scalars. Define

\(A = {\lambda _1}{{\bf{u}}_1}{\bf{u}}_1^T +  \cdots  + {\lambda _n}{{\bf{u}}_n}{\bf{u}}_n^T\)

a. Show that \(A\) is symmetric.

b. Show that \({\lambda _1}, \ldots ,{\lambda _n}\) are the eigenvalues of \(A\).

In Exercises 17–24, \(A\) is an \(m \times n\) matrix with a singular value decomposition \(A = U\Sigma {V^T}\), where \(U\) is an \(m \times m\) orthogonal matrix, \({\bf{\Sigma }}\) is an \(m \times n\) “diagonal” matrix with \(r\) positive entries and no negative entries, and \(V\) is an \(n \times n\) orthogonal matrix. Justify each answer.

20. Show that if \(P\) is an orthogonal \(m \times m\) matrix, then \(PA\) has the same singular values as \(A\).

Question: 3. Let A be an \(n \times n\) symmetric matrix of rank r. Explain why the spectral decomposition of A represents A as the sum of r rank 1 matrices.

Question: [M] A Landsat image with three spectral components was made of Homestead Air Force Base in Florida (after the base was hit by Hurricane Andrew in 1992). The covariance matrix of the data is shown below. Find the first principal component of the data, and compute the percentage of the total variance that is contained in this component.

\[S = \left[ {\begin{array}{*{20}{c}}{164.12}&{32.73}&{81.04}\\{32.73}&{539.44}&{249.13}\\{81.04}&{249.13}&{189.11}\end{array}} \right]\]

Question: 14. Exercises 12鈥14 concern an \(m \times n\) matrix \(A\) with a reduced singular value decomposition, \(A = {U_r}D{V_r}^T\), and the pseudoinverse \({A^ + } = {U_r}{D^{ - 1}}{V_r}^T\).

Given any \({\bf{b}}\) in \({\mathbb{R}^m}\), adapt Exercise 13 to show that \({A^ + }{\bf{b}}\) is the least-squares solution of minimum length. [Hint: Consider the equation \(A{\bf{x}} = {\bf{\hat b}}\), where \({\bf{\hat b}}\) is the orthogonal projection of \({\bf{b}}\) onto \({\mathop{\rm Col}\nolimits} A\).]
