Q-7.3-13E


Let \(A\) be an \(n \times n\) symmetric matrix, let \(M\) and \(m\) denote the maximum and minimum values of the quadratic form \({{\bf{x}}^T}A{\bf{x}}\) subject to \({{\bf{x}}^T}{\bf{x}} = 1\), and denote corresponding unit eigenvectors by \({{\bf{u}}_1}\) and \({{\bf{u}}_n}\). The following calculations show that, given any number \(t\) between \(m\) and \(M\), there is a unit vector \({\bf{x}}\) such that \(t = {{\bf{x}}^T}A{\bf{x}}\). Verify that \(t = \left( {1 - \alpha } \right)m + \alpha M\) for some number \(\alpha\) between 0 and 1. Then let \({\bf{x}} = \sqrt {1 - \alpha } \,{{\bf{u}}_n} + \sqrt \alpha \,{{\bf{u}}_1}\), and show that \({{\bf{x}}^T}{\bf{x}} = 1\) and \({{\bf{x}}^T}A{\bf{x}} = t\).

Short Answer

Expert verified

It is proved that \({{\bf{x}}^T}{\bf{x}} = 1\) and \({{\bf{x}}^T}A{\bf{x}} = t\).

Step by step solution

01

Statement in Theorem 6

Theorem 6 states: let \(A\) be a symmetric matrix, and define \(m = \min \left\{ {{{\bf{x}}^T}A{\bf{x}}:\left\| {\bf{x}} \right\| = 1} \right\}\) and \(M = \max \left\{ {{{\bf{x}}^T}A{\bf{x}}:\left\| {\bf{x}} \right\| = 1} \right\}\). Then \(M\) is the greatest eigenvalue \({\lambda _1}\) of \(A\), and \(m\) is the least eigenvalue of \(A\). The value of \({{\bf{x}}^T}A{\bf{x}}\) is \(M\) when \({\bf{x}}\) is a unit eigenvector \({{\bf{u}}_1}\) corresponding to \(M\), and the value of \({{\bf{x}}^T}A{\bf{x}}\) is \(m\) when \({\bf{x}}\) is a unit eigenvector \({{\bf{u}}_n}\) corresponding to \(m\).

02

Show that \({{\bf{x}}^T}{\bf{x}} = 1\) and \({{\bf{x}}^T}A{\bf{x}} = t\) 

The Pythagorean theorem states that two vectors \({\bf{u}}\) and \({\bf{v}}\) are orthogonal if and only if \({\left\| {{\bf{u}} + {\bf{v}}} \right\|^2} = {\left\| {\bf{u}} \right\|^2} + {\left\| {\bf{v}} \right\|^2}\).

When \(m = M\), take \(\alpha = 0\), so that \(t = \left( {1 - 0} \right)m + 0 \cdot M = m\), and let \({\bf{x}} = {{\bf{u}}_n}\). Theorem 6 then gives \({\bf{u}}_n^TA{{\bf{u}}_n} = m = t\).

Now assume \(m < M\), and let \(t\) be between \(m\) and \(M\). Then \(0 \le t - m \le M - m\), so \(0 \le \frac{{t - m}}{{M - m}} \le 1\).

Let \(\alpha = \frac{{t - m}}{{M - m}}\); then \(\left( {1 - \alpha } \right)m + \alpha M = m + \alpha \left( {M - m} \right) = m + \left( {t - m} \right) = t\), which verifies the first claim. Now let \({\bf{x}} = \sqrt {1 - \alpha } \,{{\bf{u}}_n} + \sqrt \alpha \,{{\bf{u}}_1}\). The vectors \(\sqrt {1 - \alpha } \,{{\bf{u}}_n}\) and \(\sqrt \alpha \,{{\bf{u}}_1}\) are orthogonal because they are eigenvectors corresponding to different eigenvalues (or one of them is \({\bf{0}}\)). By the Pythagorean theorem,

\(\begin{array}{c}{{\bf{x}}^T}{\bf{x}} = {\left\| {\bf{x}} \right\|^2}\\ = {\left\| {\sqrt {1 - \alpha } {{\bf{u}}_n}} \right\|^2} + {\left\| {\sqrt \alpha {{\bf{u}}_1}} \right\|^2}\\ = \left| {1 - \alpha } \right|{\left\| {{{\bf{u}}_n}} \right\|^2} + \left| \alpha \right|{\left\| {{{\bf{u}}_1}} \right\|^2}\\ = \left( {1 - \alpha } \right) + \alpha \\ = 1\end{array}\)

because \({{\bf{u}}_n}\) and \({{\bf{u}}_1}\) are unit vectors and \(0 \le \alpha \le 1\). Moreover, since \(A{{\bf{u}}_n} = m{{\bf{u}}_n}\), \(A{{\bf{u}}_1} = M{{\bf{u}}_1}\), and \({{\bf{u}}_n}\) and \({{\bf{u}}_1}\) are orthogonal,

\(\begin{array}{c}{{\bf{x}}^T}A{\bf{x}} = {\left( {\sqrt {1 - \alpha } {{\bf{u}}_n} + \sqrt \alpha {{\bf{u}}_1}} \right)^T}A\left( {\sqrt {1 - \alpha } {{\bf{u}}_n} + \sqrt \alpha {{\bf{u}}_1}} \right)\\ = {\left( {\sqrt {1 - \alpha } {{\bf{u}}_n} + \sqrt \alpha {{\bf{u}}_1}} \right)^T}\left( {m\sqrt {1 - \alpha } {{\bf{u}}_n} + M\sqrt \alpha {{\bf{u}}_1}} \right)\\ = \left| {1 - \alpha } \right|m{\bf{u}}_n^T{{\bf{u}}_n} + \left| \alpha \right|M{\bf{u}}_1^T{{\bf{u}}_1}\\ = \left( {1 - \alpha } \right)m + \alpha M\\ = t\end{array}\)

Therefore, the quadratic form \({{\bf{x}}^T}A{\bf{x}}\) takes every value between \(m\) and \(M\) for suitable unit vectors \({\bf{x}}\).
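The construction above can also be checked numerically. The sketch below uses a hypothetical \(2 \times 2\) symmetric matrix (not from the exercise) and assumes numpy is available: it computes \(\alpha\), builds \({\bf{x}}\), and confirms that \({{\bf{x}}^T}{\bf{x}} = 1\) and \({{\bf{x}}^T}A{\bf{x}} = t\).

```python
import numpy as np

# A hypothetical symmetric matrix, chosen only for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns eigenvalues in ascending order for a symmetric matrix,
# so m is the least eigenvalue and M the greatest.
eigvals, eigvecs = np.linalg.eigh(A)
m, M = eigvals[0], eigvals[-1]
u_n, u_1 = eigvecs[:, 0], eigvecs[:, -1]   # corresponding unit eigenvectors

t = 2.0                                    # any value with m <= t <= M
alpha = (t - m) / (M - m)                  # so 0 <= alpha <= 1
x = np.sqrt(1 - alpha) * u_n + np.sqrt(alpha) * u_1

print(np.isclose(x @ x, 1.0))              # x is a unit vector
print(np.isclose(x @ A @ x, t))            # the quadratic form attains t
```

Both checks print `True`, matching the algebraic argument: the cross terms vanish by orthogonality, leaving \(\left( {1 - \alpha } \right)m + \alpha M = t\).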

Thus, it is proved that \({{\bf{x}}^T}{\bf{x}} = 1\) and \({{\bf{x}}^T}A{\bf{x}} = t\).


Most popular questions from this chapter

Question: 13. The sample covariance matrix is a generalization of a formula for the variance of a sample of \(N\) scalar measurements, say \({t_1}, \ldots ,{t_N}\). If \(m\) is the average of \({t_1}, \ldots ,{t_N}\), then the sample variance is given by

\(\frac{1}{{N - 1}}\sum\limits_{k = 1}^N {{{\left( {{t_k} - m} \right)}^2}} \quad \left( 1 \right)\)

Show how the sample covariance matrix, \(S\), defined prior to Example 3, may be written in a form similar to (1). (Hint: Use partitioned matrix multiplication to write \(S\) as \(\frac{1}{{N - 1}}\) times the sum of \(N\) matrices of size \(p \times p\). For \(1 \le k \le N\), write \({X_k} - M\) in place of \({\hat X_k}\).)
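The form the hint asks for writes \(S\) as \(\frac{1}{{N - 1}}\) times a sum of \(N\) rank-one \(p \times p\) matrices, one outer product per observation. A short numerical sketch with a hypothetical data set (numpy assumed) illustrates this and checks it against the built-in covariance:

```python
import numpy as np

# Hypothetical data: N = 4 observations in R^p with p = 2 (rows are X_1, ..., X_N).
X = np.array([[1.0, 2.0],
              [3.0, 0.0],
              [2.0, 4.0],
              [0.0, 2.0]])
N, p = X.shape
M = X.mean(axis=0)                   # sample mean vector

# S = 1/(N-1) * sum_k (X_k - M)(X_k - M)^T -- each term is a p x p matrix.
S = sum(np.outer(X[k] - M, X[k] - M) for k in range(N)) / (N - 1)

# Agrees with numpy's covariance (np.cov expects variables in rows).
print(np.allclose(S, np.cov(X.T)))
```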

In Exercises 17–24, \(A\) is an \(m \times n\) matrix with a singular value decomposition \(A = U\Sigma {V^T}\), where \(U\) is an \(m \times m\) orthogonal matrix, \({\bf{\Sigma }}\) is an \(m \times n\) "diagonal" matrix with \(r\) positive entries and no negative entries, and \(V\) is an \(n \times n\) orthogonal matrix. Justify each answer.

19. Show that the columns of \(V\) are eigenvectors of \({A^T}A\), the columns of \(U\) are eigenvectors of \(A{A^T}\), and the diagonal entries of \({\bf{\Sigma }}\) are the singular values of \(A\). (Hint: Use the SVD to compute \({A^T}A\) and \(A{A^T}\).)
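The claim of Exercise 19 can be observed numerically: from \(A = U\Sigma {V^T}\) one gets \({A^T}A{\bf{v}}_i = \sigma _i^2{\bf{v}}_i\) and \(A{A^T}{\bf{u}}_i = \sigma _i^2{\bf{u}}_i\) (with eigenvalue 0 for any extra columns of \(U\)). A sketch with a hypothetical \(3 \times 2\) matrix, assuming numpy:

```python
import numpy as np

# A hypothetical 3x2 matrix, chosen only for illustration.
A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])

U, s, Vt = np.linalg.svd(A)
V = Vt.T

# Columns of V are eigenvectors of A^T A with eigenvalues sigma_i^2.
for i, sigma in enumerate(s):
    print(np.allclose(A.T @ A @ V[:, i], sigma**2 * V[:, i]))

# Columns of U are eigenvectors of A A^T (eigenvalue 0 beyond the first r).
eigvals = np.concatenate([s**2, np.zeros(A.shape[0] - len(s))])
for i in range(A.shape[0]):
    print(np.allclose(A @ A.T @ U[:, i], eigvals[i] * U[:, i]))
```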

Question: 11. Given multivariate data \({X_1}, \ldots ,{X_N}\) (in \({\mathbb{R}^p}\)) in mean-deviation form, let \(P\) be a \(p \times p\) matrix, and define \({Y_k} = {P^T}{X_k}\) for \(k = 1, \ldots ,N\).

  1. Show that \({Y_1}, \ldots ,{Y_N}\) are in mean-deviation form. (Hint: Let \(w\) be the vector in \({\mathbb{R}^N}\) with a 1 in each entry. Then \(\left( {{X_1}, \ldots ,{X_N}} \right)w = 0\) (the zero vector in \({\mathbb{R}^p}\)).)
  2. Show that if the covariance matrix of \({X_1}, \ldots ,{X_N}\) is \(S\), then the covariance matrix of \({Y_1}, \ldots ,{Y_N}\) is \({P^T}SP\).
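Both parts can be checked numerically with a small hypothetical data set already in mean-deviation form (the numbers and the matrix \(P\) are illustrative; numpy is assumed):

```python
import numpy as np

# Columns are X_1, ..., X_N in R^p (p = 2, N = 4); each row sums to zero,
# so the data are in mean-deviation form.
X = np.array([[1.0, -2.0,  1.0,  0.0],
              [2.0,  0.0, -1.0, -1.0]])
p, N = X.shape
P = np.array([[1.0, 1.0],
              [0.0, 2.0]])                    # an arbitrary p x p matrix

Y = P.T @ X                                   # Y_k = P^T X_k, stored as columns

# (a) Y_1, ..., Y_N are still in mean-deviation form.
print(np.allclose(Y.sum(axis=1), 0))

# (b) the covariance matrix transforms as S -> P^T S P.
S  = X @ X.T / (N - 1)
SY = Y @ Y.T / (N - 1)
print(np.allclose(SY, P.T @ S @ P))
```

Both checks print `True`; part (b) follows algebraically because \(Y{Y^T} = {P^T}X{X^T}P\).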

Find the matrix of the quadratic form. Assume x is in \({\mathbb{R}^{\bf{3}}}\).

a. \(3x_1^2 - 2x_2^2 + 5x_3^2 + 4{x_1}{x_2} - 6{x_1}{x_3}\)

b. \(4x_3^2 - 2{x_1}{x_2} + 4{x_2}{x_3}\)
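The general recipe is that the diagonal entries of the matrix are the coefficients of the squared terms, while each cross-term coefficient is split evenly between the two symmetric off-diagonal positions. A numerical check of part (a), assuming numpy is available:

```python
import numpy as np

# Matrix of the quadratic form 3x1^2 - 2x2^2 + 5x3^2 + 4x1x2 - 6x1x3:
# the 4x1x2 term contributes 2 to entries (1,2) and (2,1), and the
# -6x1x3 term contributes -3 to entries (1,3) and (3,1).
A = np.array([[ 3.0,  2.0, -3.0],
              [ 2.0, -2.0,  0.0],
              [-3.0,  0.0,  5.0]])

x = np.array([1.0, 2.0, 3.0])                 # an arbitrary test vector
quad = 3*x[0]**2 - 2*x[1]**2 + 5*x[2]**2 + 4*x[0]*x[1] - 6*x[0]*x[2]
print(np.isclose(x @ A @ x, quad))            # x^T A x reproduces the form
```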

(M) Orthogonally diagonalize the matrices in Exercises 37–40. To practice the methods of this section, do not use an eigenvector routine from your matrix program. Instead, use the program to find the eigenvalues, and for each eigenvalue \(\lambda \), find an orthogonal basis for \({\bf{Nul}}\left( {A - \lambda I} \right)\), as in Examples 2 and 3.

38. \(\left( {\begin{array}{*{20}{c}}{.63}&{ - .18}&{ - .06}&{ - .04}\\{ - .18}&{.84}&{ - .04}&{.12}\\{ - .06}&{ - .04}&{.72}&{ - .12}\\{ - .04}&{.12}&{ - .12}&{.66}\end{array}} \right)\)
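The suggested method (eigenvalues first, then an orthonormal basis of each null space rather than an eigenvector routine) can be sketched numerically; here the null-space bases are obtained from the SVD of \(A - \lambda I\), with numpy assumed:

```python
import numpy as np

# The matrix from Exercise 38.
A = np.array([[ .63, -.18, -.06, -.04],
              [-.18,  .84, -.04,  .12],
              [-.06, -.04,  .72, -.12],
              [-.04,  .12, -.12,  .66]])

cols = []
for lam in np.unique(np.linalg.eigvalsh(A).round(8)):   # eigenvalues only
    # Rows of Vt whose singular values are (numerically) zero form an
    # orthonormal basis of Nul(A - lam*I).
    _, s, Vt = np.linalg.svd(A - lam * np.eye(4))
    cols.append(Vt[s < 1e-8].T)

P = np.hstack(cols)                     # eigenvectors assembled column by column
D = P.T @ A @ P

print(np.allclose(P.T @ P, np.eye(4)))       # P has orthonormal columns
print(np.allclose(D, np.diag(np.diag(D))))   # P^T A P is diagonal
```

Eigenspaces of a symmetric matrix for distinct eigenvalues are automatically orthogonal, which is why stacking the per-eigenvalue bases yields an orthogonal \(P\).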
