

Let \(A = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix}\). Recall from Exercise 25 in Section 5.4 that \(\operatorname{tr} A\) (the trace of \(A\)) is the sum of the diagonal entries of \(A\). Show that the characteristic polynomial of \(A\) is \(\lambda^2 - \left( \operatorname{tr} A \right)\lambda + \det A\). Then show that the eigenvalues of a \(2 \times 2\) matrix \(A\) are both real if and only if \(\det A \le \left( \frac{\operatorname{tr} A}{2} \right)^2\).

Short Answer

Expert verified

It is proved that \(\det \left( A - \lambda I \right) = \lambda^2 - \left( \operatorname{tr} A \right)\lambda + \det A\). It is also proved that the eigenvalues of a \(2 \times 2\) matrix \(A\) are both real if and only if \(\det A \le \left( \frac{\operatorname{tr} A}{2} \right)^2\).

Step by step solution

01

Step 1: Find the characteristic polynomial.

Compute \(\det \left( A - \lambda I \right)\).

\(\begin{aligned}\det \left( A - \lambda I \right) &= \det \begin{pmatrix} a_{11} - \lambda & a_{12} \\ a_{21} & a_{22} - \lambda \end{pmatrix}\\ &= \left( a_{11} - \lambda \right)\left( a_{22} - \lambda \right) - a_{12}a_{21}\\ &= \lambda^2 - a_{11}\lambda - a_{22}\lambda + a_{11}a_{22} - a_{12}a_{21}\\ &= \lambda^2 - \left( a_{11} + a_{22} \right)\lambda + \left( a_{11}a_{22} - a_{12}a_{21} \right)\end{aligned}\)

Now substitute the trace of \(A\), \(\operatorname{tr} A = a_{11} + a_{22}\), and the determinant, \(\det A = a_{11}a_{22} - a_{12}a_{21}\):

\(\det \left( A - \lambda I \right) = \lambda^2 - \left( \operatorname{tr} A \right)\lambda + \det A\)

Hence it is proved that \(\det \left( A - \lambda I \right) = \lambda^2 - \left( \operatorname{tr} A \right)\lambda + \det A\).
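This identity is easy to sanity-check numerically. The sketch below is an illustration (not part of the textbook solution) using an arbitrary \(2 \times 2\) example; NumPy's `np.poly` returns the coefficients of \(\det(\lambda I - A)\), highest degree first, so they should equal \([1,\; -\operatorname{tr} A,\; \det A]\):

```python
import numpy as np

# Arbitrary 2x2 example matrix (any values would do)
A = np.array([[2.0, 3.0],
              [1.0, 4.0]])

tr = np.trace(A)         # a11 + a22 = 6
det = np.linalg.det(A)   # a11*a22 - a12*a21 = 5

# Characteristic polynomial coefficients of det(lambda*I - A),
# highest degree first: should be [1, -tr(A), det(A)]
coeffs = np.poly(A)

print(coeffs)            # -> [ 1. -6.  5.]
print([1.0, -tr, det])   # matches lambda^2 - (tr A) lambda + det A
```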

02

Step 2: Show that the eigenvalues of a \(2 \times 2\) matrix \(A\) are both real if and only if \(\det A \le \left( \frac{\operatorname{tr} A}{2} \right)^2\).

Determine the eigenvalues of \(A\).

\(\begin{aligned}\det \left( A - \lambda I \right) &= 0\\ \lambda^2 - \left( \operatorname{tr} A \right)\lambda + \det A &= 0\\ \lambda &= \frac{\operatorname{tr} A \pm \sqrt{\left( \operatorname{tr} A \right)^2 - 4\det A}}{2}\end{aligned}\)

By the quadratic formula, both eigenvalues are real if and only if the discriminant is nonnegative:

\(\begin{aligned}\left( \operatorname{tr} A \right)^2 - 4\det A &\ge 0\\ \left( \operatorname{tr} A \right)^2 &\ge 4\det A\\ \frac{\left( \operatorname{tr} A \right)^2}{4} &\ge \det A\\ \left( \frac{\operatorname{tr} A}{2} \right)^2 &\ge \det A\end{aligned}\)

Thus, \(\det A \le \left( \frac{\operatorname{tr} A}{2} \right)^2\). Since each step above is an equivalence, it is proved that the eigenvalues of a \(2 \times 2\) matrix \(A\) are both real if and only if \(\det A \le \left( \frac{\operatorname{tr} A}{2} \right)^2\).
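The criterion can be checked against a numerical eigenvalue solver. The matrices below are arbitrary illustrative examples (not from the exercise); `B` is a rotation-type matrix whose eigenvalues are \(\pm i\):

```python
import numpy as np

def eigenvalues_real(A):
    """Criterion from the solution: real eigenvalues iff det A <= (tr A / 2)^2."""
    return np.linalg.det(A) <= (np.trace(A) / 2) ** 2

# Real eigenvalues: det = 5 <= (6/2)^2 = 9
A = np.array([[2.0, 3.0], [1.0, 4.0]])
# Complex eigenvalues: det = 1 > (0/2)^2 = 0
B = np.array([[0.0, -1.0], [1.0, 0.0]])

print(eigenvalues_real(A), np.all(np.isreal(np.linalg.eigvals(A))))  # True True
print(eigenvalues_real(B), np.all(np.isreal(np.linalg.eigvals(B))))  # False False
```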


Most popular questions from this chapter

Suppose \(A\) is diagonalizable and \(p\left( t \right)\) is the characteristic polynomial of \(A\). Define \(p\left( A \right)\) as in Exercise 5, and show that \(p\left( A \right)\) is the zero matrix. This fact, which is also true for any square matrix, is called the Cayley-Hamilton theorem.
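The Cayley-Hamilton theorem can be illustrated numerically for a hypothetical \(2 \times 2\) example, using the characteristic polynomial \(p(\lambda) = \lambda^2 - \left( \operatorname{tr} A \right)\lambda + \det A\) derived in this solution:

```python
import numpy as np

# Arbitrary diagonalizable 2x2 example (eigenvalues 1 and 5)
A = np.array([[2.0, 3.0],
              [1.0, 4.0]])

c = np.poly(A)                      # [1, -(tr A), det A]
# Cayley-Hamilton: p(A) = A^2 - (tr A) A + (det A) I is the zero matrix
pA = c[0] * (A @ A) + c[1] * A + c[2] * np.eye(2)
print(np.allclose(pA, 0))           # -> True
```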

Exercises 19–23 concern the polynomial \(p\left( t \right) = a_0 + a_1 t + \dots + a_{n-1}t^{n-1} + t^n\) and the \(n \times n\) matrix \(C_p\) called the companion matrix of \(p\): \(C_p = \begin{bmatrix} 0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & & 0 \\ \vdots & & & \ddots & \vdots \\ 0 & 0 & 0 & & 1 \\ -a_0 & -a_1 & -a_2 & \cdots & -a_{n-1} \end{bmatrix}\).

22. Let \(p\left( t \right) = a_0 + a_1 t + a_2 t^2 + t^3\), and let \(\lambda\) be a zero of \(p\).

  1. Write the companion matrix for \(p\).
  2. Explain why \(\lambda^3 = -a_0 - a_1\lambda - a_2\lambda^2\), and show that \(\left( 1, \lambda, \lambda^2 \right)\) is an eigenvector of the companion matrix for \(p\).
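Part 2 can be illustrated numerically. The cubic and its zero below are arbitrary choices for the sketch, not given in the exercise:

```python
import numpy as np

# Hypothetical cubic p(t) = 6 - 5t - 2t^2 + t^3 = (t - 1)(t + 2)(t - 3)
a0, a1, a2 = 6.0, -5.0, -2.0
Cp = np.array([[0.0, 1.0, 0.0],
               [0.0, 0.0, 1.0],
               [-a0, -a1, -a2]])      # companion matrix from the exercise

lam = 3.0                             # a zero of p
v = np.array([1.0, lam, lam**2])      # candidate eigenvector (1, lambda, lambda^2)

print(np.allclose(Cp @ v, lam * v))   # -> True
```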

a. Let \(A\) be a diagonalizable \(n \times n\) matrix. Show that if the multiplicity of an eigenvalue \(\lambda \) is \(n\), then \(A = \lambda I\).

b. Use part (a) to show that the matrix \(A = \begin{pmatrix} 3 & 1 \\ 0 & 3 \end{pmatrix}\) is not diagonalizable.
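A quick numerical illustration of part (b): the eigenvalue \(3\) has algebraic multiplicity 2, but \(A - 3I\) has rank 1, so the eigenspace is only one-dimensional; and since \(A \ne 3I\), part (a) rules out diagonalizability.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 3.0]])

vals = np.linalg.eigvals(A)
print(vals)                                # eigenvalue 3 with multiplicity 2

# Geometric multiplicity: dim Nul(A - 3I) = 2 - rank(A - 3I)
rank = np.linalg.matrix_rank(A - 3 * np.eye(2))
print(2 - rank)                            # -> 1, too few independent eigenvectors
```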

Let \(T:\mathbb{P}_2 \to \mathbb{P}_3\) be a linear transformation that maps a polynomial \(\mathbf{p}\left( t \right)\) into the polynomial \(\left( t + 5 \right)\mathbf{p}\left( t \right)\).

  1. Find the image of \(\mathbf{p}\left( t \right) = 2 - t + t^2\).
  2. Show that \(T\) is a linear transformation.
  3. Find the matrix for \(T\) relative to the bases \(\left\{ {1,t,{t^2}} \right\}\) and \(\left\{ {1,t,{t^2},{t^3}} \right\}\).
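Parts 1 and 3 can be sketched in coordinates. The matrix below follows from \(\left( t + 5 \right)t^k = 5t^k + t^{k+1}\); the example input is the polynomial from part 1:

```python
import numpy as np

# Columns are the coordinates of T(1), T(t), T(t^2) relative to {1, t, t^2, t^3},
# since (t + 5) * t^k = 5*t^k + t^(k+1)
M = np.array([[5.0, 0.0, 0.0],
              [1.0, 5.0, 0.0],
              [0.0, 1.0, 5.0],
              [0.0, 0.0, 1.0]])

# Part 1: image of p(t) = 2 - t + t^2
p = np.array([2.0, -1.0, 1.0])
print(M @ p)   # -> [10. -3.  4.  1.], i.e. 10 - 3t + 4t^2 + t^3
```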

Question: Let \(A = \begin{pmatrix} .5 & .2 & .3 \\ .3 & .8 & .3 \\ .2 & 0 & .4 \end{pmatrix}\), \({\rm{v}}_1 = \begin{pmatrix} .3 \\ .6 \\ .1 \end{pmatrix}\), \({\rm{v}}_2 = \begin{pmatrix} 1 \\ -3 \\ 2 \end{pmatrix}\), \({\rm{v}}_3 = \begin{pmatrix} -1 \\ 0 \\ 1 \end{pmatrix}\), and \({\rm{w}} = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}\).

  1. Show that \({{\rm{v}}_1}\), \({{\rm{v}}_2}\), and \({{\rm{v}}_3}\) are eigenvectors of \(A\). (Note: \(A\) is the stochastic matrix studied in Example 3 of Section 4.9.)
  2. Let \({{\rm{x}}_0}\) be any vector in \({\mathbb{R}^3}\) with non-negative entries whose sum is 1. (In section 4.9, \({{\rm{x}}_0}\) was called a probability vector.) Explain why there are constants \({c_1}\), \({c_2}\), and \({c_3}\) such that \({{\rm{x}}_0} = {c_1}{{\rm{v}}_1} + {c_2}{{\rm{v}}_2} + {c_3}{{\rm{v}}_3}\). Compute \({{\rm{w}}^T}{{\rm{x}}_0}\), and deduce that \({c_1} = 1\).
  3. For \(k = 1,2, \ldots ,\) define \({{\rm{x}}_k} = {A^k}{{\rm{x}}_0}\), with \({{\rm{x}}_0}\) as in part (b). Show that \({{\rm{x}}_k} \to {{\rm{v}}_1}\) as \(k\) increases.
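Parts 1 and 3 can be illustrated numerically. The convergence in part 3 holds because the other two eigenvalues (\(0.5\) and \(0.2\), as the check below confirms) have magnitude less than 1; the starting probability vector here is an arbitrary choice:

```python
import numpy as np

A = np.array([[.5, .2, .3],
              [.3, .8, .3],
              [.2, 0., .4]])
v1 = np.array([.3, .6, .1])
v2 = np.array([1., -3., 2.])
v3 = np.array([-1., 0., 1.])

# Part 1: v1, v2, v3 are eigenvectors with eigenvalues 1, 0.5, 0.2
print(np.allclose(A @ v1, 1.0 * v1))   # -> True
print(np.allclose(A @ v2, 0.5 * v2))   # -> True
print(np.allclose(A @ v3, 0.2 * v3))   # -> True

# Part 3: x_k = A^k x_0 converges to v1 for any probability vector x_0
x = np.array([1/3, 1/3, 1/3])          # an arbitrary probability vector
for _ in range(60):
    x = A @ x
print(np.allclose(x, v1))              # -> True
```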