Q2.4-23: Use partitioned matrices to prove by induction that the product of two lower triangular matrices is also lower triangular


Use partitioned matrices to prove by induction that the product of two lower triangular matrices is also lower triangular. [Hint: A \(\left( {k + 1} \right) \times \left( {k + 1} \right)\) matrix \({A_1}\) can be written in the form below, where \(a\) is a scalar, \(v\) is in \({\mathbb{R}^k}\), and \(A\) is a \(k \times k\) lower triangular matrix. See the study guide for help with induction.]

\({A_1} = \left[ {\begin{array}{*{20}{c}}a&{{0^T}}\\v&A\end{array}} \right]\).

Short Answer

Expert verified

By induction, the product of two lower triangular matrices is also lower triangular.

Step by step solution

01

Show the product of two lower triangular matrices

The base case is immediate: every \(1 \times 1\) matrix is trivially lower triangular, so the product of two \(1 \times 1\) lower triangular matrices is lower triangular.

Suppose that for \(n = k\), the product of two \(k \times k\) lower triangular matrices is lower triangular, and let \({A_1}\) and \({B_1}\) be \(\left( {k + 1} \right) \times \left( {k + 1} \right)\) lower triangular matrices. They can be partitioned in the form

\({A_1} = \left[ {\begin{array}{*{20}{c}}a&{{0^T}}\\v&A\end{array}} \right],\;{B_1} = \left[ {\begin{array}{*{20}{c}}b&{{0^T}}\\w&B\end{array}} \right]\).

Here \(v\) and \(w\) are in \({\mathbb{R}^k}\); \(A\) and \(B\) are \(k \times k\) lower triangular matrices, and \(a\) and \(b\) are scalars.
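The partitioned form above can be sketched numerically. This is an illustration only, not part of the proof: the scalar \(a\), vector \(v\), and block \(A\) below take arbitrary values chosen for a \(k = 2\) example.

```python
import numpy as np

# Sketch of the partition A1 = [[a, 0^T], [v, A]] for k = 2.
# The numeric values of a, v, and A are arbitrary illustrations.
a = 5.0
v = np.array([[1.0], [2.0]])           # v in R^k, as a column vector
A = np.array([[3.0, 0.0],
              [4.0, 6.0]])             # k x k lower triangular block

A1 = np.block([[np.array([[a]]), np.zeros((1, 2))],
               [v,               A]])

# A1 is (k+1) x (k+1), and every entry above the diagonal is zero,
# so it is lower triangular.
assert A1.shape == (3, 3)
assert np.allclose(A1, np.tril(A1))
```

`np.tril` zeros out everything above the diagonal, so `A1 == np.tril(A1)` is a convenient check for lower triangularity.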

02

Show by induction that the product of two lower triangular matrices is lower triangular

\(A\) and \(B\) must be lower triangular since \({A_1}\) and \({B_1}\) are lower triangular.

\(\begin{array}{c}{A_1}{B_1} = \left[ {\begin{array}{*{20}{c}}a&{{0^T}}\\v&A\end{array}} \right]\left[ {\begin{array}{*{20}{c}}b&{{0^T}}\\w&B\end{array}} \right]\\ = \left[ {\begin{array}{*{20}{c}}{ab + {0^T}w}&{a{0^T} + {0^T}B}\\{vb + Aw}&{v{0^T} + AB}\end{array}} \right]\\ = \left[ {\begin{array}{*{20}{c}}{ab}&{{0^T}}\\{bv + Aw}&{AB}\end{array}} \right]\end{array}\)

\(AB\) is lower triangular because \(A\) and \(B\) are \(k \times k\) lower triangular matrices, by the induction hypothesis. Since the upper-right block of \({A_1}{B_1}\) is \({0^T}\) and its lower-right block \(AB\) is lower triangular, \({A_1}{B_1}\) is itself lower triangular. Therefore, if the statement holds for \(n = k\), it also holds for \(n = k + 1\).

According to the principle of induction, the statement is true for all \(n \ge 1\).

Thus, by induction, the product of two lower triangular matrices is lower triangular.
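As a numerical sanity check (an illustration alongside the proof, not a substitute for it), one can verify both the block formula \({A_1}{B_1} = \left[ {\begin{array}{*{20}{c}}{ab}&{{0^T}}\\{bv + Aw}&{AB}\end{array}} \right]\) and the lower triangularity of the product for randomly generated lower triangular matrices:

```python
import numpy as np

# Random (k+1) x (k+1) lower triangular matrices built from the
# partition used in the proof: A1 = [[a, 0^T], [v, A]], B1 = [[b, 0^T], [w, B]].
rng = np.random.default_rng(0)
k = 4
a, b = rng.standard_normal(2)
v = rng.standard_normal((k, 1))
w = rng.standard_normal((k, 1))
A = np.tril(rng.standard_normal((k, k)))   # k x k lower triangular
B = np.tril(rng.standard_normal((k, k)))

A1 = np.block([[np.array([[a]]), np.zeros((1, k))], [v, A]])
B1 = np.block([[np.array([[b]]), np.zeros((1, k))], [w, B]])

prod = A1 @ B1
expected = np.block([[np.array([[a * b]]), np.zeros((1, k))],
                     [b * v + A @ w,       A @ B]])

assert np.allclose(prod, expected)       # matches the block formula
assert np.allclose(prod, np.tril(prod))  # product is lower triangular
```

Rerunning with different seeds or larger `k` exercises the same identity; the zero upper-right block of `prod` is exactly the \({0^T}\) block derived in the proof.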


