Chapter 6: Problem 8
Let \(A\) be a Hermitian matrix and let \(B=iA\). Show that \(B\) is skew Hermitian.
Short Answer
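Since \(A^{H}=A\), the conjugate transpose of \(B\) works out in one line:

```latex
B^{H}=(iA)^{H}=\bar{i}\,A^{H}=-iA=-B ,
```

so \(B^{H}=-B\), which is exactly the definition of skew Hermitian.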
Step by step solution
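Using the definition of Hermitian (\(A^{H}=A\)) and the scalar rule \((cM)^{H}=\bar{c}\,M^{H}\), the claim follows from a short chain of equalities:

```latex
\begin{aligned}
B^{H} &= (iA)^{H} \\
      &= \bar{i}\,A^{H} && \text{since } (cM)^{H}=\bar{c}\,M^{H} \\
      &= -i\,A          && \text{since } \bar{i}=-i \text{ and } A^{H}=A \\
      &= -B .
\end{aligned}
```

Because \(B^{H}=-B\), the matrix \(B\) is skew Hermitian.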
Key Concepts
A matrix \(A\) is Hermitian if \(A^{H}=A\) and skew Hermitian if \(A^{H}=-A\), where \(A^{H}=\bar{A}^{T}\) denotes the conjugate transpose. For a scalar \(c\) and matrix \(M\), the conjugate transpose satisfies \((cM)^{H}=\bar{c}\,M^{H}\).
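As a quick numerical sanity check (not a proof), one can verify the identity for a sample Hermitian matrix; the matrix below is an illustrative choice.

```python
import numpy as np

# Sample Hermitian matrix: A equals its own conjugate transpose.
A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])
assert np.allclose(A, A.conj().T)  # confirm A is Hermitian

B = 1j * A

# Skew-Hermitian check: B^H should equal -B.
print(np.allclose(B.conj().T, -B))  # expect True
```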
Related exercises

Show that \[ \langle\mathbf{z}, \mathbf{w}\rangle=\mathbf{w}^{H} \mathbf{z} \] defines an inner product on \(\mathbb{C}^{n}\).
Let \(\left\{\mathbf{u}_{1}, \mathbf{u}_{2}\right\}\) be an orthonormal basis for \(\mathbb{C}^{2}\), and let \(\mathbf{z}=(4+2i)\mathbf{u}_{1}+(6-5i)\mathbf{u}_{2}\). (a) What are the values of \(\mathbf{u}_{1}^{H}\mathbf{z}\), \(\mathbf{z}^{H}\mathbf{u}_{1}\), \(\mathbf{u}_{2}^{H}\mathbf{z}\), and \(\mathbf{z}^{H}\mathbf{u}_{2}\)? (b) Determine the value of \(\|\mathbf{z}\|\).
Show that if \(A\) is a symmetric positive definite matrix, then \(A\) is nonsingular and \(A^{-1}\) is also positive definite.
Let \(A\) be a Hermitian matrix with eigenvalues \(\lambda_{1} \geq \lambda_{2} \geq \cdots \geq \lambda_{n}\) and orthonormal eigenvectors \(\mathbf{u}_{1}, \ldots, \mathbf{u}_{n}\). For any nonzero vector \(\mathbf{x}\) in \(\mathbb{R}^{n}\), the Rayleigh quotient \(\rho(\mathbf{x})\) is defined by \[ \rho(\mathbf{x})=\frac{\langle A \mathbf{x}, \mathbf{x}\rangle}{\langle\mathbf{x}, \mathbf{x}\rangle}=\frac{\mathbf{x}^{H} A \mathbf{x}}{\mathbf{x}^{H} \mathbf{x}} \] (a) If \(\mathbf{x}=c_{1} \mathbf{u}_{1}+\cdots+c_{n} \mathbf{u}_{n}\), show that \[ \rho(\mathbf{x})=\frac{\left|c_{1}\right|^{2} \lambda_{1}+\left|c_{2}\right|^{2} \lambda_{2}+\cdots+\left|c_{n}\right|^{2} \lambda_{n}}{\|\mathbf{c}\|^{2}} \] (b) Show that \[ \lambda_{n} \leq \rho(\mathbf{x}) \leq \lambda_{1} \] (c) Show that \[ \max_{\mathbf{x} \neq \mathbf{0}} \rho(\mathbf{x})=\lambda_{1} \quad \text{and} \quad \min_{\mathbf{x} \neq \mathbf{0}} \rho(\mathbf{x})=\lambda_{n} \]
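The Rayleigh-quotient bounds in this exercise can be checked numerically. The sketch below builds a random Hermitian matrix (an illustrative choice, not part of the exercise); note that `numpy.linalg.eigh` returns eigenvalues in ascending order, so `lam[-1]` plays the role of \(\lambda_{1}\) and `lam[0]` the role of \(\lambda_{n}\).

```python
import numpy as np

rng = np.random.default_rng(0)

# Random Hermitian matrix: H = (M + M^H) / 2 is always Hermitian.
M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = (M + M.conj().T) / 2

lam, U = np.linalg.eigh(H)  # eigenvalues ascending, orthonormal columns

def rayleigh(x):
    """Rayleigh quotient x^H H x / x^H x (real for Hermitian H)."""
    return (x.conj() @ H @ x).real / (x.conj() @ x).real

# The quotient attains the extreme eigenvalues at the eigenvectors...
print(np.isclose(rayleigh(U[:, -1]), lam[-1]))  # expect True
print(np.isclose(rayleigh(U[:, 0]), lam[0]))    # expect True

# ...and lies between them for any nonzero x.
x = rng.normal(size=4) + 1j * rng.normal(size=4)
print(lam[0] - 1e-12 <= rayleigh(x) <= lam[-1] + 1e-12)  # expect True
```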
Let \[ A=\left(\begin{array}{ll} 0 & 1 \\ 1 & 0 \end{array}\right) \] Write \(A\) as a sum \(\lambda_{1} \mathbf{u}_{1} \mathbf{u}_{1}^{T}+\lambda_{2} \mathbf{u}_{2} \mathbf{u}_{2}^{T}\), where \(\lambda_{1}\) and \(\lambda_{2}\) are eigenvalues and \(\mathbf{u}_{1}\) and \(\mathbf{u}_{2}\) are orthonormal eigenvectors.
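For this last exercise, the spectral decomposition can be verified numerically: `numpy.linalg.eigh` returns orthonormal eigenvectors as columns, and summing the rank-one terms \(\lambda_{i}\mathbf{u}_{i}\mathbf{u}_{i}^{T}\) should reproduce \(A\).

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# eigh for symmetric matrices: eigenvalues ascending, orthonormal columns.
lam, U = np.linalg.eigh(A)
u1, u2 = U[:, 0], U[:, 1]

# Reconstruct A as the spectral sum lambda_1 u1 u1^T + lambda_2 u2 u2^T.
A_rebuilt = lam[0] * np.outer(u1, u1) + lam[1] * np.outer(u2, u2)
print(np.allclose(A, A_rebuilt))  # expect True
```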