Problem 15


If a unitary matrix \(U\) is written as \(A + iB\), where \(A\) and \(B\) are Hermitian with non-degenerate eigenvalues, show the following: (a) \(A\) and \(B\) commute; (b) \(A^2 + B^2 = I\); (c) the eigenvectors of \(A\) are also eigenvectors of \(B\); (d) the eigenvalues of \(U\) have unit modulus (as is necessary for any unitary matrix).

Short Answer

\(A\) and \(B\) commute; \(A^2 + B^2 = I\); \(A\) and \(B\) share eigenvectors; the eigenvalues of \(U\) have unit modulus.

Step by step solution

01

Show that A and B commute

Given a unitary matrix \(U = A + iB\), where \(A\) and \(B\) are Hermitian matrices, unitarity means \(U^\text{†} U = U U^\text{†} = I\). Since \(A^\text{†} = A\) and \(B^\text{†} = B\), we have \(U^\text{†} = A - iB\). Hence \[ U^\text{†} U = (A - iB)(A + iB) = A^2 + B^2 + i(AB - BA) = I \] and similarly \[ U U^\text{†} = (A + iB)(A - iB) = A^2 + B^2 - i(AB - BA) = I. \] Subtracting the second equation from the first gives \(2i(AB - BA) = 0\), so \(AB = BA\). Therefore, \(A\) and \(B\) commute.
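As a numerical sanity check (not part of the formal proof), one can build a random unitary matrix with NumPy, split it into its Hermitian parts \(A\) and \(B\), and confirm that they commute. The matrix size and seed below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random unitary matrix via QR decomposition of a complex Gaussian matrix.
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
U, _ = np.linalg.qr(M)  # Q factor of a QR decomposition is unitary

# Split U = A + iB into its Hermitian "real" and "imaginary" parts.
A = (U + U.conj().T) / 2      # A = A†
B = (U - U.conj().T) / (2j)   # B = B†

# Both parts are Hermitian, and they commute: AB = BA.
assert np.allclose(A, A.conj().T)
assert np.allclose(B, B.conj().T)
assert np.allclose(A @ B, B @ A)
```

The decomposition mirrors writing a complex number as real plus imaginary part: \(A\) and \(B\) are uniquely determined by \(U\).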
02

Prove that \(A^2 + B^2 = I\)

From the first step, we have \[ A^2 + B^2 + i(AB - BA) = I. \] Since \(AB = BA\), the imaginary term vanishes and the equation reduces to \[ A^2 + B^2 = I. \]
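The identity \(A^2 + B^2 = I\) can also be verified numerically; the construction of the random unitary below follows the same illustrative recipe as before:

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
U, _ = np.linalg.qr(M)  # random unitary matrix

# Hermitian parts of U = A + iB.
A = (U + U.conj().T) / 2
B = (U - U.conj().T) / (2j)

# With AB = BA, expanding U†U = (A - iB)(A + iB) leaves A² + B² = I.
assert np.allclose(A @ A + B @ B, np.eye(3))
```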
03

Show the eigenvectors of A are also eigenvectors of B

Since \(A\) and \(B\) commute, let \(v\) be an eigenvector of \(A\) with eigenvalue \(\lambda\), so \(Av = \lambda v\). Then \[ A(Bv) = B(Av) = \lambda (Bv), \] so \(Bv\) also lies in the \(\lambda\)-eigenspace of \(A\). Because the eigenvalues of \(A\) are non-degenerate, that eigenspace is one-dimensional, and hence \(Bv = \mu v\) for some scalar \(\mu\). Every eigenvector of \(A\) is therefore also an eigenvector of \(B\), and the two matrices can be simultaneously diagonalized.
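This shared eigenbasis can be checked numerically: diagonalizing \(A\) and expressing \(B\) in that basis should leave \(B\) diagonal too. This assumes the randomly generated \(A\) has non-degenerate eigenvalues, which holds generically:

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
U, _ = np.linalg.qr(M)  # random unitary matrix
A = (U + U.conj().T) / 2
B = (U - U.conj().T) / (2j)

# Diagonalize A with a unitary eigenbasis (eigh is the Hermitian eigensolver).
eigvals_A, V = np.linalg.eigh(A)

# In A's eigenbasis, B must also be diagonal: V's columns are eigenvectors of B.
B_in_A_basis = V.conj().T @ B @ V
off_diagonal = B_in_A_basis - np.diag(np.diag(B_in_A_basis))
assert np.allclose(off_diagonal, 0)
```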
04

Show the eigenvalues of \(U\) have unit modulus

A unitary matrix \(U\) satisfies \(U^\text{†} U = I\). Let \(v\) be an eigenvector of \(U\) with eigenvalue \(\lambda\), so \(Uv = \lambda v\). Then \[ \|v\|^2 = v^\text{†} U^\text{†} U v = (\lambda v)^\text{†} (\lambda v) = \lambda^* \lambda \|v\|^2. \] Since \(v \neq 0\), it follows that \(\lambda^* \lambda = |\lambda|^2 = 1\). Hence the eigenvalues \(\lambda\) of \(U\) have unit modulus. Equivalently, if \(v\) is a shared eigenvector of \(A\) and \(B\) with real eigenvalues \(a\) and \(b\), then \(Uv = (a + ib)v\), and part (b) gives \(a^2 + b^2 = 1\).
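A quick numerical illustration of this property, again using an arbitrary random unitary matrix:

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
U, _ = np.linalg.qr(M)  # random unitary matrix

# Every eigenvalue of a unitary matrix lies on the unit circle: |λ| = 1.
eigenvalues = np.linalg.eigvals(U)
assert np.allclose(np.abs(eigenvalues), 1.0)
```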


Key Concepts

These are the key concepts you need to understand to accurately answer the question.

Hermitian Matrices
Hermitian matrices, named after the French mathematician Charles Hermite, play a crucial role in quantum mechanics and linear algebra. A Hermitian matrix is a square matrix that is equal to its own conjugate transpose. Mathematically, a matrix \( A \) is Hermitian if \( A = A^\text{†} \).

Hermitian matrices have several important properties:
  • All eigenvalues of a Hermitian matrix are real.
  • The eigenvectors corresponding to distinct eigenvalues are orthogonal.
  • They are used to represent observable quantities in quantum mechanics, such as position and momentum.
Since matrices \( A \) and \( B \) in our problem are Hermitian, they are particularly well-behaved in terms of their eigenvalues and eigenvectors. This property is essential for proving that these matrices commute and have a particular relationship with the unitary matrix \( U \).
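These properties are easy to see on a small concrete example. The 2×2 matrix below is chosen arbitrarily for illustration:

```python
import numpy as np

# A small Hermitian matrix: equal to its own conjugate transpose.
H = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])
assert np.allclose(H, H.conj().T)

# Its eigenvalues are real, and eigenvectors belonging to
# distinct eigenvalues are orthogonal.
eigvals, eigvecs = np.linalg.eigh(H)
assert np.allclose(np.imag(eigvals), 0)
assert np.isclose(np.vdot(eigvecs[:, 0], eigvecs[:, 1]), 0)
```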
Commuting Matrices
Commuting matrices are matrices that can be multiplied in any order without affecting the result. In other words, matrices \( A \) and \( B \) commute if \( AB = BA \).

This property is important because for matrices to be simultaneously diagonalizable, they must commute. This means there exists a basis in which both matrices are diagonal.

In our exercise, showing that \( A \) and \( B \) commute relies on them being components of the unitary matrix \( U \). The equation \( (A - iB)(A + iB) = I \) simplifies to show that the imaginary parts cancel out only if \( AB = BA \). By proving that \( A \) and \( B \) commute, we can further establish relationships between their eigenvalues and eigenvectors.
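For contrast, a small example of matrices that do and do not commute, using the Pauli-type matrices that appear in a related problem below:

```python
import numpy as np

Sx = np.array([[0, 1], [1, 0]], dtype=complex)
Sz = np.array([[1, 0], [0, -1]], dtype=complex)
I = np.eye(2)

# Sx and Sz do not commute: Sx Sz ≠ Sz Sx.
assert not np.allclose(Sx @ Sz, Sz @ Sx)

# Any matrix commutes with the identity (and with itself).
assert np.allclose(Sx @ I, I @ Sx)
```

Because \(S_x\) and \(S_z\) fail to commute, no single basis diagonalizes both of them.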
Eigenvalues and Eigenvectors
Eigenvalues and eigenvectors are fundamental concepts in linear algebra. If \( A \) is a matrix, an eigenvector \( v \) and corresponding eigenvalue \( \lambda \) satisfy the equation \( Av = \lambda v \).

Key points about eigenvalues and eigenvectors:
  • Eigenvalues can be real or complex.
  • Eigenvectors corresponding to different eigenvalues are linearly independent.
  • For Hermitian matrices, all eigenvalues are real, and eigenvectors corresponding to different eigenvalues are orthogonal.
In our exercise, since \( A \) and \( B \) commute and are Hermitian, they share the same set of eigenvectors. This is because commuting Hermitian matrices can be simultaneously diagonalized. Therefore, the eigenvectors of \( A \) are also eigenvectors of \( B \), allowing a simplification in their analysis.
Unit Modulus
The term 'unit modulus' indicates that a complex number has an absolute value (or magnitude) of 1. For a unitary matrix \( U \), every eigenvalue \( \lambda \) must satisfy \( |\lambda| = 1 \).

This property is derived from the condition that unitary matrices preserve the length of vectors. If \( U \) is unitary, then \( U^\text{†}U = I \); applying this to an eigenvector of \( U \) shows that every eigenvalue \( \lambda \) satisfies \( \lambda^* \lambda = |\lambda|^2 = 1 \).

In our exercise, this unit modulus property guarantees that the eigenvalues of the unitary matrix \( U \) lie on the unit circle in the complex plane, ensuring the matrix behaves as expected in terms of preserving norms and other unitary characteristics.


Most popular questions from this chapter

The four matrices \(\mathrm{S}_{x}\), \(\mathrm{S}_{y}\), \(\mathrm{S}_{z}\) and \(\mathrm{I}\) are defined by $$ \begin{array}{ll} \mathrm{S}_{x}=\left(\begin{array}{cc} 0 & 1 \\ 1 & 0 \end{array}\right), & \mathrm{S}_{y}=\left(\begin{array}{cc} 0 & -i \\ i & 0 \end{array}\right), \\ \mathrm{S}_{z}=\left(\begin{array}{cc} 1 & 0 \\ 0 & -1 \end{array}\right), & \mathrm{I}=\left(\begin{array}{cc} 1 & 0 \\ 0 & 1 \end{array}\right), \end{array} $$ where \(i^{2}=-1\). Show that \(\mathrm{S}_{x}^{2}=\mathrm{I}\) and \(\mathrm{S}_{x} \mathrm{S}_{y}=i \mathrm{S}_{z}\), and obtain similar results by permuting \(x\), \(y\) and \(z\). Given that \(\mathbf{v}\) is a vector with Cartesian components \(\left(v_{x}, v_{y}, v_{z}\right)\), the matrix \(\mathrm{S}(\mathbf{v})\) is defined as $$ \mathrm{S}(\mathbf{v})=v_{x} \mathrm{S}_{x}+v_{y} \mathrm{S}_{y}+v_{z} \mathrm{S}_{z}. $$ Prove that, for general non-zero vectors \(\mathbf{a}\) and \(\mathbf{b}\), $$ \mathrm{S}(\mathbf{a}) \mathrm{S}(\mathbf{b})=(\mathbf{a} \cdot \mathbf{b})\,\mathrm{I}+i \mathrm{S}(\mathbf{a} \times \mathbf{b}). $$ Without further calculation, deduce that \(\mathrm{S}(\mathbf{a})\) and \(\mathrm{S}(\mathbf{b})\) commute if and only if \(\mathbf{a}\) and \(\mathbf{b}\) are parallel vectors.

Show that the quadratic surface $$ 5 x^{2}+11 y^{2}+5 z^{2}-10 y z+2 x z-10 x y=4 $$ is an ellipsoid with semi-axes of lengths 2,1 and \(0.5 .\) Find the direction of its longest axis.

Find the equation satisfied by the squares of the singular values of the matrix associated with the following over-determined set of equations: $$ \begin{aligned} 2 x+3 y+z &=0 \\ x-y-z &=1 \\ 2 x+y &=0 \\ 2 y+z &=-2 \end{aligned} $$ Show that one of the singular values is close to zero. Determine the two larger singular values by an appropriate iteration process and the smallest one by indirect calculation.

Use the stationary properties of quadratic forms to determine the maximum and minimum values taken by the expression $$ Q=5 x^{2}+4 y^{2}+4 z^{2}+2 x z+2 x y $$ on the unit sphere, \(x^{2}+y^{2}+z^{2}=1\). For what values of \(x, y\) and \(z\) do they occur?

One method of determining the nullity (and hence the rank) of an \(M \times N\) matrix \(\mathrm{A}\) is as follows.
  • Write down an augmented transpose of \(\mathrm{A}\), by adding on the right an \(N \times N\) unit matrix and thus producing an \(N \times (M+N)\) array \(\mathrm{B}\).
  • Subtract a suitable multiple of the first row of \(\mathrm{B}\) from each of the other lower rows so as to make \(B_{i1}=0\) for \(i>1\).
  • Subtract a suitable multiple of the second row (or the uppermost row that does not start with \(M\) zero values) from each of the other lower rows so as to make \(B_{i2}=0\) for \(i>2\).
  • Continue in this way until all remaining rows have zeros in the first \(M\) places. The number of such rows is equal to the nullity of \(\mathrm{A}\), and the \(N\) rightmost entries of these rows are the components of vectors that span the null space. They can be made orthogonal if they are not so already.
Use this method to show that the nullity of $$ \mathrm{A}=\left(\begin{array}{cccc} -1 & 3 & 2 & 7 \\ 3 & 10 & -6 & 17 \\ -1 & -2 & 2 & -3 \\ 2 & 3 & -4 & 4 \\ 4 & 0 & -8 & -4 \end{array}\right) $$ is 2 and that an orthogonal base for the null space of \(\mathrm{A}\) is provided by any two column matrices of the form \(\left(2+\alpha_{i} \quad -2\alpha_{i} \quad 1 \quad \alpha_{i}\right)^{\mathrm{T}}\), for which the \(\alpha_{i}\;(i=1,2)\) are real and satisfy \(6 \alpha_{1} \alpha_{2}+2\left(\alpha_{1}+\alpha_{2}\right)+5=0\).
