Chapter 6: Problem 31
Let \(A\) be a matrix whose columns all add up to a fixed constant \(\delta\). Show that \(\delta\) is an eigenvalue of \(A\).
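The key observation: if every column of \(A\) sums to \(\delta\), then \(A^{T}\mathbf{1}=\delta\mathbf{1}\), where \(\mathbf{1}\) is the all-ones vector, so \(\delta\) is an eigenvalue of \(A^{T}\); since \(A\) and \(A^{T}\) have the same characteristic polynomial, \(\delta\) is an eigenvalue of \(A\) as well. A quick numerical sketch (the matrix entries below are illustrative, not from the text):

```python
import numpy as np

# Hypothetical 3x3 matrix whose columns each sum to delta = 5.
delta = 5.0
A = np.array([[1.0, 2.0, 0.0],
              [3.0, 1.0, 4.0],
              [1.0, 2.0, 1.0]])
assert np.allclose(A.sum(axis=0), delta)

# The all-ones vector is an eigenvector of A^T with eigenvalue delta,
# since each row of A^T sums to delta.
ones = np.ones(3)
assert np.allclose(A.T @ ones, delta * ones)

# A and A^T share the same characteristic polynomial, so delta
# also appears among the eigenvalues of A.
eigvals = np.linalg.eigvals(A)
print(np.isclose(eigvals, delta).any())  # True
```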
Related Problems

Let \(A\) be an \(n \times n\) positive stochastic matrix with dominant eigenvalue \(\lambda_{1}=1\) and linearly independent eigenvectors \(\mathbf{x}_{1}, \mathbf{x}_{2}, \ldots, \mathbf{x}_{n}\), and let \(\mathbf{y}_{0}\) be an initial probability vector for a Markov chain
\[ \mathbf{y}_{0},\quad \mathbf{y}_{1}=A \mathbf{y}_{0},\quad \mathbf{y}_{2}=A \mathbf{y}_{1},\ \ldots \]
(a) Show that \(\lambda_{1}=1\) has a positive eigenvector \(\mathbf{x}_{1}\).
(b) Show that \(\left\|\mathbf{y}_{j}\right\|_{1}=1\) for \(j=0,1,\ldots\)
(c) Show that if
\[ \mathbf{y}_{0}=c_{1} \mathbf{x}_{1}+c_{2} \mathbf{x}_{2}+\cdots+c_{n} \mathbf{x}_{n} \]
then the component \(c_{1}\) in the direction of the positive eigenvector \(\mathbf{x}_{1}\) must be nonzero.
(d) Show that the state vectors \(\mathbf{y}_{j}\) of the Markov chain converge to a steady-state vector.
(e) Show that
\[ c_{1}=\frac{1}{\left\|\mathbf{x}_{1}\right\|_{1}} \]
and hence the steady-state vector is independent of the initial probability vector \(\mathbf{y}_{0}\).
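Parts (b), (d), and (e) can be checked numerically: iterating any initial probability vector under a positive stochastic matrix keeps the 1-norm equal to 1 and converges to the same steady state. A minimal sketch, with an illustrative matrix of my own choosing:

```python
import numpy as np

# A hypothetical 3x3 positive stochastic matrix (each column sums to 1).
A = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.5, 0.2],
              [0.2, 0.3, 0.5]])

def iterate(y0, steps=100):
    """Run the Markov chain y_{j+1} = A y_j for a fixed number of steps."""
    y = y0
    for _ in range(steps):
        y = A @ y
    return y

# Two different initial probability vectors converge to the same
# steady-state vector, illustrating parts (d) and (e).
s1 = iterate(np.array([1.0, 0.0, 0.0]))
s2 = iterate(np.array([0.2, 0.3, 0.5]))
print(np.allclose(s1, s2))        # True
print(np.isclose(s1.sum(), 1.0))  # part (b): the 1-norm stays 1
```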
Given that
\[ A=\left(\begin{array}{rrr} 4 & 0 & 0 \\ 0 & 1 & i \\ 0 & -i & 1 \end{array}\right) \]
find a matrix \(B\) such that \(B^{H} B=A\).
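One way to produce such a \(B\) (a sketch, not necessarily the book's intended construction): since \(A\) is Hermitian positive semidefinite, it has a unitary eigendecomposition \(A=UDU^{H}\) with \(D\ge 0\), and \(B=D^{1/2}U^{H}\) satisfies \(B^{H}B=UD^{1/2}D^{1/2}U^{H}=A\).

```python
import numpy as np

A = np.array([[4, 0, 0],
              [0, 1, 1j],
              [0, -1j, 1]], dtype=complex)

# A is Hermitian, so eigh gives real eigenvalues and unitary U.
w, U = np.linalg.eigh(A)
w = np.clip(w, 0, None)  # guard against tiny negative round-off

# B = sqrt(D) U^H satisfies B^H B = U D U^H = A.
B = np.diag(np.sqrt(w)) @ U.conj().T
print(np.allclose(B.conj().T @ B, A))  # True
```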
Let \(A\) be an \(n \times n\) symmetric positive definite matrix. For each \(\mathbf{x}, \mathbf{y} \in \mathbb{R}^{n}\), define
\[ \langle\mathbf{x}, \mathbf{y}\rangle=\mathbf{x}^{T} A \mathbf{y} \]
Show that \(\langle\cdot,\cdot\rangle\) defines an inner product on \(\mathbb{R}^{n}\).
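The inner-product axioms (symmetry, linearity in the first argument, positivity) can be spot-checked numerically. A sketch, using the standard fact that \(M^{T}M+I\) is symmetric positive definite for any real \(M\):

```python
import numpy as np

rng = np.random.default_rng(0)

# A hypothetical symmetric positive definite matrix.
M = rng.standard_normal((4, 4))
A = M.T @ M + np.eye(4)

def inner(x, y):
    """<x, y> = x^T A y."""
    return x @ A @ y

x, y, z = rng.standard_normal((3, 4))
a, b = 2.0, -3.0

print(np.isclose(inner(x, y), inner(y, x)))  # symmetry (A is symmetric)
print(np.isclose(inner(a*x + b*y, z),
                 a*inner(x, z) + b*inner(y, z)))  # linearity
print(inner(x, x) > 0)  # positivity (A is positive definite)
```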
Let \(A\) be a nonsingular \(n \times n\) matrix, and suppose that \(A=L_{1} D_{1} U_{1}=L_{2} D_{2} U_{2}\), where \(L_{1}\) and \(L_{2}\) are lower triangular, \(D_{1}\) and \(D_{2}\) are diagonal, \(U_{1}\) and \(U_{2}\) are upper triangular, and \(L_{1}, L_{2}, U_{1}, U_{2}\) all have 1's along the diagonal. Show that \(L_{1}=L_{2}\), \(D_{1}=D_{2}\), and \(U_{1}=U_{2}\). [Hint: \(L_{2}^{-1}\) is lower triangular and \(U_{1}^{-1}\) is upper triangular. Compare both sides of the equation \(D_{2}^{-1} L_{2}^{-1} L_{1} D_{1}=U_{2} U_{1}^{-1}\).]
Let \(\{\mathbf{u}_{1}, \ldots, \mathbf{u}_{n}\}\) be an orthonormal basis for a complex inner product space \(V\), and let
\[ \begin{array}{l} \mathbf{z}=a_{1} \mathbf{u}_{1}+a_{2} \mathbf{u}_{2}+\cdots+a_{n} \mathbf{u}_{n} \\ \mathbf{w}=b_{1} \mathbf{u}_{1}+b_{2} \mathbf{u}_{2}+\cdots+b_{n} \mathbf{u}_{n} \end{array} \]
Show that
\[ \langle\mathbf{z}, \mathbf{w}\rangle=\sum_{i=1}^{n} \bar{b}_{i} a_{i} \]
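The identity can be illustrated concretely in \(\mathbb{C}^{n}\) with the standard inner product \(\langle\mathbf{z},\mathbf{w}\rangle=\mathbf{w}^{H}\mathbf{z}\), taking the orthonormal basis to be the columns of a random unitary matrix (a sketch; the choice of basis is illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Columns of Q from a QR factorization form an orthonormal basis of C^n.
Z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Q, _ = np.linalg.qr(Z)

# Coordinates a, b of z and w in that basis.
a = rng.standard_normal(n) + 1j * rng.standard_normal(n)
b = rng.standard_normal(n) + 1j * rng.standard_normal(n)
z = Q @ a
w = Q @ b

# <z, w> = w^H z collapses to sum_i conj(b_i) a_i because Q^H Q = I.
lhs = np.vdot(w, z)  # vdot conjugates its first argument
rhs = np.sum(np.conj(b) * a)
print(np.allclose(lhs, rhs))  # True
```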