Chapter 14: Problem 331
a) The vectors \(\{(1,0),(0,1)\}\) form an orthonormal basis for \(\mathrm{R}^{2}\). Find another orthonormal basis for \(\mathrm{R}^{2}\). b) The vectors \(\mathrm{u}_{1}=(1,1,1,1)\), \(\mathrm{u}_{2}=(1,-1,1,-1)\), and \(\mathrm{u}_{3}=(1,2,-1,-2)\) are orthogonal. Orthonormalize them.
Short Answer
Expert verified
a) Another orthonormal basis for ℝ² is {(\( \sqrt{2}/2 \), \( \sqrt{2}/2 \)), (-\( \sqrt{2}/2 \), \( \sqrt{2}/2 \))}.
b) The orthonormal basis obtained from the given orthogonal vectors is: {(1/2, 1/2, 1/2, 1/2), (1/2, -1/2, 1/2, -1/2), (1/√10, 2/√10, -1/√10, -2/√10)}.
Step by step solution
01
a) Finding Another Orthonormal Basis for ℝ²
To find another orthonormal basis, we can use a rotation approach. Let's consider a rotation by an angle θ.
For an angle θ = 45° or π/4, we can apply the rotation matrix:
\[R(\theta) = \begin{bmatrix} \cos(\theta) & -\sin(\theta) \\ \sin(\theta) & \cos(\theta) \end{bmatrix}\]
and apply it to the given orthonormal basis vectors (1,0) and (0,1).
1. Find the coordinates of the new basis vectors v₁ and v₂:
\[v_1 = R(\theta) \cdot \begin{bmatrix} 1 \\ 0 \end{bmatrix} = \begin{bmatrix} \cos(\theta) \\ \sin(\theta) \end{bmatrix}\]
With θ = π/4, we have:
\[v_1 = \begin{bmatrix} \sqrt{2}/2 \\ \sqrt{2}/2 \end{bmatrix}\]
2. Calculate v₂ in a similar manner:
\[v_2 = R(\theta) \cdot \begin{bmatrix} 0 \\ 1 \end{bmatrix} = \begin{bmatrix} -\sin(\theta) \\ \cos(\theta) \end{bmatrix}\]
With θ = π/4, we have:
\[v_2 = \begin{bmatrix} -\sqrt{2}/2 \\ \sqrt{2}/2 \end{bmatrix}\]
So, another orthonormal basis for ℝ² is {(\( \sqrt{2}/2 \), \( \sqrt{2}/2 \)), (-\( \sqrt{2}/2 \), \( \sqrt{2}/2 \))}.
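The rotation above can be cross-checked numerically. The sketch below is a minimal pure-Python version (the helper name `rotate` is ours, not from the text): it applies \(R(\pi/4)\) to the standard basis and verifies that the resulting pair is orthonormal.

```python
import math

def rotate(theta, v):
    """Apply the 2x2 rotation matrix R(theta) to a vector v = (x, y)."""
    x, y = v
    return (math.cos(theta) * x - math.sin(theta) * y,
            math.sin(theta) * x + math.cos(theta) * y)

theta = math.pi / 4
v1 = rotate(theta, (1.0, 0.0))  # (sqrt(2)/2,  sqrt(2)/2)
v2 = rotate(theta, (0.0, 1.0))  # (-sqrt(2)/2, sqrt(2)/2)

# Orthonormality checks: zero dot product, unit lengths.
dot = v1[0] * v2[0] + v1[1] * v2[1]
print(abs(dot) < 1e-12)
print(abs(math.hypot(*v1) - 1.0) < 1e-12)
print(abs(math.hypot(*v2) - 1.0) < 1e-12)
```

Any angle θ works here, not just π/4: rotation preserves lengths and angles, so every rotation of an orthonormal basis is again an orthonormal basis.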
02
b) Orthonormalizing the Given Orthogonal Vectors u₁, u₂, and u₃
We will now use the Gram-Schmidt process to orthonormalize the given vectors. Since they are already mutually orthogonal, every projection term will vanish, and the process reduces to normalizing each vector.
1. Normalize u₁:
\[v_1 = \frac{u_1}{||u_1||} = \frac{\begin{bmatrix} 1 \\ 1 \\ 1 \\ 1 \end{bmatrix}}{\sqrt{1^2+1^2+1^2+1^2}} = \frac{\begin{bmatrix} 1 \\ 1 \\ 1 \\ 1 \end{bmatrix}}{2}\]
So, \(v_1 = (\frac{1}{2}, \frac{1}{2}, \frac{1}{2}, \frac{1}{2})\).
2. Define w₂ = u₂ - (u₂⋅v₁)v₁ and normalize it to obtain v₂:
\[w_2 = \begin{bmatrix} 1 \\ -1 \\ 1 \\ -1 \end{bmatrix} - \left(\begin{bmatrix} 1 \\ -1 \\ 1 \\ -1 \end{bmatrix} \cdot \begin{bmatrix} 1/2 \\ 1/2 \\ 1/2 \\ 1/2 \end{bmatrix}\right)\begin{bmatrix} 1/2 \\ 1/2 \\ 1/2 \\ 1/2 \end{bmatrix}\]
\[w_2 = \begin{bmatrix} 1 \\ -1 \\ 1 \\ -1 \end{bmatrix} - 0\begin{bmatrix} 1/2 \\ 1/2 \\ 1/2 \\ 1/2 \end{bmatrix} = \begin{bmatrix} 1 \\ -1 \\ 1 \\ -1 \end{bmatrix}\]
Now, normalize w₂ to obtain v₂:
\[v_2 = \frac{w_2}{||w_2||} = \frac{\begin{bmatrix} 1 \\ -1 \\ 1 \\ -1 \end{bmatrix}}{\sqrt{1^2+(-1)^2+1^2+(-1)^2}} = \frac{\begin{bmatrix} 1 \\ -1 \\ 1 \\ -1 \end{bmatrix}}{2}\]
So, \(v_2 = (\frac{1}{2}, -\frac{1}{2}, \frac{1}{2}, -\frac{1}{2})\).
3. Define w₃ = u₃ - (u₃⋅v₁)v₁ - (u₃⋅v₂)v₂ and normalize it to obtain v₃:
\[w_3 = \begin{bmatrix} 1 \\ 2 \\ -1 \\ -2 \end{bmatrix} - \left(\begin{bmatrix} 1 \\ 2 \\ -1 \\ -2 \end{bmatrix} \cdot \begin{bmatrix} 1/2 \\ 1/2 \\ 1/2 \\ 1/2 \end{bmatrix}\right)\begin{bmatrix} 1/2 \\ 1/2 \\ 1/2 \\ 1/2 \end{bmatrix} - \left(\begin{bmatrix} 1 \\ 2 \\ -1 \\ -2 \end{bmatrix} \cdot \begin{bmatrix} 1/2 \\ -1/2 \\ 1/2 \\ -1/2 \end{bmatrix}\right)\begin{bmatrix} 1/2 \\ -1/2 \\ 1/2 \\ -1/2 \end{bmatrix}\]
Both dot products vanish, since \(u_3 \cdot v_1 = \frac{1}{2}(1+2-1-2) = 0\) and \(u_3 \cdot v_2 = \frac{1}{2}(1-2-1+2) = 0\). Therefore:
\[w_3 = \begin{bmatrix} 1 \\ 2 \\ -1 \\ -2 \end{bmatrix} - 0\begin{bmatrix} 1/2 \\ 1/2 \\ 1/2 \\ 1/2 \end{bmatrix} - 0\begin{bmatrix} 1/2 \\ -1/2 \\ 1/2 \\ -1/2 \end{bmatrix} = \begin{bmatrix} 1 \\ 2 \\ -1 \\ -2 \end{bmatrix}\]
Now normalize w₃ to obtain v₃:
\[v_3 = \frac{w_3}{||w_3||} = \frac{\begin{bmatrix} 1 \\ 2 \\ -1 \\ -2 \end{bmatrix}}{\sqrt{1^2+2^2+(-1)^2+(-2)^2}} = \frac{\begin{bmatrix} 1 \\ 2 \\ -1 \\ -2 \end{bmatrix}}{\sqrt{10}}\]
So, \(v_3 = (\frac{1}{\sqrt{10}}, \frac{2}{\sqrt{10}}, -\frac{1}{\sqrt{10}}, -\frac{2}{\sqrt{10}})\).
The orthonormal basis obtained from the given orthogonal vectors is: {(1/2, 1/2, 1/2, 1/2), (1/2, -1/2, 1/2, -1/2), (1/√10, 2/√10, -1/√10, -2/√10)}.
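The computation can be cross-checked with a minimal pure-Python Gram-Schmidt (the helper names `dot`, `normalize`, and `gram_schmidt` are ours). Because u₁, u₂, u₃ are mutually orthogonal, every projection coefficient is zero and each output vector is simply \(u_i / \|u_i\|\).

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return [x / n for x in v]

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: return an orthonormal basis for the span."""
    basis = []
    for u in vectors:
        w = list(u)
        for v in basis:
            c = dot(u, v)                      # projection coefficient u.v
            w = [wi - c * vi for wi, vi in zip(w, v)]
        basis.append(normalize(w))
    return basis

u1 = [1, 1, 1, 1]
u2 = [1, -1, 1, -1]
u3 = [1, 2, -1, -2]
v1, v2, v3 = gram_schmidt([u1, u2, u3])

# All projections vanish, so v3 is just u3 / sqrt(10).
print(v3)
print(dot(v1, v2), dot(v1, v3), dot(v2, v3))  # all (numerically) zero
```

Running this confirms \(v_3 = (1, 2, -1, -2)/\sqrt{10}\) and that all pairwise dot products of the output are zero.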
Key Concepts
These are the key concepts you need to understand to accurately answer the question.
Rotation Matrix
A rotation matrix is a fundamental tool in linear algebra. It helps in rotating points and vectors in a coordinate plane. When rotating vectors in \( \mathrm{R}^2 \), the rotation matrix can be written as:\[R(\theta) = \begin{bmatrix} \cos(\theta) & -\sin(\theta) \\ \sin(\theta) & \cos(\theta) \end{bmatrix}\]
To get a new orthonormal basis using the rotation matrix, multiply the rotation matrix by the original basis vectors. For example, rotating vectors by 45 degrees (\( \theta = \frac{\pi}{4} \)):
- The first vector is transformed to \( (\sqrt{2}/2, \sqrt{2}/2) \)
- The second vector becomes \( (-\sqrt{2}/2, \sqrt{2}/2) \)
Gram-Schmidt Orthonormalization
Gram-Schmidt orthonormalization is a method used to convert a set of linearly independent vectors into an orthonormal set. Even if the vectors are initially orthogonal, this process still scales them to unit vectors.
In the process, you start with the given vectors, like \( u_1, u_2 \), and \( u_3 \) in \( \mathrm{R}^n \). For each vector:
- Normalize the first vector to create the first orthonormal vector.
- Subtract from each successive vector its projections onto the orthonormal vectors found so far, eliminating its components in those directions.
- Normalize the resultant to get orthonormal vectors.
Orthogonal Vectors
Orthogonal vectors are vectors that meet at right angles in space. Mathematically, two vectors are orthogonal if their dot product is zero: \( \mathbf{a} \cdot \mathbf{b} = 0 \).
The concept of orthogonality is pivotal in forming an orthonormal basis.
- Begin with orthogonal vectors, e.g., \( u_1, u_2 \), and \( u_3 \), to simplify orthonormalization tasks.
- You can use Gram-Schmidt to adjust lengths, transforming orthogonal to orthonormal vectors.
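The orthogonality of the vectors in this problem is easy to confirm directly from the dot-product criterion. A quick pure-Python check (the helper name `dot` is ours):

```python
def dot(a, b):
    """Dot product of two equal-length vectors."""
    return sum(x * y for x, y in zip(a, b))

u1 = [1, 1, 1, 1]
u2 = [1, -1, 1, -1]
u3 = [1, 2, -1, -2]

# All pairwise dot products are zero, so the set is mutually orthogonal.
print(dot(u1, u2), dot(u1, u3), dot(u2, u3))  # 0 0 0
```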
Normalization
Normalization is the process of adjusting the length of a vector to make it a unit vector, which means its magnitude is one.
This is crucial when forming an orthonormal basis. The formula to normalize a vector \( \textbf{u} \) is:\[\textbf{v} = \frac{\textbf{u}}{||\textbf{u}||}\]Here \( ||\textbf{u}|| \) is the magnitude or length of the vector, calculated using the square root of the sum of the squares of its components.
- For example, the vector \( \textbf{u} = (1, 1, 1, 1) \) becomes \( \frac{1}{2} (1, 1, 1, 1) \) after normalization.
- Together with mutual orthogonality, ensuring each vector in a set has a magnitude of one is what makes the set an orthonormal basis.
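The normalization formula translates directly into code. A minimal sketch for the example vector above:

```python
import math

u = [1, 1, 1, 1]
norm = math.sqrt(sum(x * x for x in u))  # ||u|| = sqrt(1+1+1+1) = 2
v = [x / norm for x in u]                # v = u / ||u||

print(v)  # [0.5, 0.5, 0.5, 0.5]
print(math.sqrt(sum(x * x for x in v)))  # 1.0 -- v is a unit vector
```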