Orthogonal Bases

Let V be a vector space with a positive definite scalar product \langle\ ,\ \rangle. A basis \{v_1,\cdots,v_n\} of V is said to be orthogonal if \langle v_i,v_j\rangle=0 whenever i\ne j. If, in addition, ||v_i||=1 for all i=1,\cdots,n, then the basis is said to be orthonormal.

Example. The standard unit vectors E_1,\cdots,E_n of \mathbb{R}^n form an orthonormal basis of \mathbb{R}^n.

Why is having an orthonormal basis a big deal? To answer this question, suppose that e_1,\cdots,e_n is an orthonormal basis of a vector space V. Let v,w\in V. Then
\begin{align*} v&=v_1e_1+\cdots+v_ne_n,\\ w&=w_1e_1+\cdots+w_ne_n. \end{align*}
Since \langle e_i,e_j\rangle=\delta_{ij},
\langle v,w\rangle=v_1w_1+\cdots+v_nw_n=v\cdot w.
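In more detail, this follows from the bilinearity of the scalar product:
\langle v,w\rangle=\Big\langle\sum_{i=1}^nv_ie_i,\sum_{j=1}^nw_je_j\Big\rangle=\sum_{i=1}^n\sum_{j=1}^nv_iw_j\langle e_i,e_j\rangle=\sum_{i=1}^nv_iw_i.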
Hence, once an orthonormal basis is given, the scalar product \langle\ ,\ \rangle is identified with the dot product. The next question, then, is: can we always come up with an orthonormal basis? The answer is affirmative. Given any basis, we can construct an orthogonal basis through a process called the Gram-Schmidt orthogonalization process; normalizing each vector then yields an orthonormal basis. Here is how it works. Let w_1,\cdots,w_n be a basis of a vector space V. Let v_1=w_1 and
v_2=w_2-\frac{\langle w_2,v_1\rangle}{\langle v_1,v_1\rangle}v_1.
Then v_2 is perpendicular to v_1. Note that if w_2 is already perpendicular to v_1=w_1, then v_2=w_2.
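To verify that v_2 is indeed perpendicular to v_1, take the scalar product and use bilinearity:
\langle v_2,v_1\rangle=\langle w_2,v_1\rangle-\frac{\langle w_2,v_1\rangle}{\langle v_1,v_1\rangle}\langle v_1,v_1\rangle=0.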

[Figure: Gram-Schmidt Orthogonalization Process]

Let
v_3=w_3-\frac{\langle w_3,v_1\rangle}{\langle v_1,v_1\rangle}v_1-\frac{\langle w_3,v_2\rangle}{\langle v_2,v_2\rangle}v_2.
Then v_3 is perpendicular to both v_1 and v_2, as seen in the following figure.
Continuing this process, we have
v_n=w_n-\frac{\langle w_n,v_1\rangle}{\langle v_1,v_1\rangle}v_1-\cdots-\frac{\langle w_n,v_{n-1}\rangle}{\langle v_{n-1},v_{n-1}\rangle}v_{n-1}
and v_1,\cdots,v_n are mutually perpendicular.

[Figure: Gram-Schmidt Orthogonalization Process]
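For readers who want to experiment, here is a minimal NumPy sketch of the process just described, with the dot product as the scalar product. The function name gram_schmidt and the normalize flag are illustrative choices of mine, not part of the discussion above.

import numpy as np

def gram_schmidt(w, normalize=True):
    # Orthogonalize the rows of w (assumed linearly independent).
    vs = []
    for wk in np.asarray(w, dtype=float):
        vk = wk.copy()
        # subtract the component of wk along each previously constructed vector:
        # vk = wk - sum_i (<wk, v_i>/<v_i, v_i>) v_i
        for v in vs:
            vk -= (wk @ v) / (v @ v) * v
        vs.append(vk)
    if normalize:
        vs = [v / np.linalg.norm(v) for v in vs]
    return np.array(vs)

# quick check on a random basis of R^4: the rows come out orthonormal
Q = gram_schmidt(np.random.default_rng(0).normal(size=(4, 4)))
print(np.allclose(Q @ Q.T, np.eye(4)))  # True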

Therefore, the following theorem holds.

Theorem. Let V\ne\{O\} be a finite dimensional vector space with a positive definite scalar product. Then V has an orthonormal basis.

Example. Find an orthonormal basis for the vector space generated by
A=(1,1,0,1),\ B=(1,-2,0,0),\ C=(1,0,-1,2).
Here the scalar product is the dot product.

Solution. Let
\begin{align*} A'&=A,\\ B'&=B-\frac{B\cdot A'}{A'\cdot A'}A'\\ &=\frac{1}{3}(4,-5,0,1),\\ C'&=C-\frac{C\cdot A'}{A'\cdot A'}A'-\frac{C\cdot B'}{B'\cdot B'}B'\\ &=\frac{1}{7}(-4,-2,-7,6). \end{align*}
Then A',B',C' form an orthogonal basis. We obtain an orthonormal basis by normalizing each basis member:
\begin{align*} \frac{A'}{||A'||}&=\frac{1}{\sqrt{3}}(1,1,0,1),\\ \frac{B'}{||B'||}&=\frac{1}{\sqrt{42}}(4,-5,0,1),\\ \frac{C'}{||C'||}&=\frac{1}{\sqrt{105}}(-4,-2,-7,6). \end{align*}
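As a quick numerical sanity check (a small sketch, assuming NumPy is available), the three normalized vectors above should be mutually perpendicular unit vectors, so the matrix of their pairwise dot products is the identity:

import numpy as np

# the orthonormal vectors computed above, one per row
U = np.array([[1, 1, 0, 1],
              [4, -5, 0, 1],
              [-4, -2, -7, 6]], dtype=float)
U /= np.array([np.sqrt(3), np.sqrt(42), np.sqrt(105)])[:, None]

print(np.round(U @ U.T, 10))  # 3x3 identity matrix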

Theorem. Let V be a vector space of dimension n with a positive definite scalar product \langle\ ,\ \rangle. Let \{w_1,\cdots,w_r,u_1,\cdots,u_s\} with r+s=n be an orthonormal basis of V. Let W be the subspace generated by w_1,\cdots,w_r and let U be the subspace generated by u_1,\cdots,u_s. Then U=W^{\perp} and \dim V=\dim W+\dim W^{\perp}.
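One way to see why U=W^{\perp}: since the basis is orthonormal, every v\in V can be written as
v=\sum_{i=1}^r\langle v,w_i\rangle w_i+\sum_{j=1}^s\langle v,u_j\rangle u_j,
so v lies in W^{\perp} exactly when every coefficient \langle v,w_i\rangle vanishes, that is, exactly when v lies in U. Counting basis vectors then gives \dim V=r+s=\dim W+\dim W^{\perp}.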
