Orthogonal Sets and Projection#
Today we deepen our study of geometry.
In the last lecture we focused on points, lines, and angles; now we take on more challenging geometric notions that bring in sets of vectors and subspaces.
Within this realm, we will focus on orthogonality and a new notion called projection.
First of all, we'll study the properties of sets of orthogonal vectors.
As we will see, these sets are particularly convenient to work with.
Orthogonal Sets#
A set of vectors \(\{\mathbf{u}_1,\dots,\mathbf{u}_p\}\) in \(\mathbb{R}^n\) is said to be an orthogonal set if each pair of distinct vectors from the set is orthogonal, i.e., if \(\mathbf{u}_i^T\mathbf{u}_j = 0\) whenever \(i \neq j.\)
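To make the definition concrete, here is a minimal NumPy sketch of such a check (the function name `is_orthogonal_set` and the tolerance `tol` are our own choices, not part of the lecture): it simply tests every pair of distinct vectors for a zero dot product.

```python
import numpy as np

def is_orthogonal_set(vectors, tol=1e-12):
    """Return True if every pair of distinct vectors has a (near-)zero dot product."""
    return all(abs(np.dot(vectors[i], vectors[j])) <= tol
               for i in range(len(vectors))
               for j in range(i + 1, len(vectors)))

# Sanity check: the standard basis vectors of R^3 form an orthogonal set.
e1, e2, e3 = np.eye(3)
print(is_orthogonal_set([e1, e2, e3]))   # True
```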
Example. Show that \(\{\mathbf{u}_1,\mathbf{u}_2,\mathbf{u}_3\}\) is an orthogonal set, where
Solution. Consider the three possible pairs of distinct vectors, namely, \(\{\mathbf{u}_1,\mathbf{u}_2\}, \{\mathbf{u}_1,\mathbf{u}_3\},\) and \(\{\mathbf{u}_2,\mathbf{u}_3\}.\)
Each pair of distinct vectors is orthogonal, and so \(\{\mathbf{u}_1,\mathbf{u}_2, \mathbf{u}_3\}\) is an orthogonal set.
In three-dimensional space, these three vectors describe three lines through the origin that are mutually perpendicular.
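As a numerical companion to this example, here is a short sketch using three illustrative vectors chosen for this note (not necessarily the vectors displayed above); they happen to be pairwise orthogonal, which the dot products confirm:

```python
import numpy as np

# Three illustrative vectors in R^3, chosen for this sketch.
u1 = np.array([3.0, 1.0, 1.0])
u2 = np.array([-1.0, 2.0, 1.0])
u3 = np.array([-0.5, -2.0, 3.5])

# All three pairwise dot products are zero, so the set is orthogonal.
print(np.dot(u1, u2), np.dot(u1, u3), np.dot(u2, u3))   # 0.0 0.0 0.0
```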
Orthogonal Sets Must Be Independent#
Orthogonal sets are very nice to work with. First of all, we will show that any orthogonal set of nonzero vectors must be linearly independent.
Theorem. If \(S = \{\mathbf{u}_1,\dots,\mathbf{u}_p\}\) is an orthogonal set of nonzero vectors in \(\mathbb{R}^n,\) then \(S\) is linearly independent.
Proof. We will prove that no linear combination of the vectors in \(S\) with at least one nonzero coefficient can yield the zero vector.
Our proof strategy will be:
we will show that for any linear combination of the vectors in \(S\):
if the combination is the zero vector,
then all coefficients of the combination must be zero.
Specifically:
Assume \({\bf 0} = c_1\mathbf{u}_1 + \dots + c_p\mathbf{u}_p\) for some scalars \(c_1,\dots,c_p\). Then, taking the inner product of both sides with \(\mathbf{u}_1\):

\[0 = \mathbf{u}_1^T\mathbf{0} = \mathbf{u}_1^T(c_1\mathbf{u}_1 + \dots + c_p\mathbf{u}_p) = c_1(\mathbf{u}_1^T\mathbf{u}_1) + c_2(\mathbf{u}_1^T\mathbf{u}_2) + \dots + c_p(\mathbf{u}_1^T\mathbf{u}_p).\]
Because \(\mathbf{u}_1\) is orthogonal to \(\mathbf{u}_2,\dots,\mathbf{u}_p\), every term \(\mathbf{u}_1^T\mathbf{u}_j\) with \(j > 1\) vanishes, leaving

\[0 = c_1(\mathbf{u}_1^T\mathbf{u}_1).\]
Since \(\mathbf{u}_1\) is nonzero, \(\mathbf{u}_1^T\mathbf{u}_1\) is not zero and so \(c_1 = 0\).
We can use the same kind of reasoning to show that \(c_2,\dots,c_p\) must all be zero.
In other words, there is no nonzero combination of \(\mathbf{u}_i\)’s that yields the zero vector …
… so \(S\) is linearly independent.
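To see the theorem numerically, here is a small sketch (again using illustrative vectors of our own choosing): if we stack nonzero orthogonal vectors as the columns of a matrix \(U\), the matrix has full column rank, which is exactly linear independence, and \(U^TU\) is diagonal because every cross term \(\mathbf{u}_i^T\mathbf{u}_j\) with \(i \neq j\) vanishes.

```python
import numpy as np

# Columns of U are the three illustrative orthogonal vectors used earlier.
U = np.array([[ 3.0, -1.0, -0.5],
              [ 1.0,  2.0, -2.0],
              [ 1.0,  1.0,  3.5]])

# Full column rank <=> the columns are linearly independent.
print(np.linalg.matrix_rank(U))   # 3

# U^T U is diagonal: the off-diagonal entries are the pairwise dot products,
# which are all zero; the diagonal holds the squared lengths of the columns.
print(U.T @ U)
```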
Notice that since \(S\) is a linearly independent set, it is a basis for the subspace spanned by \(S\).
This leads us to a new kind of basis.
Orthogonal Basis#
Definition. An orthogonal basis for a subspace \(W\) of \(\mathbb{R}^n\) is a basis for \(W\) that is also an orthogonal set.
For example, consider
Note that \(\mathbf{u}^T\mathbf{v} = 0.\) Hence they form an orthogonal basis for their span.
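Computationally, the claim can be checked with a small sketch (the particular \(\mathbf{u}\) and \(\mathbf{v}\) from the example are not reproduced here, so the vectors below are our own illustrative choice with \(\mathbf{u}^T\mathbf{v} = 0\)): orthogonality of nonzero vectors gives independence, so the pair is a basis for the plane it spans.

```python
import numpy as np

# Two illustrative orthogonal vectors in R^3 (our own choice for this sketch).
u = np.array([3.0, 6.0, 0.0])
v = np.array([-1.0, 0.5, 2.0])

print(np.dot(u, v))   # 0.0 -> u and v are orthogonal

# Because u and v are orthogonal and nonzero, they are linearly independent:
A = np.column_stack([u, v])
print(np.linalg.matrix_rank(A))   # 2, so {u, v} is a basis for W = Span{u, v}
```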
Here is the subspace \(W = \mbox{Span}\{\mathbf{u},\mathbf{v}\}\):