This module is part of the collection, A First Course in Electrical and Computer Engineering . The LaTeX source files for this collection were created using an optical character recognition technology, and because of this process there may be more errors than usual. Please contact us if you discover any errors.

Orthogonality. When the angle between two vectors is π/2 (90°), we say that the vectors are orthogonal. A quick look at the definition of angle (Equation 12 from "Linear Algebra: Direction Cosines") leads to this equivalent definition of orthogonality:

(x, y) = 0 ⇔ x and y are orthogonal.

For example, in Figure 1(a), the vectors x = (3, 1) and y = (-2, 6) are clearly orthogonal, and their inner product is zero:

(x, y) = 3(-2) + 1(6) = 0.

In Figure 1(b), the vectors x = (3, 1, 0), y = (-2, 6, 0), and z = (0, 0, 4) are mutually orthogonal, and the inner product between each pair is zero:

(x, y) = 3(-2) + 1(6) + 0(0) = 0
(x, z) = 3(0) + 1(0) + 0(4) = 0
(y, z) = -2(0) + 6(0) + 0(4) = 0.
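The three checks above can be carried out numerically. The sketch below defines a small helper for the inner product (the function name `inner` is our own choice, not from the text) and verifies that each pair of vectors from Figure 1(b) is orthogonal:

```python
def inner(u, v):
    """Inner product (u, v): the sum of elementwise products."""
    return sum(a * b for a, b in zip(u, v))

# The mutually orthogonal vectors from Figure 1(b).
x = [3, 1, 0]
y = [-2, 6, 0]
z = [0, 0, 4]

# Each inner product is zero, so each pair is orthogonal.
print(inner(x, y))  # 0
print(inner(x, z))  # 0
print(inner(y, z))  # 0
```

Dropping the third component of x and y recovers the two-dimensional example of Figure 1(a).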
Figure 1(a) is a Cartesian graph with x₁ as the horizontal axis and x₂ as the vertical axis, showing the vector x = (3, 1) in the first quadrant and the vector y = (-2, 6) in the second quadrant, both drawn from the origin. Figure 1(b) is a three-dimensional graph with axes x₁, x₂, and x₃, showing the vector z = (0, 0, 4) along the x₃ axis, the vector y = (-2, 6, 0) in the x₁-x₂ plane, and the vector x = (3, 1, 0) in the x₁-x₂ plane.
Orthogonality of Vectors

We can use the inner product to find the projection of one vector onto another as illustrated in Figure 2 . Geometrically we find the projection of x onto y by dropping a perpendicular from the head of x onto the line containing y . The perpendicular is the dashed line in the figure. The point where the perpendicular intersects y (or an extension of y ) is the projection of x onto y , or the component of x along y . Let's call it z .

Figure 2 is a two-dimensional graph containing three vectors in the first quadrant. The vector x leaves the origin with a steep positive slope; the vector z leaves the origin with a shallower positive slope, and the angle between z and x is labeled θ. The vector y begins at the head of z and extends along the same line further into the first quadrant.
Component of One Vector along Another

The vector z lies along y , so we may write it as the product of its norm | | z | | and its direction vector u y :

z = ||z|| u_y = ||z|| (y/||y||).   (6)

But what is the norm ||z||? From Figure 2 we see that the vector x is just z plus a vector v that is orthogonal to y:

x = z + v,   (v, y) = 0.

Therefore we may write the inner product between x and y as

(x, y) = (z + v, y) = (z, y) + (v, y) = (z, y).

But because z and y both lie along y , we may write the inner product ( x , y ) as

(x, y) = (z, y) = (||z|| u_y, ||y|| u_y) = ||z|| ||y|| (u_y, u_y) = ||z|| ||y|| ||u_y||² = ||z|| ||y||.

From this equation we may solve for ||z|| = (x, y)/||y|| and substitute this into Equation 6 to write z as

z = ||z|| (y/||y||) = [(x, y)/||y||] (y/||y||) = [(x, y)/(y, y)] y.   (10)

Equation 10 is what we wanted: an expression for the projection of x onto y in terms of x and y.
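Equation 10 translates directly into code. The sketch below (with the hypothetical helper names `inner` and `project`) computes the projection z = [(x, y)/(y, y)] y for an illustrative pair of vectors not taken from the text:

```python
def inner(u, v):
    """Inner product (u, v): the sum of elementwise products."""
    return sum(a * b for a, b in zip(u, v))

def project(x, y):
    """Projection of x onto y: z = ((x, y) / (y, y)) y  (Equation 10)."""
    c = inner(x, y) / inner(y, y)
    return [c * yi for yi in y]

# Example: projecting x = (3, 4) onto y = (1, 0) keeps only the
# horizontal component of x.
print(project([3, 4], [1, 0]))  # [3.0, 0.0]
```

Note that scaling y leaves the projection unchanged, since y appears twice in the numerator and twice in the denominator of Equation 10.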

Orthogonal Decomposition. You already know how to decompose a vector in terms of the unit coordinate vectors,

x = (x, e₁)e₁ + (x, e₂)e₂ + ⋯ + (x, eₙ)eₙ.

In this equation, (x, e_k)e_k is the component of x along e_k, or the projection of x onto e_k. But the set of unit coordinate vectors is not the only possible basis for decomposing a vector. Let's consider an arbitrary pair of orthogonal vectors x and y:

(x, y) = 0.

The sum of x and y produces a new vector w, illustrated in Figure 3, where we have used a two-dimensional drawing to represent n dimensions. The squared norm of w is

||w||² = (w, w) = (x + y, x + y) = (x, x) + (x, y) + (y, x) + (y, y) = ||x||² + ||y||².

This is the Pythagorean theorem in n dimensions! The length squared of w is just the sum of the squares of the lengths of its two orthogonal components.
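The Pythagorean identity can be checked with the orthogonal pair from Figure 1(b). The sketch below (the helper name `inner` is our own) forms w = x + y and confirms that ||w||² = ||x||² + ||y||²:

```python
def inner(u, v):
    """Inner product (u, v): the sum of elementwise products."""
    return sum(a * b for a, b in zip(u, v))

x = [3, 1, 0]
y = [-2, 6, 0]          # orthogonal to x, since (x, y) = 0
w = [a + b for a, b in zip(x, y)]   # w = x + y = (1, 7, 0)

# ||w||^2 equals ||x||^2 + ||y||^2 because the cross terms vanish.
print(inner(w, w))                 # 50
print(inner(x, x) + inner(y, y))   # 10 + 40 = 50
```

If x and y were not orthogonal, the cross terms (x, y) + (y, x) = 2(x, y) would survive, and this is the general law of cosines rather than the Pythagorean theorem.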

Figure 3 is a two-dimensional graph with three vectors beginning at the origin: a vector y extending with a strong negative slope into the second quadrant, a vector x extending with a shallow positive slope into the first quadrant, and between them the vector w = x + y, with a steeper positive slope than x.
Sum of Orthogonal Vectors

The projection of w onto x is x, and the projection of w onto y is y:

w = (1)x + (1)y.

If we scale w by a to produce the vector z = a w , the orthogonal decomposition of z is

z = aw = (a)x + (a)y.

Let's turn this argument around. Instead of building w from orthogonal vectors x and y , let's begin with arbitrary w and x and see whether we can compute an orthogonal decomposition. The projection of w onto x is found from Equation 10 :

w_x = [(w, x)/(x, x)] x.

But there must be another component of w such that w is equal to the sum of the components. Let's call the unknown component w y . Then

w = w_x + w_y.

Now, since we already know w and w_x, we find w_y to be

w_y = w - w_x = w - [(w, x)/(x, x)] x.

Interestingly, the way we have decomposed w will always produce w_x and w_y orthogonal to each other. Let's check this:

(w_x, w_y) = ([(w, x)/(x, x)] x, w - [(w, x)/(x, x)] x) = [(w, x)/(x, x)](x, w) - [(w, x)²/(x, x)²](x, x) = (w, x)²/(x, x) - (w, x)²/(x, x) = 0.

To summarize, we have taken two arbitrary vectors, w and x , and decomposed w into a component in the direction of x and a component orthogonal to x .
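The whole procedure can be sketched as a short function. The names `inner` and `decompose` are our own; the function splits an arbitrary w into a component along x and a component orthogonal to x, exactly as in the derivation above:

```python
def inner(u, v):
    """Inner product (u, v): the sum of elementwise products."""
    return sum(a * b for a, b in zip(u, v))

def decompose(w, x):
    """Split w into w_x (the projection of w onto x) and
    w_y = w - w_x (the component of w orthogonal to x)."""
    c = inner(w, x) / inner(x, x)
    w_x = [c * xi for xi in x]
    w_y = [wi - wxi for wi, wxi in zip(w, w_x)]
    return w_x, w_y

# Illustrative vectors (not from the text).
w = [2, 5]
x = [1, 1]
w_x, w_y = decompose(w, x)
print(w_x)               # [3.5, 3.5]
print(w_y)               # [-1.5, 1.5]
print(inner(w_x, w_y))   # 0.0  -- the two components are orthogonal
```

This is one step of the Gram-Schmidt procedure: repeating it against each vector in a set produces a mutually orthogonal basis.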





Source:  OpenStax, A first course in electrical and computer engineering. OpenStax CNX. Sep 14, 2009 Download for free at http://cnx.org/content/col10685/1.2
