So now we have a set of vectors and a set of scalars,
we know how to multiply vectors by scalars and
we know how to put vectors together by addition, but we need something more.
We need something to measure and compare vectors.
This is provided by the inner product, which is an additional
operator that we define for the vector space.
The inner product takes a couple of vectors and
returns a scalar, which is a measure of similarity between the two vectors.
If the inner product is zero,
this is a very important concept that we will examine in detail,
we say that the vectors are maximally different or in other words, orthogonal.
Like for scalar multiplication and addition, the inner product is defined
axiomatically and these are the properties that it will have to fulfill.
We want the inner product to be distributive with respect to
vector addition.
We want the inner product to be commutative with conjugation.
This, of course, applies when our set of scalars is complex valued.
We want the inner product to be distributive
with respect to scalar multiplication, and
we conjugate the scalar if it affects the first operand in the inner product.
We want the self inner product to be greater than or equal to zero,
which means that the self inner product is necessarily a real number.
And finally, the self inner product can be zero only if the element is
the null element for the vector addition.
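As a quick sketch, we can check these axioms numerically for the standard dot product on R2 over the real scalars (where conjugation is trivial); the helper names `inner`, `add`, and `scale` are my own, not from the lecture.

```python
# Sketch: checking the inner product axioms for the standard
# dot product on R^2 with real scalars (conjugation is trivial).

def inner(x, y):
    return x[0] * y[0] + x[1] * y[1]

def add(x, y):
    return (x[0] + y[0], x[1] + y[1])

def scale(a, x):
    return (a * x[0], a * x[1])

x, y, z = (1.0, 2.0), (3.0, -1.0), (0.5, 4.0)
a = 2.5

# Distributivity over vector addition: <x + y, z> = <x, z> + <y, z>
assert inner(add(x, y), z) == inner(x, z) + inner(y, z)
# Commutativity (no conjugate needed over the reals)
assert inner(x, y) == inner(y, x)
# Compatibility with scalar multiplication
assert inner(scale(a, x), y) == a * inner(x, y)
# Self inner product is non-negative, zero only for the null vector
assert inner(x, x) >= 0
assert inner((0.0, 0.0), (0.0, 0.0)) == 0
```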
Let's look at some examples of inner products to foster our intuition.
In R2 the inner product between two vectors is defined as the product of
the first components of the two vectors,
plus the product of the second components.
Now this is not immediately intuitive, so let's take it apart a bit.
First of all, if we look at the self inner product of a vector, we can see that it's
equal to x0 squared plus x1 squared, so the sum of the squared coordinates.
Now if we look at the graphical representation of the vector, we can see
that by Pythagoras' theorem this is equal to the squared length of the vector.
We actually have a name for this, we call this the norm of a vector.
And so the norm of a vector is the square root of the self inner product,
and it indicates the length of the vector.
So, the self inner product is a very good measure of the size of a vector
in R2 and by extension in all vector spaces.
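A minimal numerical sketch of this idea: the norm is the square root of the self inner product, and for a vector like (3, 4) it recovers the length of the hypotenuse of a 3-4-5 right triangle. The function names here are illustrative, not from the lecture.

```python
import math

# Sketch: the self inner product in R^2 equals the squared length
# given by Pythagoras' theorem, so norm(x) = sqrt(<x, x>).

def inner(x, y):
    return x[0] * y[0] + x[1] * y[1]

def norm(x):
    return math.sqrt(inner(x, x))

x = (3.0, 4.0)
print(norm(x))  # 5.0: the hypotenuse of a 3-4-5 triangle
```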
Let's go back to the general definition for the inner product.
We could use some simple trigonometry to show that
this formulation is actually equivalent to this.
So the inner product between two vectors
is equal to the norm of the first vector times the norm of the second vector
times the cosine of the angle formed by the two vectors.
Now if the two vectors have equal norm,
their inner product will be a good measure of similarity between these two vectors,
because it will go from the norm squared, when the vectors coincide and
therefore the angle alpha is zero, to zero, when the vectors are at 90 degrees and
therefore are pointing in orthogonal directions.
So, orthogonality is indeed the maximal difference
between two vectors on the plane.
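We can sketch this relation numerically: recovering the angle alpha from the inner product via the cosine formula, and checking the two extremes for unit-norm vectors. The `angle` helper is an illustrative name I'm introducing.

```python
import math

# Sketch: <x, y> = ||x|| ||y|| cos(alpha); for unit-norm vectors the
# inner product goes from 1 (alpha = 0) down to 0 (alpha = 90 degrees).

def inner(x, y):
    return x[0] * y[0] + x[1] * y[1]

def norm(x):
    return math.sqrt(inner(x, x))

def angle(x, y):
    return math.acos(inner(x, y) / (norm(x) * norm(y)))

print(round(math.degrees(angle((1, 0), (0, 1)))))  # 90: orthogonal
print(inner((1, 0), (0, 1)))  # 0: maximally different
print(inner((1, 0), (1, 0)))  # 1: identical unit vectors
```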
As another example, let's have a look at the inner product in L2 of [-1, 1].
This is usually defined as the integral of the product of the two functions
that correspond to the two vectors.
It's very easy to verify that this inner product fulfills the axioms.
Let's now use this inner product to compute the norm
of the vector x = sin of pi t.
The squared norm will be equal to the self inner product, and so
the integral from -1 to 1 of sin squared of pi t dt.
This is the area that is shaded in the picture and it's equal to one.
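As a quick numerical check, we can approximate the L2 inner product with a midpoint Riemann sum; `inner_L2` is a helper name I'm introducing as a stand-in for the integral, not part of the lecture.

```python
import math

# Sketch: checking numerically that the self inner product of
# x(t) = sin(pi t) in L2[-1, 1] is 1, using a midpoint Riemann sum.

def inner_L2(f, g, a=-1.0, b=1.0, n=100_000):
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) * g(a + (k + 0.5) * h)
               for k in range(n)) * h

x = lambda t: math.sin(math.pi * t)
print(round(inner_L2(x, x), 6))  # 1.0, so the norm sqrt(<x, x>) is 1
```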
Let's take another element of L2 of [-1, 1],
the vector y = t, so the linear ramp between -1 and 1.
If we compute the self inner product of this vector, we have
the integral from -1 to 1 of t squared dt, which is equal to two thirds.
So this vector doesn't have unit norm. If we want to normalize this vector,
we can define an alternate version where we say y =
t divided by the square root of two thirds.
We divide the vector by the norm, and this vector will now have unit norm.
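The normalization step can be sketched with the same midpoint-sum approximation of the integral (`inner_L2` is an illustrative helper, not from the lecture): the raw ramp has self inner product 2/3, and dividing by sqrt(2/3) brings it to unit norm.

```python
import math

# Sketch: the self inner product of y(t) = t in L2[-1, 1] is 2/3;
# dividing by sqrt(2/3) normalizes the ramp to unit norm.

def inner_L2(f, g, a=-1.0, b=1.0, n=100_000):
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) * g(a + (k + 0.5) * h)
               for k in range(n)) * h

y = lambda t: t
print(round(inner_L2(y, y), 6))  # 0.666667, i.e. 2/3

y_unit = lambda t: t / math.sqrt(2 / 3)
print(round(inner_L2(y_unit, y_unit), 6))  # 1.0
```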
The reason why we do that is because now we can use the inner product to compare,
in L2 of [-1, 1], the function sine of pi t
with the function t divided by the square root of two thirds.
So, here we have the first function,
here we have the second function, we compute the inner product between the two.
We have to compute this area here, the result of the inner product is 0.78.
Now remember that we're comparing unit-norm vectors, so
the inner product can range between one, in the case of maximal similarity,
and zero in the case of orthogonality.
Here we have a value of 0.78 which indicates that the linear ramp and
the sine functions are actually pretty close.
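We can reproduce this value with a midpoint Riemann sum standing in for the integral (`inner_L2` is an illustrative helper name); analytically the inner product is (2/pi) divided by sqrt(2/3), about 0.78.

```python
import math

# Sketch: comparing the unit-norm ramp t / sqrt(2/3) with sin(pi t)
# via their L2[-1, 1] inner product; the analytic value is
# (2/pi) / sqrt(2/3), approximately 0.78.

def inner_L2(f, g, a=-1.0, b=1.0, n=100_000):
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) * g(a + (k + 0.5) * h)
               for k in range(n)) * h

x = lambda t: math.sin(math.pi * t)
y = lambda t: t / math.sqrt(2 / 3)
print(round(inner_L2(x, y), 2))  # 0.78
```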
We can see the inner product between maximally dissimilar functions
if we take our usual x = sin of pi t, which is an antisymmetric function.
We take the inner product of this function with a symmetric function, like for
instance, the triangle function, 1 minus absolute value of t.
If we take the integral of the product of these two functions,
we have to sum the green area with the red area.
These two areas have opposite signs, and so the integral and, therefore,
the inner product, will be equal to zero.
So the inner product has successfully captured the fact that we will
never be able to express a symmetric function in terms of
an antisymmetric shape.
They are maximally different, they have nothing in common,
their inner product is zero.
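A quick numerical sketch of this cancellation, again approximating the integral with a midpoint sum (`inner_L2` is an illustrative helper): the antisymmetric sine against the symmetric triangle gives zero.

```python
import math

# Sketch: the antisymmetric sin(pi t) against the symmetric triangle
# 1 - |t|; the positive and negative areas cancel, so the inner
# product in L2[-1, 1] is zero.

def inner_L2(f, g, a=-1.0, b=1.0, n=100_000):
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) * g(a + (k + 0.5) * h)
               for k in range(n)) * h

x = lambda t: math.sin(math.pi * t)
tri = lambda t: 1 - abs(t)
print(abs(inner_L2(x, tri)) < 1e-9)  # True: orthogonal
```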
Another famous case of orthogonality between functions
is given by sinusoids whose frequencies are harmonically related.
So we're still in L2 of [-1, 1].
Let's pick as the first function, sin of 4 pi t, and
as the second function, sin of 5 pi t.
These two frequencies are multiples of the fundamental frequency for
the interval, which is pi.
If we compute the inner product between the two, even graphically,
we can see that we have to compute these areas here and sum them together.
Now for each red colored area you have a corresponding
green colored area that has opposite sign.
And so as you sum them together, you end up having a total inner product of zero.
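The same midpoint-sum sketch confirms this numerically (`inner_L2` is an illustrative helper, not from the lecture): the inner product of the two harmonically related sinusoids vanishes.

```python
import math

# Sketch: harmonically related sinusoids sin(4 pi t) and sin(5 pi t)
# are orthogonal in L2[-1, 1]: the signed areas cancel.

def inner_L2(f, g, a=-1.0, b=1.0, n=100_000):
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) * g(a + (k + 0.5) * h)
               for k in range(n)) * h

f4 = lambda t: math.sin(4 * math.pi * t)
f5 = lambda t: math.sin(5 * math.pi * t)
print(abs(inner_L2(f4, f5)) < 1e-6)  # True: orthogonal
```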