0:03

Now, let's introduce one very important concept: the so-called linear dependence or linear independence of a finite family of functions on a given interval.

Consider a finite set of functions, say n functions f_i(x), where i runs from 1 through n.

Such a finite set of functions f_i(x) is said to be linearly dependent on an interval I if there are constants c_i, not all zero at the same time, for which the linear combination c_1 f_1(x) + c_2 f_2(x) + ... + c_n f_n(x) is identically zero on I.

In that case, we say the family f_i(x) is linearly dependent on the interval I.

Otherwise, we say the same family is linearly independent on the interval I.

In other words, the family f_i(x) is linearly independent on the interval I if, whenever a linear combination with coefficients c_i is identically zero on I, all the coefficients c_i must be equal to zero. In that case, we say the functions f_i(x) are linearly independent on the interval I.

Let's check what this means through some simple examples.

First, suppose we have only two functions, which I will call f(x) and g(x). Then my claim is that these two functions f and g are linearly dependent on an interval I if and only if one is a constant multiple of the other.

You can confirm this very easily. We have only two functions, f and g, and suppose they are linearly dependent. Can you recall the definition? Such a finite family of functions is linearly dependent if there are two constants, which I will call a and b, not both equal to zero at the same time, whose linear combination a f + b g is identically zero on I. That is the meaning of linear dependence.

Since not both of them are zero, at least one of them must be nonzero. For example, a might be nonzero. What happens in this case? If a is not equal to zero, then you can write f as -(b/a) times g. So one of them, as you can see, is really a constant multiple of the other.

On the other hand, it might be that b is not equal to zero. Then by the same token, g is equal to -(a/b) times f. So again, one of f and g, say g, is a constant multiple of the other.

That is my claim: if you have only two functions, then they are linearly dependent on the interval I if and only if one of them is a constant multiple of the other.

As a very concrete example, consider the two functions sin 2x and sin x cos x. They look like two totally different functions, but in fact they are linearly dependent on the whole real line (-infinity, infinity), because by the double-angle formula for sine, sin 2x is the same as 2 sin x cos x. So the first function is a constant multiple of the other, and these two functions are linearly dependent on this interval.
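The double-angle example above can be checked symbolically. The sketch below (my own illustration, using sympy) verifies that the combination 1 * sin 2x + (-2) * sin x cos x vanishes identically, which is exactly the nontrivial linear combination the definition asks for:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.sin(2*x)
g = sp.sin(x)*sp.cos(x)

# The double-angle identity gives sin(2x) = 2*sin(x)*cos(x),
# so f is the constant multiple 2*g: the combination
# 1*f + (-2)*g is identically zero on (-oo, oo).
combination = sp.simplify(f - 2*g)
print(combination)  # 0
```

Since the coefficients 1 and -2 are not both zero, this confirms that the pair {sin 2x, sin x cos x} is linearly dependent on the whole real line.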

But I would like to emphasize the following: linear dependence or linear independence of a finite family of functions depends not only on the functions themselves, but also on the interval over which we test the linear dependence. What I mean by this you will understand easily from the next example.
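A standard illustration of this interval dependence (my own example, not necessarily the one the lecture goes on to use) is the pair f(x) = x and g(x) = |x|. On [0, infinity) we have |x| = x, so the pair is linearly dependent there; on the whole real line no single constant works, so the same pair is linearly independent:

```python
import sympy as sp

x = sp.symbols('x', real=True)
f = x
g = sp.Abs(x)

# On [0, oo), |x| = x, so 1*f + (-1)*g is identically zero there:
# the pair is linearly dependent on that interval.
t = sp.Symbol('t', positive=True)
on_positive = sp.simplify((f - g).subs(x, t))
print(on_positive)  # 0

# On (-oo, oo) no single constant c gives g = c*f for all x:
# the ratio g/f is 1 for x > 0 but -1 for x < 0,
# so the same pair is linearly independent on the whole line.
print((g/f).subs(x, 1), (g/f).subs(x, -1))  # 1 -1
```

So the same two functions can be dependent on one interval and independent on a larger one, which is the point being emphasized.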

In general, if there are more than two functions, in other words if n is greater than or equal to three (n being the number of functions involved), then it is not so easy to use the definition itself to check the linear dependence or linear independence of a family of functions f_1(x) through f_n(x). That is not an easy task to do.

However, things change when they are solutions of a given linear homogeneous differential equation. Can you recall differential equation (4)? In symbols, it is L(y) = 0. To be precise, what is L(y)? It is

L(y) = a_n(x) y^(n) + a_{n-1}(x) y^(n-1) + ... + a_1(x) y' + a_0(x) y,

and the equation sets this equal to 0. We are concerned with the homogeneous differential equation L(y) = 0, equation number (4).
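As a concrete instance of L(y) = 0 (my own example, not taken from the lecture), take n = 2 with a_2(x) = 1, a_1(x) = 0, a_0(x) = -1, so the equation is y'' - y = 0. The functions e^x and e^{-x} are both solutions, which can be verified symbolically:

```python
import sympy as sp

x = sp.symbols('x')

def L(y):
    # L(y) = a_2(x)*y'' + a_1(x)*y' + a_0(x)*y
    # with the sample coefficients a_2 = 1, a_1 = 0, a_0 = -1.
    return sp.diff(y, x, 2) - y

# Both exponentials satisfy the homogeneous equation L(y) = 0.
for y in (sp.exp(x), sp.exp(-x)):
    print(sp.simplify(L(y)))  # 0
```

Families of solutions like this pair are exactly the situation where the easier test described next applies.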

If a family of functions arises as solutions of this homogeneous equation, then there is a much easier test to apply, a much easier test to see whether that family of functions is linearly independent or not. To introduce that criterion, we need another piece of terminology: the Wronskian of a family of functions.

So we now consider a finite family of functions f_i(x), with i running from 1 to n, which are at least (n - 1) times differentiable on an interval I.

For this family of functions, form the n-by-n matrix built from the functions: the first row is f_1, f_2, ..., f_n; the second row is f_1', f_2', ..., f_n'; and the nth row is the (n - 1)st derivatives f_1^(n-1), f_2^(n-1), ..., f_n^(n-1).

So you have an n-by-n matrix formed by the functions f_i and their derivatives. Compute its determinant. We denote it by the symbol W(f_1, ..., f_n)(x) and call it the Wronskian of the given family of functions f_i(x). We need this terminology.
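The Wronskian construction just described can be sketched directly: build the matrix whose rows are successive derivatives, then take its determinant. The example functions e^x and e^{-x} below are my own choice for illustration:

```python
import sympy as sp

x = sp.symbols('x')
fs = [sp.exp(x), sp.exp(-x)]  # n = 2, so we need derivatives up to order n - 1 = 1
n = len(fs)

# Row i of the matrix holds the i-th derivatives of the functions
# (row 0 is the functions themselves), exactly as in the definition.
M = sp.Matrix([[sp.diff(f, x, i) for f in fs] for i in range(n)])
W = sp.simplify(M.det())
print(W)  # -2
```

The determinant here is e^x * (-e^{-x}) - e^{-x} * e^x = -2, which is nonzero; sympy also ships a built-in sp.wronskian helper that computes the same quantity.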
