>> We're now going to introduce Brownian Motion.
Brownian Motion is a very commonly used stochastic process in finance.
It is the process that underlies the Black-Scholes methodology and we're going
to discuss it now. So, let's define our Brownian Motion
first. We say that a random process or stochastic
process xt, for t greater than or equal to 0, is a Brownian motion with parameters
mu and sigma if the following conditions hold. First, for any fixed times t1 less than t2 up to tn,
the increments xt2 minus xt1, xt3 minus xt2, up to xtn minus xtn minus
1, are mutually independent. Second, for s greater than 0, xt plus s minus xt
must have a normal distribution with mean mu times s and variance sigma squared times s.
So notice mu and sigma are the parameters of the Brownian motion, and this
increment of length s, xt plus s minus xt, must
have mean mu times s, variance sigma squared times s, and be normally distributed.
And the third condition that must be satisfied is that xt is a continuous
function of t. In other words, if I were to plot Brownian Motion
through time, and we'll see this in a moment, it's actually a
lot more jagged than I've shown you here, but it never jumps.
So I can draw a path of Brownian Motion with my pen never leaving the page.
And that's what I mean when I say xt is a continuous function of t.
We say that xt is a B(mu, sigma) Brownian motion.
Mu is the drift and sigma is the volatility.
Property number one is often called the Independent Increments Property.
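In written form, the definition just described can be summarized as follows; this is simply a restatement of the three spoken conditions, using the B(mu, sigma) notation from the slide:

```latex
% Formal restatement of the definition of a B(mu, sigma) Brownian motion.
A process $\{X_t\}_{t \ge 0}$ is a $B(\mu, \sigma)$ Brownian motion if:
\begin{enumerate}
  \item (independent increments) for any fixed times $t_1 < t_2 < \dots < t_n$, the increments
        $X_{t_2} - X_{t_1},\ X_{t_3} - X_{t_2},\ \dots,\ X_{t_n} - X_{t_{n-1}}$ are mutually independent;
  \item for any $s > 0$, $X_{t+s} - X_t \sim N(\mu s,\ \sigma^2 s)$;
  \item $t \mapsto X_t$ is a continuous function of $t$.
\end{enumerate}
```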
So among the first people to introduce Brownian Motion from a
mathematical viewpoint, as we've defined it here,
were Bachelier in 1900 and Einstein in 1905. It's interesting that Bachelier,
about whom very little is known, was a French mathematician, and in fact it turns
out he has had a great role to play in finance.
He was trying to model stock prices on the Paris stock exchange way back in 1900 and
he tried to introduce the idea of a Brownian motion to do that.
So it's very interesting to see, that a concept as important as Brownian motion,
which is used throughout the physical sciences and engineering was actually
introduced by Bachelier in a financial context.
Wiener, in the 1920s, was the first to show that it actually exists as a well
defined mathematical entity. So Brownian motion is a hugely
important stochastic process, and it plays a very big role in finance as well.
Some other pieces of information: when mu equals 0 and sigma equals 1, we have
what's called a standard Brownian motion. We will use Wt to denote a standard
Brownian motion, and we also assume that it begins at 0.
So W0 is equal to 0. Note that if Xt is a B(mu, sigma) Brownian
motion and X0 equals little x, then we can write Xt equals little x plus mu times t plus
sigma times Wt, where Wt is our standard Brownian motion.
We therefore see that Xt is normally distributed with mean x plus mu times t and
variance sigma squared times t. Because of course, if Xt equals this, then
the expected value of Xt is equal to the constants x plus mu times t plus sigma times the
expected value of Wt. Wt is a standard Brownian motion, so it has
mean 0, and so this is equal to x plus mu times t. As for the variance of Xt,
well, the constants don't matter, they don't factor into the variance, so the
variance of Xt is equal to sigma squared times the variance of Wt.
And the variance of a standard Brownian motion, which has sigma equals 1, is just t, so the variance of Xt is equal
to sigma squared times t, and that's where we get this calculation from here.
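Written out, the mean and variance calculation just described is:

```latex
% Mean and variance of X_t = x + mu*t + sigma*W_t, where W_t is standard Brownian motion.
\begin{align*}
\mathbb{E}[X_t] &= x + \mu t + \sigma\, \mathbb{E}[W_t] = x + \mu t,
  && \text{since } \mathbb{E}[W_t] = 0, \\
\mathrm{Var}(X_t) &= \sigma^2\, \mathrm{Var}(W_t) = \sigma^2 t,
  && \text{since } \mathrm{Var}(W_t) = t,
\end{align*}
so that $X_t \sim N(x + \mu t,\ \sigma^2 t)$.
```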
So here's a sample path for Brownian motion.
I've simulated this Brownian motion by simulating these increments, which are
normally distributed, between t equals 0 and t equals 2 years.
So this axis represents a time period of 2 years, and I've been simulating a
Brownian motion. I've been assuming it starts at a hundred, so I'm thinking maybe of a security price, although you
wouldn't typically model a security price as a Brownian motion, but for the
purposes of this demonstration you can think of doing so.
So, that's one sample path, here's another one.
We can see there's lots of different behavior, very jagged. In fact, if I were to
zoom in here, you would see that jaggedness still up here.
The key thing to note is that these paths are continuous: even though they're
very jagged, none of them jumps. Okay, so again, I could draw one of these paths
and make sure that my pen never leaves the page.
It doesn't suddenly jump from here down to here, okay?
So Brownian motion is continuous, the paths of it are continuous.
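As an aside, here is a minimal sketch of how such a sample path might be simulated from the independent normal increments. The start value of 100 and the 2-year horizon come from the slide, while the values mu = 5 and sigma = 10 per year and the number of steps are just illustrative assumptions, since the actual parameters used for the plot aren't stated.

```python
import numpy as np
import matplotlib.pyplot as plt

def simulate_brownian_path(x0, mu, sigma, T, n_steps, rng):
    """Simulate X_t = x0 + mu*t + sigma*W_t on [0, T] via independent normal increments."""
    dt = T / n_steps
    # Each increment X_{t+dt} - X_t ~ N(mu*dt, sigma^2*dt), independent of the past.
    increments = rng.normal(loc=mu * dt, scale=sigma * np.sqrt(dt), size=n_steps)
    path = x0 + np.concatenate(([0.0], np.cumsum(increments)))
    times = np.linspace(0.0, T, n_steps + 1)
    return times, path

rng = np.random.default_rng(0)
# Illustrative parameters only: start at 100, 2-year horizon, mu = 5, sigma = 10 per year.
times, path = simulate_brownian_path(x0=100.0, mu=5.0, sigma=10.0, T=2.0, n_steps=1000, rng=rng)
plt.plot(times, path)
plt.xlabel("time (years)")
plt.ylabel("X_t")
plt.show()
```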
On this slide, I want to introduce an important fact about Brownian motion, but
before I do so, let us review what we mean by an information filtration.
For any random process, we will use Ft to denote the information available at time
t. And then Ft for all values of t greater
than or equal to 0 is called the information filtration.
And we actually discussed this in a previous module, when we spoke about and
introduced Martingales.
This quantity here, this expectation, conditional on Ft, then denotes an
expectation conditional on the time t information that's available to us.
And usually, it would be very clear what that information is.
So really, this information filtration is just a mathematical way of
describing what is intuitively obvious to us anyway.
The important fact I want to introduce is the following.
The independent increments property of Brownian motion implies that any function
of Wt plus s minus Wt is independent of Ft.
In other words, knowing all of the information available at time t
tells us nothing about the increment Wt plus s minus Wt.
So that means, in particular, that Wt plus s minus Wt is normal with mean 0 and
variance s, and that's even conditional on the time t information Ft.
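In symbols, this important fact can be written as:

```latex
% The increment after time t is independent of the time-t information F_t.
\[
W_{t+s} - W_t \ \perp\ \mathcal{F}_t,
\qquad \text{so that} \qquad
\bigl(W_{t+s} - W_t\bigr) \,\big|\, \mathcal{F}_t \ \sim\ N(0, s)
\quad \text{and} \quad
\mathbb{E}\bigl[\, W_{t+s} - W_t \mid \mathcal{F}_t \,\bigr] = 0 .
\]
```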
So let's do a calculation with Brownian
Motion. We probably won't use this calculation
during the course, but there's no problem in doing such a calculation, and it helps
improve our intuition of what's going on with the Brownian Motion.
So let's compute the expected value, conditional on time 0 information,
of Wt plus s times Ws. Well, we can use a version of the
conditional expectation identity to obtain the following.
So Wt plus s, I can rewrite this as Wt plus s minus Ws plus Ws.
Okay, and then I'm multiplying by Ws outside, so that's this second Ws out
here. I can multiply this Ws through
the term here and break it down into two terms.
I get Ws times this term, so that's what comes into the first term here, and then I
get Ws times Ws, which is Ws squared, and that goes into that term, so this goes here.
So now I've got two terms. Well, the first thing is, let's deal with
this guy first. I claim that the expected value of Ws
squared is equal to s. How do I know that? Well, I know that because of the following.
I know that s is equal to the variance of Ws, but the variance of Ws is of course
equal to the expected value of Ws squared minus the expected value of Ws, all
squared. But the expected value of Ws is equal to
0, because it's a standard Brownian Motion, so therefore the variance of Ws
is just the expected value of Ws squared, and that's equal to s, as we've seen over
here. So that handles this second term on the right-hand side of equation 9.
How about the first term? Well a version of the conditional
expectation identity implies the following.
So I want to compute the expected value of Wt plus s minus Ws times Ws, so what I'm
going to do is first condition on time s information.
So I can actually rewrite this expectation by conditioning, first of all, on time s
information. I get Wt plus s minus Ws times Ws, conditional on time s
information. This Ws term can actually come outside this inner expectation, which is
where it is over here. And I'm left, inside the inner
expectation, with Wt plus s minus Ws. But recall that important fact over here.
So let's call this star. That important fact tells me, that this
guy is normal with mean 0 and variance, in this case t.
So therefore, the expect-, and it's independent of Fs.
So therefore, this quantity here, this inner expectation, has expected value 0,
and that's where the 0 comes from, and so I get 0 here.
And so, therefore, what we've shown is that the expected value of Wt plus s
times Ws is equal to s.
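Putting the steps together, the calculation on the slide can be written out as:

```latex
% E_0[ W_{t+s} W_s ] = s, using the tower property and independent increments.
\begin{align*}
\mathbb{E}_0\bigl[ W_{t+s} W_s \bigr]
  &= \mathbb{E}_0\bigl[ (W_{t+s} - W_s) W_s + W_s^2 \bigr] \\
  &= \mathbb{E}_0\bigl[\, \mathbb{E}\bigl[ (W_{t+s} - W_s) W_s \mid \mathcal{F}_s \bigr] \bigr]
     + \mathbb{E}_0\bigl[ W_s^2 \bigr] \\
  &= \mathbb{E}_0\bigl[\, W_s\, \mathbb{E}\bigl[ W_{t+s} - W_s \mid \mathcal{F}_s \bigr] \bigr] + s
   \;=\; \mathbb{E}_0\bigl[\, W_s \cdot 0 \,\bigr] + s
   \;=\; s,
\end{align*}
where $\mathbb{E}_0[W_s^2] = \mathrm{Var}(W_s) = s$ because $\mathbb{E}[W_s] = 0$.
```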