1:16

In this lecture, lecture 5, we introduce one of the basic models in quantum physics,

namely a particle in a harmonic potential, described by energy levels and wave functions

that we know exactly. Here is the ground-state wavefunction

of the particle, at energy E = 1/2.

The square of the wavefunction gives the probability for the particle to be at position x...

And here is the first excited state, with energy 3/2; the second excited state, with energy 5/2;

and so on and so on... In a few moments, you will create all these states by yourself!
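As a sketch of how these states can be built, here is a minimal Python function (the function name is ours; units hbar = m = omega = 1) that constructs psi_0 through psi_n_max from the standard Hermite recursion:

```python
import math

def harmonic_wavefunctions(x, n_max):
    """Return [psi_0(x), ..., psi_{n_max}(x)] for the harmonic oscillator
    (units: hbar = m = omega = 1), built from the Hermite recursion."""
    psi = [math.exp(-x ** 2 / 2.0) / math.pi ** 0.25]      # ground state, E = 1/2
    if n_max >= 1:
        psi.append(math.sqrt(2.0) * x * psi[0])            # first excited state, E = 3/2
    for n in range(2, n_max + 1):
        psi.append(math.sqrt(2.0 / n) * x * psi[n - 1]
                   - math.sqrt((n - 1.0) / n) * psi[n - 2])
    return psi

# wavefunction values of the first few states at x = 1.0
print(harmonic_wavefunctions(1.0, 4))
```

Squaring any entry of the returned list gives the probability density for the particle to be at position x in that state.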

At a given temperature, these energy levels are subject to the equiprobability principle

and to the Boltzmann distribution. In the lecture, in just a few moments, we will discuss

exactly how this works, and this will lead us very quickly to the density matrix

and the celebrated Feynman path integral that describes the spread of the wavefunctions

through the fluctuations of a path.

We all know that at high temperature, the world is not really governed

by quantum physics. The essence of our approach to quantum statistical mechanics is

a certain transformation called the Trotter decomposition.

8:51

Now let's put the two pieces together, and we find that the probability to be in state

n and at position x is proportional to exp(-beta E_n) * | psi_n(x)|^2.

Before plunging into this subject, please take a moment to download, run and modify

the two programs we discussed in this section. On the coursera website, you will find the

program harmonic_wavefunction.py that implements the recursion of Hermite polynomials. There

is also the nice program harmonic_wavefunctions_check.py that checks that the Schrödinger equation

is solved, that the wavefunctions are normalized, and that they are orthogonal.
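As a sketch of the kind of checks such a program performs (our own code, not the course program itself), normalization and orthogonality can be verified with a crude Riemann sum:

```python
import math

def psi_n(x, n):
    """n-th harmonic-oscillator eigenfunction via the Hermite recursion
    (units: hbar = m = omega = 1)."""
    p_prev, p = 0.0, math.exp(-x ** 2 / 2.0) / math.pi ** 0.25
    for k in range(1, n + 1):
        p_prev, p = p, (math.sqrt(2.0 / k) * x * p
                        - math.sqrt((k - 1.0) / k) * p_prev)
    return p

# crude Riemann-sum checks on the grid x in [-8, 8]
dx = 0.01
grid = [i * dx for i in range(-800, 801)]
norm_3 = sum(psi_n(x, 3) ** 2 for x in grid) * dx               # should be ~1
overlap_03 = sum(psi_n(x, 0) * psi_n(x, 3) for x in grid) * dx  # should be ~0
print(norm_3, overlap_03)
```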

10:39

In this lecture, and in this week's homework, the wavefunctions are real-valued,

so psi∗ = psi, but in this week's tutorial, we have to take into account complex wavefunctions,

so we had better use the correct formulas from the beginning. Notice that in this equation,

we have two different types of probabilities. We have the thermal probability of the Boltzmann

distribution, and the quantum-mechanical probability of the wavefunctions: two completely separate

worlds meet in this equation.

However, the energy levels and wave functions cannot *normally* be computed, and this expression leads

nowhere, even for simple problems! To make progress, we discard the information

about the energy levels and consider what is called the (diagonal)

density matrix: the probability to be at x is proportional to the diagonal density matrix

rho(x, x, beta) = Σ_n e^(-beta E_n) psi_n(x) psi_n*(x).

We also consider a more general object, the non-diagonal density matrix, which is equal

to rho(x, x', beta) = Σ_n psi_n(x) e^(-beta E_n) psi_n*(x').

This is the central object of Quantum Statistical Mechanics. For example, the partition function

Z(beta) is given by the Trace of the density matrix.
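As a sketch (our own code, with E_n = n + 1/2 as above and hbar = m = omega = 1), we can build the diagonal density matrix from the eigenstate sum and check that its trace reproduces the known closed-form partition function Z(beta) = 1/(2 sinh(beta/2)) of the harmonic oscillator:

```python
import math

def psi_n(x, n):
    """n-th harmonic-oscillator eigenfunction via the Hermite recursion."""
    p_prev, p = 0.0, math.exp(-x ** 2 / 2.0) / math.pi ** 0.25
    for k in range(1, n + 1):
        p_prev, p = p, (math.sqrt(2.0 / k) * x * p
                        - math.sqrt((k - 1.0) / k) * p_prev)
    return p

def rho_diag(x, beta, n_max=40):
    """Diagonal density matrix rho(x, x, beta) from the eigenstate sum."""
    return sum(math.exp(-beta * (n + 0.5)) * psi_n(x, n) ** 2
               for n in range(n_max))

# the trace of the density matrix gives the partition function Z(beta)
beta, dx = 2.0, 0.02
Z_trace = sum(rho_diag(i * dx, beta) for i in range(-400, 401)) * dx
Z_exact = 1.0 / (2.0 * math.sinh(beta / 2.0))   # closed form for E_n = n + 1/2
print(Z_trace, Z_exact)
```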

13:06

As discussed in previous weeks, the partition function Z is the sum of the probabilities π_n;

but here the index n no longer labels positions in space, but energy levels.

We next discuss the three fundamental properties of the density matrix:

First of all, each density matrix possesses the convolution property.

This means that the integral over x' of rho(x, x', beta_1)*rho(x', x'', beta_2) can be

written as an integral over x' of a double sum over n and m. This can be rearranged into

a double sum over n and m of the integral over x'. The orthogonality property that we

just discussed allows us to write this as Σ_n psi_n(x) e^(-(beta_1 + beta_2) E_n) psi_n*(x''):

in other words, the density matrix rho(x, x'', beta_1 + beta_2).

In this exact equation let us set beta_1 equal to beta_2. We find that the integral over

x' of rho(x, x', beta)*rho(x', x'', beta) is equal to the density matrix rho(x, x'', 2beta). Now

recall that beta = 1/temperature. So in this equation we compute the density matrix at

2beta, that means at low temperature, through a product of density matrices at high temperature.

We can use this equation if we know the density matrix at high temperature, to compute it

at twice lower temperature. Then we can use it again to compute it at 4 times lower temperature,

8 times lower temperature, and so on, and so on... until we reach the full quantum regime.

16:21

The second property concerns the free density matrix. We will derive this equation at the beginning

of this week's tutorial, and make sure that you understand the role of the non-diagonal

elements in this density matrix, and we will illustrate this in nice pictures of the entire

density matrix at high temperature; and lower, and lower temperatures. As beta becomes larger,

the variance of the Gaussian becomes larger, and the system becomes more and more quantum.
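To make this concrete, here is a sketch (our own code; the standard free density matrix with hbar = m = 1 is a Gaussian in x - x' whose variance grows with beta) checking numerically that the free density matrix satisfies the convolution property of the previous section:

```python
import math

def rho_free(xa, xb, beta):
    """Free-particle density matrix (standard result, hbar = m = 1):
    a Gaussian in xa - xb whose variance grows with beta."""
    return (math.exp(-(xa - xb) ** 2 / (2.0 * beta))
            / math.sqrt(2.0 * math.pi * beta))

# convolution check: int dx' rho(x, x', beta) rho(x', x'', beta)
# should equal rho(x, x'', 2 beta)
beta, x, xpp, dx = 0.5, 0.3, -0.7, 0.01
conv = sum(rho_free(x, i * dx, beta) * rho_free(i * dx, xpp, beta)
           for i in range(-1000, 1001)) * dx
print(conv, rho_free(x, xpp, 2.0 * beta))
```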

Finally, the third property of the density matrix concerns the high-temperature limit:

for a Hamiltonian H = H_free plus a potential V, the density matrix at small beta (high temperature)

is given by rho(x, x', beta) = e^(-beta/2 V(x)) * rho_free(x, x', beta) e^(-beta/2 V(x')).
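Putting the high-temperature formula and matrix squaring together, here is a sketch (our own discretization choices: grid [-5, 5] with step 0.1, harmonic potential V(x) = x^2/2, hbar = m = omega = 1) of numerical matrix squaring from beta = 0.1 down to low temperature:

```python
import numpy as np

dx = 0.1
x = np.arange(-5.0, 5.0 + dx / 2.0, dx)   # position grid

def rho_free(xa, xb, beta):
    """Free density matrix (hbar = m = 1)."""
    return (np.exp(-(xa - xb) ** 2 / (2.0 * beta))
            / np.sqrt(2.0 * np.pi * beta))

def rho_trotter(x, beta):
    """High-temperature (Trotter) approximation for V(x) = x^2 / 2."""
    V = x ** 2 / 2.0
    return (np.exp(-beta * V / 2.0)[:, None]
            * rho_free(x[:, None], x[None, :], beta)
            * np.exp(-beta * V / 2.0)[None, :])

beta = 0.1                           # start at high temperature
rho = rho_trotter(x, beta)
while beta < 4.0:                    # matrix squaring: beta -> 2 beta
    rho = dx * (rho @ rho)           # discretized convolution integral
    beta *= 2.0

Z = dx * np.trace(rho)               # partition function = trace
print(beta, Z, 1.0 / (2.0 * np.sinh(beta / 2.0)))   # compare with exact Z
```

For the harmonic oscillator the exact partition function is known, so the final trace can be compared directly against 1/(2 sinh(beta/2)).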

19:45

In matrix squaring, the subject of the last section, we convolved two density matrices

at temperature T to obtain a new density matrix at temperature T/2.

By iterating this process, we could go to lower and lower temperatures

starting from the high-temperature quasi-classical limit.

Normally, however, we cannot do this matrix squaring analytically. For a large

number of particles, we would soon run out of space to store a reasonable discretized approximation

of rho(x, x', beta) on the computer, so we cannot do the matrix squaring numerically here.

We now see how the Feynman path integral overcomes this problem, how it leads to the use of Monte-Carlo

methods and to the idea of path sampling. Instead of evaluating the convolution integrals

one after the other, as we did in matrix squaring, let us write them out all together.

So we write the density matrix rho(x, x', beta) = integral dx'' rho(x, x'', beta/2) rho(x'', x', beta/2).

Each of the density matrices at beta/2 can be written as an integral over two density matrices

at temperature beta/4. This gives an integral over dx'', dx''', and dx'''' of four density

matrices at temperature beta/4. Now, each of the density matrices at beta/4 can again be written

as a convolution of two density matrices at beta/8, and this would lead us to multiple

integrals over dx''''' dx'''''' dx''''''' and dx''''''''. The idea we are pursuing is

great, but we are heading into a notational nightmare...

Let us write {x0, x1, x2, x3 ...} instead of the cumbersome {x, x', x'', x'''...}.

This gives the density matrix rho(x0, xN, beta) = integral dx1 ... dx(N-1) of rho(x0, x1, beta/N) * rho(x1, x2, beta/N) * ... * rho(x(N-1), xN, beta/N).

For the partition function, which is the trace of the density matrix as we discussed before,

we find that Z(beta) = integral dx0 dx1 ... dx(N-1) of rho(x0, x1, beta/N) * ... * rho(x(N-1), x0, beta/N). The sequence x0, x1, ..., x(N-1) in these integrals is called

a "path", and we can imagine the variable xk to sit at position k beta/N along an imaginary-time

variable tau that goes from 0 to beta in little steps of Delta tau which is equal to beta/N. Density

matrices and partition functions can thus be expressed as multiple integrals over path

variables, so-called path integrals.
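Under the Trotter approximation, the weight of one such discretized path can be sketched as follows (our own function; V(x) = x^2/2 and hbar = m = omega = 1 are assumptions for the harmonic-oscillator case):

```python
import math

def path_weight(path, beta):
    """Trotter weight of a closed path x_0 ... x_{N-1} (with x_N = x_0),
    for the harmonic potential V(x) = x^2 / 2 (hbar = m = omega = 1)."""
    N = len(path)
    dtau = beta / N
    w = 1.0
    for k in range(N):
        x_k, x_next = path[k], path[(k + 1) % N]   # periodic in imaginary time
        w *= (math.exp(-(x_next - x_k) ** 2 / (2.0 * dtau))
              / math.sqrt(2.0 * math.pi * dtau))   # free-particle piece
        w *= math.exp(-dtau * x_k ** 2 / 2.0)      # potential piece
    return w

print(path_weight([0.0] * 8, 4.0))
```

The partition function is then the integral of this weight over all path variables, which is precisely what Monte Carlo sampling will estimate.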

In Markov-chain Monte-Carlo, we can move from one path configuration to the next by choosing

one position x_k and making a little displacement delta x that can be positive or negative. We compute

the weight after the move and before the move and accept this move with the Metropolis acceptance

probability. Note that we can also move x0, which sits between x(N-1) and x1, so that the

path can move as a whole.
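A naive sketch of such a simulation, in the spirit of naive_harmonic_path.py (parameter values and function names are our own; V(x) = x^2/2, hbar = m = omega = 1):

```python
import math, random

random.seed(42)
beta, N = 4.0, 16                  # inverse temperature, number of time slices
dtau = beta / N
delta = 1.0                        # maximum displacement of a single move
x = [0.0] * N                      # initial path: all slices at the origin
samples = []

def local_weight(xk, x_left, x_right):
    """Weight of slice value xk between its imaginary-time neighbours,
    for the harmonic potential V(x) = x^2 / 2."""
    return (math.exp(-(x_left - xk) ** 2 / (2.0 * dtau))
            * math.exp(-(xk - x_right) ** 2 / (2.0 * dtau))
            * math.exp(-dtau * xk ** 2 / 2.0))

for step in range(200000):
    k = random.randint(0, N - 1)               # pick a random slice (x_0 included)
    km, kp = (k - 1) % N, (k + 1) % N          # periodic neighbours in tau
    x_new = x[k] + random.uniform(-delta, delta)
    ratio = (local_weight(x_new, x[km], x[kp])
             / local_weight(x[k], x[km], x[kp]))
    if random.random() < ratio:                # Metropolis acceptance
        x[k] = x_new
    samples.append(x[0])   # histogram of x_0 approximates rho(x, x, beta) / Z
```

Histogramming the collected samples reproduces, up to normalization, the diagonal density matrix discussed next.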

Configurations of a Markov chain simulation for the Harmonic oscillator are shown here.

The histogram of the x-positions in this simulation gives the probability for the particle

to be at position x, or in other words, up to normalization, the diagonal density matrix

rho(x, x, beta).

In Python this gives the program naive_harmonic_path.py, which I ask you to download and run from

the Coursera website. You will modify this program in this week's homework where you

will do your own Markov-chain Monte Carlo simulation of a Quantum system, or a Path-Integral

Monte-Carlo simulation.

In conclusion, in this session of Statistical Mechanics: Algorithms and Computations we have plunged

into the world of quantum physics and quantum statistical mechanics.

What I have shown you, the case of the harmonic oscillator, can be greatly generalized, as

we will see in the coming weeks.