It's time for us to go on, and we are going to examine the name of the course: Introduction to Quantum Computing. I believe there is no problem with understanding the words "introduction" and "to," but the words "quantum computing" may be unclear. So let's explain them, and we start with computation, or computer. How would you define this term? If you reflect a bit on this notion, you may find it difficult to define. What does it mean to compute? It is among those words we take for granted, but if a child asks us to define it, we have problems with that. Actually, with children it is easier: instead of an explanation or a definition, we can provide some examples. But here that doesn't work, because it is hard to generalize all these examples into some short and simple definition like this one: a computation is a physical process, finite in time, with a fixed set of distinguishable states. Now, you understand that there are many other definitions of this term. Why do I like this one the most, and why did I choose it for this course? First, because it emphasizes the notion of a physical process. We often forget that the things that compute usually rattle, click, and make all sorts of noise. And not just usually: they always consume, or transform if you want, energy. Now, about this notion of states: there are many processes in the world which rattle and consume energy, but if they don't have a set of distinguishable states which we can identify, then we don't call them computers. It is often said that computations transform or process information. To agree with that, we have to think again and define this word, information. I choose to connect this definition with that of a computational process: since a computational process has a fixed set of distinguishable states, I take the current state of the system to be the information it represents.
I choose the word "interpretation" here for a reason. You see on this slide two different systems, both of which can store some information. In some cases we can say that these two systems store the same information. But how can we do that? These are very different systems with very different sets of states. How can we say that they store the same information? To do that, we have to invent some conventions about how we interpret these systems' states, how we read information from them. And for that, we first have to define the notion of quantity of information, because, for example, we might need to establish that one system can store the information currently stored by another system. We may already feel that this notion of quantity is somehow connected with the number of states available to the system. The first mathematical description of the quantity of information was introduced in 1948 by the American mathematician and cryptanalyst Claude Shannon. The amount of information that can be stored by a system with N states was defined by his famous formula, Shannon's formula: H = -Σᵢ pᵢ·log_b(pᵢ). It is the sum, over all states, of the probability of each state multiplied by the logarithm base b of this probability. We want this value to be positive, and this is why there is a minus here: all the logarithms are negative, since every probability is at most one. Now, the probabilities come from physics: this formula was introduced by Boltzmann to define the entropy of a system. Physicists are interested in a much more general set of physical processes, but for computational processes, at least those designed by humans, all these probabilities are usually equal to each other. So for any i and j, the probability of state i equals the probability of state j, and every probability equals one divided by the number of states, N. In this case, the case of equiprobable states, the formula simplifies.
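Shannon's formula is easy to check numerically. Here is a small Python sketch (not part of the lecture materials; the function name is just for illustration) that computes the entropy of a system from its state probabilities:

```python
import math

def shannon_entropy(probs, base=2):
    """Amount of information of a system whose states occur with the
    given probabilities, in units determined by the logarithm base."""
    # The minus sign makes the result positive: each logarithm is
    # negative, because every probability is at most one.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Two equiprobable states (a fair coin) store exactly one bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# Unequal probabilities store less than one bit.
print(shannon_entropy([0.9, 0.1]))   # about 0.469
```

Note how skewing the probabilities away from the equiprobable case only decreases the amount of information the system can hold.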
Instead of pᵢ we just write 1/N, and the logarithm base b of 1/N; we can invert the argument of the logarithm, which removes the minus. Now we have a sum of N equal elements, each equal to (1/N)·log_b(N), so the whole sum is just log_b(N). So the amount of information stored by a system with N equiprobable states is simply the logarithm base b of the number of states. Now, let's take a closer look at this base b. It is actually a matter of convention, and the smallest natural base for the logarithm is two. If I take a system with only one state, log₂(1) is zero, so a system with only one state stores no information. The next number is two, and log₂(2) equals one. This is the smallest amount of information that can be stored by a physical system. Claude Shannon named this smallest amount of information, measured with the smallest logarithm base, a bit: a small piece of something, in English. Now we can measure information in bits. Imagine a system which stores n bits. That means this system has 2ⁿ states, which is the same as the number of integers that can be represented by n binary digits. So we can enumerate the states: we can assign each state its number. If we have two systems, each of which can store n bits, and the states of these systems are enumerated like this, then we can map the states of one system onto the states of the other. We can now say that these different systems store the same information. Let's consider an example. The system depicted here was never implemented physically; it was designed as a thought experiment by the Hungarian physicist Leo Szilard in 1929. The system is like this: it is an empty chamber with a removable partition here, which divides the chamber into two parts. There is one particle of gas in the chamber, at temperature T.
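The equiprobable case and the enumeration of states can also be sketched in a few lines of Python (again just an illustration, not part of the course materials):

```python
import math

def bits_stored(n_states):
    """Information stored by a system with n equiprobable states:
    the logarithm base two of the number of states."""
    return math.log2(n_states)

print(bits_stored(1))   # 0.0 -- one state stores no information
print(bits_stored(2))   # 1.0 -- the smallest positive amount: one bit
print(bits_stored(8))   # 3.0 -- eight states hold three bits

# A system storing n bits has 2**n states, which we can enumerate
# exactly like integers written with n binary digits.
n = 3
states = [format(i, f"0{n}b") for i in range(2 ** n)]
print(states)
# ['000', '001', '010', '011', '100', '101', '110', '111']
```

Two systems with the same number of states, enumerated this way, can then be mapped state-to-state, which is what lets us say they store the same information.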
On either side of the chamber there are pistons which can be moved inside the chamber, but they can't be pulled out because of these stoppers here. We will distinguish two states of this system. If the particle stays somewhere here, in the left part of the chamber, we'll call this state zero. If the particle stays in the right part of the chamber, we'll call this state one. So this system can store one bit of information. Okay. When the partition is removed, it is not actually a computational system, a computational process, because we can't distinguish these two states: we can't say in which part of the chamber we will now find the particle. But we can do this: we can assign this system a particular value. To do that, we have to remove this partition if it's not yet removed, and then compress the gas. For example, say we want the system to represent the value zero. We fix the left piston and move the right piston to the left. So we compress our one-particle gas, and we stop when the right piston reaches the middle of the chamber, and we replace the movable partition. Then we can move our right piston back. Now we are sure that the particle stays here, in the left part of the chamber, and this process took some effort from us: the Boltzmann constant times the temperature of the gas times the logarithm of the ratio of volumes, W = k_B·T·ln(V_initial/V_final). We decreased the volume by a factor of two, so this ratio equals two, and the work is k_B·T·ln 2. This is how much work we need to assign a value, and this is a very interesting fact: to assign a value, we have to dissipate some energy. Let's do something else. We now have this bit equal to zero; let's apply the NOT gate, which maps zero to one. How can we do this? We start in this state. We can move the right piston here, up to the partition, and this doesn't take any effort, because we don't compress the gas: there is no gas on the right side.
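It is worth plugging numbers into W = k_B·T·ln 2 to see how tiny this cost is. A quick sketch, assuming room temperature (the lecture leaves T unspecified, so T = 300 K is my assumption for illustration):

```python
import math

# Work needed to compress the one-particle gas to half its volume:
# W = k_B * T * ln(V_initial / V_final) = k_B * T * ln 2.
k_B = 1.380649e-23   # Boltzmann constant, J/K (exact SI value)
T = 300.0            # temperature in kelvin (assumed, for illustration)

work = k_B * T * math.log(2)
print(f"{work:.3e} J")   # about 2.87e-21 joules
```

About 3 zeptojoules per bit, which is the same energy scale as Landauer's limit for erasing one bit.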
Then we remove the partition, and then we move both pistons to the right, like this. The volume of the gas does not change; we simply relocate this volume to the right part of the chamber. We place the partition again and then move the left piston back. Well, we have applied the gate NOT, and we are now pretty sure that the particle stays in the right part of the chamber, so this system stores the bit one. And we did not dissipate any energy; we did not do any work. So our NOT gate, a reversible gate, doesn't need any energy. Now, assume that we don't need to store this bit anymore. What can we do? Just like before, we can simply remove the partition and let the particle fly all around the chamber. But we can do something smarter. We can put this piston back here, again with no work, then remove the partition and let the particle push the piston back. This compressed gas will give us that amount of energy back, and this operation is called erase. Now, what does it mean? It means that if we know in which part of the chamber the particle now is, we can transform this knowledge into energy. So information, in some sense, equals energy. Imagine that we didn't have this partition, but at each point in time we could find out where the particle is. Then we could place the partition and move the piston of our choice, and so extract some work, some energy, from this system. So information, in some sense, equals energy, and this is why it's often said that information is power. Bad news for Plato and Socrates, though.
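The energy bookkeeping of the whole cycle can be sketched as follows (a toy calculation under the same assumed T = 300 K; the function name is my own, not from the lecture):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed room temperature, kelvin

def work_on_gas(v_before, v_after):
    """Work done ON the one-particle gas when its volume changes from
    v_before to v_after; negative means the gas does work for us."""
    return k_B * T * math.log(v_before / v_after)

assign = work_on_gas(1.0, 0.5)   # compress to half: costs k_B*T*ln 2
gate_not = 0.0                   # moving the full volume around: free
erase = work_on_gas(0.5, 1.0)    # let the gas expand: energy returned

print(assign + gate_not + erase)   # 0.0 -- the cycle balances out
```

The assign step dissipates k_B·T·ln 2, the reversible NOT costs nothing, and the erase step hands exactly that energy back, which is the sense in which knowing the particle's position is worth energy.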