Well, like I said, we call these systems complex because they come about from interacting elements — those can be people, they can be molecules, they can be cars, they can be whatever you like. And we need a name for those elements. Not elements, not molecules, not people: we call them agents, for lack of a better word. So, complexity comes about from agents interacting with each other and reacting to their environment. But one of the critical things you might have noticed — for instance in the example of the cars driving around, or in the case of your immune system (even though you might think that you're in charge of your own body, you are not) — is that there is no leader, there is no manager, in a complex system, in the complexity world. Some consider that to be a very good thing. So, there is no manager. And yet the end result of all those interactions leads to structures that actually do something. That actually have a role, that actually have a function. Like your immune system, which keeps you alive. And here's an example of a termite mound, and if you look inside it, you will find places where the eggs are, places with ventilation channels, places for food storage — really, it's a city. It's very complicated. But there's nobody there saying to each one: you go left, and you go right, and you do this, and you do that. Somehow, it all self-organizes. So that's the second term. The first was emergence, and the second is self-organization. These aspects will come back time and again when we study complex systems, and that's also why we cover them in the lectures for you. And so, like I said, we find these complex systems everywhere. Here is a set of examples — by now you've probably got the idea. We find complexity in pandemics — at the end of this hour I will give you an example of that. We find it in the immune system, in ecosystems, in human societies, and in cities.
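To get a feeling for how local rules with no manager can still build structure, here is a little simulation sketch. It is only a toy — the classic "termites and wood chips" thought experiment, not real termite biology — and every grid size, count, and rule detail below is invented for illustration. Each termite knows just two rules: pick up a chip if you walk onto one empty-handed, and drop your chip next to another chip. Nobody coordinates, yet chips end up in fewer, larger piles:

```python
import random

def simulate_termites(width=30, height=30, n_chips=150, n_termites=20,
                      steps=20000, seed=42):
    """Random-walking 'termites' rearrange wood chips with two local rules.
    Returns the final set of chip positions on a toroidal grid."""
    rng = random.Random(seed)
    cells = [(x, y) for x in range(width) for y in range(height)]
    chips = set(rng.sample(cells, n_chips))
    termites = [[rng.choice(cells), None] for _ in range(n_termites)]  # [pos, carrying]

    def neighbors(pos):
        x, y = pos
        return [((x + dx) % width, (y + dy) % height)
                for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]

    for _ in range(steps):
        t = rng.choice(termites)
        t[0] = rng.choice(neighbors(t[0]))      # one random-walk step
        pos, carrying = t
        if carrying is None and pos in chips:
            chips.discard(pos)                  # rule 1: pick up a chip you find
            t[1] = True
        elif carrying and pos in chips:
            # rule 2: drop the carried chip on a free cell next to this pile
            free = [n for n in neighbors(pos) if n not in chips]
            if free:
                chips.add(rng.choice(free))
                t[1] = None
    # set down anything still being carried, so chips are conserved
    for t in termites:
        if t[1]:
            t[1] = None
            chips.add(rng.choice([c for c in cells if c not in chips]))
    return chips

def count_piles(chips, width=30, height=30):
    """Number of 4-connected clusters of chips (a crude order measure)."""
    seen, piles = set(), 0
    for start in chips:
        if start in seen:
            continue
        piles += 1
        stack = [start]
        while stack:
            x, y = stack.pop()
            if (x, y) in seen:
                continue
            seen.add((x, y))
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                n = ((x + dx) % width, (y + dy) % height)
                if n in chips:
                    stack.append(n)
    return piles
```

Running `count_piles(simulate_termites())` and comparing it with the pile count of the initial random scattering typically shows far fewer, larger piles after the walk — order appearing with no controller anywhere in the rules.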
Geoffrey West will speak about that — and in economies; Brian Arthur will speak about that — and in markets; we find them basically everywhere. And that's the reason why we really want to understand them. Because they are somehow everywhere, either we will never find a good description of them, or at least we can try — and if we find something, we might be able to predict how these things evolve. So, one slide on the historical perspective. This whole field of complexity science is to some extent new — and to some extent it isn't, as I'll show you later. But it's relatively new, so it's good to give you a feeling for where it all comes from. You could say that the 17th, 18th, and 19th centuries were the time when people tried to solve, let's say, problems of simplicity. Of course they were complicated problems, but they were not complex. They were complicated in many senses — think of Newtonian mechanics, for instance — but the number of variables they were looking at was relatively small. And because of that, we call those problems of simplicity. Then, from about 1800 onward, we got a whole new way of thinking, in terms of disordered systems, driven particularly by Boltzmann and Gibbs. There, suddenly, we talked about systems of many, many particles — one mole is about 10 to the power 23 molecules. There we have disorganized, almost chaotic systems, and we start to reason about them in a statistical way. Then, around 1948, Warren Weaver said that what we actually need to study is organized complexity: somehow, things are trying to organize themselves, or are being organized. And in '72, Philip Anderson said that the way we want to look at these things is this: once things start to organize themselves, the outcome is bigger than just the sum of its elements.
If I look at the individual behavior of the elements, I will not be able to predict the total system. So the whole is more, but also different, from the sum of its parts. That's a famous quote you will come across time and time again, and I thought I'd just repeat it here: 1972. Then in 1984, a group of Nobel Prize winners and other people set up the Santa Fe Institute for complexity. And then there is Stephen Hawking, who in 2002 said, "I think the 21st century, in terms of science and in terms of understanding what is happening to us, is going to be the century of complexity." So this is, on one slide if you like, the whole way this thinking started and what has been happening over the last decades. Now, Sue-Ann, when he wrote this "Complexity 101" — you have a copy of it somewhere in your texts — said: okay, now we understand a bit what complexity is, but why would we care? There's a piece of text called "What are complex systems and why should we care?" Read it; it's really well done. So, why should we care? Because they are everywhere. They are in nature, they are in human systems. So if you care about our world and want to understand our world, we should care about complex systems. And being trained as a physicist, you somehow always get back to certain fundamental concepts, and one of the things that is weird about complex systems is that they seem to defy the second law of thermodynamics. Seem to defy — the second law of thermodynamics says that in the end, everything will be chaos. The arrow of time points in the direction of chaos, if you like. And that's a very fundamental law, because if it were not true, we would have a perpetuum mobile: we could build machines that run forever. And unfortunately we cannot.
But it seems that some of these systems actually kind of defy the second law of thermodynamics, and here I come back to this reaction on the screen — I'll give you the details of the reaction in a second. While I'm talking, you see those patterns changing all the time. You see there is order; it creates order. It doesn't dissipate away into some milky structure. There is order, constantly changing, while I'm talking. An amazing thing is happening there. Like I said, the first time people saw this — we're talking about the 1950s — they just couldn't believe it was happening. It could not be happening. And so Belousov, the guy who tried to write this down, couldn't publish it anywhere; there was no way he could. In the end he published it in a journal that was not peer reviewed, and so nobody considered it to be a real result. Later on, Zhabotinsky managed to convince some editors that this was serious stuff, and so it became known as the Belousov-Zhabotinsky reaction. Okay. So, why should we care? Engineering also requires complexity thinking — take the example of the car. You really want to take into account not just the mechanics of the car, but also the individual in the car: how they act and react with the thing they are actually driving. So when we talk about cars, we also have to talk about the social system, the social world, the behavioral world in which they live. In policy we find these things too, so that's another reason why you want to understand it a bit. And all of this — why should we care — was summarized by Heinz Pagels, who said, "The great unexplored frontier is complexity. I am convinced that the nations and people that master the new science of complexity will become the economic, cultural, and political superpowers of the next century."
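To get a feeling for how a chemical system can keep producing order without actually violating the second law (the dish is an open system, burning chemical free energy), here is a toy numerical sketch. The standard reduced model of the BZ kinetics is the Oregonator, but it is numerically stiff; this sketch instead uses the Brusselator, a simpler idealized autocatalytic scheme that oscillates in the same spirit. All parameter values are illustrative, not fitted to the real reaction:

```python
def brusselator_spikes(a=1.0, b=3.0, t_end=50.0, dt=1e-3):
    """Brusselator: an idealized autocatalytic reaction scheme,
        A -> X,  2X + Y -> 3X,  B + X -> Y + D,  X -> E.
    For b > 1 + a**2 the steady state is unstable and the concentrations
    x, y oscillate indefinitely, as long as A and B keep being fed in.
    Integrates with forward Euler and counts the concentration spikes."""
    x, y = a + 0.5, b / a            # start off the (unstable) steady state
    spikes, above = 0, False
    for _ in range(int(t_end / dt)):
        dx = a - (b + 1.0) * x + x * x * y
        dy = b * x - x * x * y
        x += dt * dx
        y += dt * dy
        if x > 2.0 and not above:    # x shoots up once per cycle
            spikes += 1
        above = x > 2.0
    return spikes
```

With `a=1, b=3` the instability condition `b > 1 + a**2` holds, so `brusselator_spikes()` counts several spikes over the run: sustained oscillation, not decay to a milky equilibrium — which is exactly what shocked Belousov's reviewers.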
Sounds pretty good. I'm not sure it's true, but it at least sounds good. It gives you the kind of feeling: wow, I want to be part of this. And that's all positive news. But there is also negative news, and I want to share at least one example with you. Some 10 years ago, the stock price of United Airlines suddenly dropped dramatically — from one second to the next it dropped dramatically. And nobody understood what was happening. Later on they found out what had happened, and it was the following. Nowadays, a lot of the trading on the stock market is done by algorithms, not by people anymore. By agents, if you like — let's not call them molecules or people; we call them agents, and in this case these agents are algorithms. Little bits and pieces of programs. And those programs interact with each other, and they interact with their environment. What they do is look around — they check things, just like you and I do. They check the newspapers, and when they see something there, they make a decision, based on that, whether to buy or sell certain stocks. They make the decision, nowadays. So, what happened in this case: one algorithm, doing text mining, discovered a piece of text describing that United Airlines was in a very bad situation. So it started to sell the stock. Another algorithm saw that happening and said, "What's happening there? That's not good." It looked around, found the same article, and said, "Okay, sell." And then another, and another — you get a whole avalanche, and within a couple of microseconds the whole thing dips. This has happened a couple of times in recent history: moments where we are not in control anymore. Like I said, complex systems don't have a controller — and these algorithms don't have a controller either.
They just go on, they just do their thing.
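The avalanche in that story can be caricatured with a Granovetter-style threshold model. This is a toy sketch, not the actual trading algorithms — the agent count and thresholds are invented for illustration. Each algorithm sells once the fraction of others already selling reaches its private threshold; one algorithm sells on the (stale) news item, and the rest follow:

```python
def cascade(thresholds):
    """Run the sell-off to a fixed point. Each algorithm sells as soon as
    the fraction of algorithms already selling reaches its own threshold
    (threshold 0.0 = sells on the news item itself).
    Returns how many end up selling."""
    n = len(thresholds)
    selling = [t <= 0.0 for t in thresholds]
    changed = True
    while changed:
        changed = False
        frac = sum(selling) / n
        for i, t in enumerate(thresholds):
            if not selling[i] and t <= frac:
                selling[i] = True
                changed = True
    return sum(selling)

n = 100
# Thresholds 0.00, 0.01, 0.02, ...: each new seller tips the next one.
uniform = [i / n for i in range(n)]
# Make one single agent slightly more cautious and the avalanche dies out.
perturbed = list(uniform)
perturbed[1] = 2 / n
```

Here `cascade(uniform)` returns 100 — everyone sells — while `cascade(perturbed)` returns 1: changing one agent's threshold by one percentage point is the difference between a flash crash and a non-event. That's the uncomfortable part: no controller anywhere, and outcomes this sensitive to invisible details.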