[MUSIC] >> Hi, I'm [FOREIGN] from Sungkyunkwan University. In the next three videos we will explore the Naive Bayes algorithm. This is the first video, and I want to talk about probability theory. Today's content covers probability and conditional probability.

First is probability. It gives us a firm mathematical foundation, and probability is used widely in AI approaches such as the naive Bayes model, Bayesian networks, hidden Markov models, and so on. Here are the basic properties of probability. First, if event A is contained in event B and event B is contained in the whole sample space S, then P(A) <= P(B), and both probabilities are between 0 and 1. Second, if events A and B belong to the whole sample space, then P(A, B) + P(A, not B) = P(A). Third, if A and B belong to the whole sample space, then P(A or B) = P(A) + P(B) - P(A, B). These three are the basic properties of probability.

The next concept is marginalization. If every H_i belongs to S, the H_i are mutually exclusive (H_i ∩ H_j = ∅ for i ≠ j), and the union H_1 ∪ H_2 ∪ ... ∪ H_n equals the whole sample space S, then P(A) = P(A, H_1) + P(A, H_2) + ... + P(A, H_n). This concept is very important for probability calculations. To get P(A) from the figure, we could count the number of outcomes in A over the number of outcomes in S; instead of calculating that directly, we can simply add up all of these joint probabilities.

The next concept is conditional probability. It is very similar to probability, with just a small difference. The definition of conditional probability is P(A | B) = P(A, B) / P(B), as shown in the formula and the figure. Here is a simple example of conditional probability. Event A is when we roll two dice at the same time and their sum makes 7, for example (1, 6), (2, 5), (3, 4), (4, 3), (5, 2), and (6, 1), so P(A) = 6/36. For event B, we again roll the two dice and the first die shows an even number, which gives 18 cases, and conditioning on B gives P(A | B) = 3/18. So this is how we can think about conditional probability, and the figure illustrates it; a short sketch after this part checks these numbers in code.

We can also check the properties of conditional probability. First, if the intersection of A and C is contained in the intersection of B and C, then P(A | C) <= P(B | C), and both are between zero and one. Second, if events A and B belong to the whole sample space, then P(A, B | C) + P(A, not B | C) = P(A | C). The third one: if events A and B belong to the whole sample space, then P(A or B | C) = P(A | C) + P(B | C) - P(A, B | C). If we remove the conditioning part, these are exactly the same properties as before. So we can extend this idea to marginalization as well: we have already seen the marginalization property for plain probability, and the same concept carries over to conditional probability, giving P(A | C) = P(A, H_1 | C) + ... + P(A, H_n | C). As I said, marginalization is how we calculate the whole probability.
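To make the dice example and marginalization concrete, here is a minimal Python sketch (not from the lecture itself; the partition of the sample space by the first die's value is my own choice for illustration). It enumerates the 36 outcomes, checks P(A) = 6/36 and P(A | B) = 3/18, and verifies that summing the joint probabilities P(A, H_i) recovers P(A).

```python
from itertools import product

# Sample space S: all 36 equally likely outcomes of rolling two dice.
S = list(product(range(1, 7), repeat=2))

def prob(event):
    # P(event) = |event| / |S| for equally likely outcomes.
    return len(event) / len(S)

# Event A: the two dice sum to 7 -> (1,6), (2,5), ..., (6,1).
A = [o for o in S if o[0] + o[1] == 7]
print(prob(A))                      # 6/36 = 0.1666...

# Event B: the first die shows an even number -> 18 cases.
B = [o for o in S if o[0] % 2 == 0]

# Conditional probability: P(A | B) = P(A, B) / P(B).
A_and_B = [o for o in A if o in B]  # {(2,5), (4,3), (6,1)}
print(prob(A_and_B) / prob(B))      # 3/18 = 0.1666...

# Marginalization: the events H_i = "first die shows i" are mutually
# exclusive and their union is S, so P(A) = sum_i P(A, H_i).
total = 0.0
for i in range(1, 7):
    H_i = [o for o in S if o[0] == i]
    total += prob([o for o in A if o in H_i])
print(total)                        # 0.1666..., same as P(A)
```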
This concept shows splitting the space into sections and then summing over them, and we can extend the same idea to the conditional part. There is one more important thing: chaining. If we want to calculate P(A, B, C), we can write P(A, B, C) = P(A | B, C) * P(B | C) * P(C). This is the chaining rule, and we can expand it with similar formulas; those are variations of the chaining rule. With this rule, we can manipulate all of our probability calculations. Here is another variation of the chaining rule: if we want to calculate P(A | B, C), it equals P(A, B | C) / P(B | C), and we can expand our formulas in the same way; a small sketch after the closing checks the chaining rule numerically.

In this video, we have seen the basic properties of probability and conditional probability, and I have talked about marginalization and chaining with conditional probability. These two concepts are very useful for the probability calculations in the next algorithms. Thank you. [MUSIC]
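As a quick check on the chaining rule, here is a minimal Python sketch in the same two-dice setting (the three events A, B, and C are hypothetical choices of mine, not from the lecture). It verifies that P(A, B, C) = P(A | B, C) * P(B | C) * P(C).

```python
from itertools import product

# Same two-dice sample space as in the earlier sketch.
S = list(product(range(1, 7), repeat=2))

def prob(event):
    return len(event) / len(S)

def cond(event, given):
    # P(event | given) = P(event, given) / P(given).
    return prob([o for o in event if o in given]) / prob(given)

# Hypothetical events, chosen only to illustrate the chaining rule:
A = [o for o in S if o[0] + o[1] == 7]  # sum is 7
B = [o for o in S if o[0] % 2 == 0]     # first die is even
C = [o for o in S if o[1] >= 4]         # second die is 4 or more

# Chain rule: P(A, B, C) = P(A | B, C) * P(B | C) * P(C).
B_and_C = [o for o in B if o in C]
A_B_C = [o for o in A if o in B_and_C]

lhs = prob(A_B_C)                              # 1/36
rhs = cond(A, B_and_C) * cond(B, C) * prob(C)  # (1/9) * (1/2) * (1/2)
print(lhs, rhs)  # both print 0.02777..., so the two sides agree
```

Note that the `cond` helper is itself the other variation of the rule, P(A | B, C) = P(A, B | C) / P(B | C), applied with the conditioning event B ∩ C.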