Hi everyone, welcome back to Operations Research. Today is our week seven, and now we want to talk about some applications. But when we say applications today, we're not going to solve transportation problems, pricing problems, investment problems, or production problems. We don't want to solve specific problems in a specific field. What we want to talk about today are applications for model building, and here the models are abstract and theoretical, so that they may be applied to all kinds of problems. Today the applications are mainly about nonlinear programming, because according to my personal experience, if you learn the simplex method, you know what it is; if you learn branch and bound, you know what it is. LP and IP are also difficult, but you know where the applications are. For nonlinear programming, once you learn about gradients, once you learn the KKT conditions, you need some more exposure to applications, so that you really know how they may be used and what the good parts of them are. Even very simple unconstrained convex analysis requires some exposure to applications, so that you really know why it is needed. Okay, so today we talk about two things that everybody wants to learn. The first thing is linear regression. For linear regression, I guess everybody knows what that is, because we all learned it in high school. You have a few points, something like this. For example, these are the heights of a lot of persons, and these are their weights. Typically, taller persons have larger weights. So if we want to find some kind of relationship between them, we all know that we may draw a regression line, and the regression line somehow minimizes the squared error. So we do some kind of least-squares approximation to find this regression line.
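As a small sketch of the height-and-weight example just described (the numbers below are made up purely for illustration, not real data), the least-squares regression line can be computed directly with NumPy:

```python
import numpy as np

# Hypothetical height (cm) / weight (kg) data, invented for illustration.
height = np.array([150, 160, 165, 170, 180, 185], dtype=float)
weight = np.array([48, 55, 60, 65, 75, 82], dtype=float)

# Fit the regression line y = a*x + b by least squares.
a, b = np.polyfit(height, weight, deg=1)

# The fitted line minimizes the total squared error sum((y - (a*x + b))**2).
residuals = weight - (a * height + b)
print(round(a, 3), round(b, 2), round(np.sum(residuals**2), 2))
```

As expected from the lecture's observation, the fitted slope comes out positive: taller persons are predicted to have larger weights.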
So for this particular part, we're going to define the regression line, which is the line that minimizes the squared error. We're going to tell you what that is, then show you how the slope and the intercept may be determined by solving a nonlinear program, and then, of course, generalize it to n dimensions. Okay, so that's linear regression. For linear regression, or regression in general, there is a property saying that y, the dependent variable, is continuous. For the other case, we're going to talk about classification. For a classification problem, again you have a lot of attributes, but now what you want to predict, what you want to determine, is discrete. For example, there are all kinds of customers coming into your store, but only some of them buy your products. You want to find the features that may help you predict who is going to buy your product and who is not going to buy your product. That's classification. We're going to talk about one very specific model for classification called the Support Vector Machine. The reasons to talk about the Support Vector Machine: first, it's important; second, it's classic; third, it's widely used; fourth, it requires nonlinear programming. It is a very specific application of constrained optimization in nonlinear programming. Pretty much it says that you have some points here. Okay, so when I mark a point as a circle, that means a consumer who bought your product; when I mark a cross, that means a customer who did not buy your product. They may be described by their ages, by their incomes, and so on. So somehow you have attributes, and your points are located inside your attribute space. What we want to do in classification is to find a line so that we may separate these points: separate those persons who purchase your product from those persons who don't purchase your product. So we want to find that line and try to minimize the classification error.
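Before moving on to classification, here is a sketch of the earlier claim that the slope and intercept may be determined by solving a nonlinear program: instead of using a closed formula, we hand the squared error directly to a generic solver (SciPy is assumed available; the data is the same made-up illustration as before):

```python
import numpy as np
from scipy.optimize import minimize

# Made-up height/weight data, purely illustrative.
x = np.array([150, 160, 165, 170, 180, 185], dtype=float)
y = np.array([48, 55, 60, 65, 75, 82], dtype=float)

def squared_error(params):
    # Objective of the (unconstrained, convex) nonlinear program:
    # total squared error of the line y = a*x + b.
    a, b = params
    return np.sum((y - (a * x + b)) ** 2)

def gradient(params):
    # Analytic gradient, as one would derive in the lectures on gradients.
    a, b = params
    r = y - (a * x + b)
    return np.array([-2.0 * np.sum(r * x), -2.0 * np.sum(r)])

# Minimize over the slope a and the intercept b.
result = minimize(squared_error, x0=[0.0, 0.0], jac=gradient)
```

The minimizer `result.x` matches what the least-squares formula gives, which is exactly the point of the lecture: regression is an unconstrained convex optimization problem.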
Or, somehow, try to maximize the distance between these two parts. We're going to give you a specific and precise definition of what I just mentioned, but you may already see that we are maximizing the separation or minimizing some kind of classification error. If that's the case, that's an optimization problem, right? We're going to show you how this problem may be formulated as a constrained optimization problem in nonlinear programming, and how the KKT conditions and Lagrangian techniques may somehow be used to deal with these kinds of problems. So these will definitely be fascinating applications. Hopefully this does not just introduce these ideas to you, but also helps you review and understand the theories more.
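To preview the idea of maximizing the separation, here is a minimal sketch of the hard-margin SVM primal problem on a tiny invented 2-D data set (labels +1 for "bought", -1 for "did not buy"): minimize (1/2)||w||^2 subject to y_i (w . x_i + b) >= 1 for every point. A generic constrained solver (SciPy's SLSQP, assumed available) stands in for the specialized methods the lecture will develop:

```python
import numpy as np
from scipy.optimize import minimize

# Tiny made-up 2-D data set: +1 = bought the product, -1 = did not.
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

def objective(v):
    # v = (w1, w2, b); minimizing ||w||^2 / 2 maximizes the margin 1/||w||.
    w = v[:2]
    return 0.5 * np.dot(w, w)

# One inequality constraint per point: y_i * (w . x_i + b) - 1 >= 0.
constraints = [
    {"type": "ineq", "fun": (lambda v, i=i: y[i] * (np.dot(v[:2], X[i]) + v[2]) - 1.0)}
    for i in range(len(y))
]

result = minimize(objective, x0=np.zeros(3), constraints=constraints, method="SLSQP")
w, b = result.x[:2], result.x[2]
```

The optimal `w` and `b` define the separating line w . x + b = 0 that keeps every circle on one side, every cross on the other, and the gap between the two classes as wide as possible; the KKT conditions of exactly this program are what the later lectures will analyze.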