Okay, discrete optimization, mixed integer programming, branch and cut,

second lecture. Okay?

So, this is the real stuff now. Okay?

So one of the things that we saw in the last lecture is that we can generate these polyhedral cuts. Okay? But one other thing that we did was basically adding them all inside the formulation, you know, at the beginning.

That's not exactly how branch and cut works, so what I want to do now is take this concept and then tell you what a branch and cut algorithm is really doing. Okay?

So we're going to see two things: cover cuts, okay, and you'll see why we talk about them, they're very, you know, very frequent in mixed integer programs; and then we move to the TSP, which is probably one of the biggest success stories of optimization. We can solve really, really large TSPs optimally at this point. Okay, so cover cuts first.

For cover cuts we are looking at constraints which are inequalities. Okay, linear inequalities: a sum of coefficients times variables which has to be smaller than b. The simplest kind of stuff that you find all the time inside your models. Okay? So a sum, you know, of coefficients aj, each multiplied by a binary variable xj in {0, 1}, which has to be smaller than or equal to b, okay?

So, you know, can we find facets for these constraints, okay? Can we find facets of the polytope that arises from these constraints, okay?

And the key concept here, you know, the title is cover cut, right, is the concept of a cover. And essentially what you need to think about is that a cover is going to be a set of variables such that if you set them all to 1, you violate the constraint, okay. It's basically telling you that these variables cannot all be 1. Okay?

And so basically what you do is you look at a subset of these variables. Okay? You look at the coefficients, and if you sum all these coefficients, they are greater than b. And therefore you know that if you select all these variables, put them to 1, you're going to violate the constraint. Okay? So that's the concept of a cover. Basically, these variables cannot all be selected at the same time. Okay?

And so a cover is going to be minimal, okay, so that's kind of the opposite of maximal: it's minimal if, when you remove any one of these variables, okay, it's not a cover anymore, okay? It's like the strongest possible cover, okay? So when you look at these covers now, we have a particular constraint there, we take a cover, and I'm going to show you an inequality which is valid.

Okay? And so what we do is basically that's

what I told you before. Right?

If we select all these variables at the same time.

Right? We can evaluate the constraints.

So the constraint is basically saying: at least one of them has to be zero, or the sum of all these guys has to be smaller than the size of this cover minus one. You can't select all the variables.

This is what this is telling you. You can't select all the variables.

So in a sense the sum of them is to be smaller than the size of the set minus

one, or at least one of them has to be zero.

We can think about it more formally, and that's what we're going to do anyway.

Okay? So these are the cover cuts, okay?

So if you look at this constraint here, this is a big constraint, right? Seven variables, with coefficients, you know, 11, 6, 6, 5, 5, 4, 1, okay? And the sum of these terms has to be smaller than or equal to 19, okay?

Can we find a cover cut, okay? So that's going to be easy, right?

So we take, in this particular case, the coefficients in decreasing order. I made it easy for you, right? So this is 11, 6, 6. That's already 23, right? So this is greater than 19. So this is a cover, okay.

And, but, because of that, you know that x1 plus x2 plus x3 has to be smaller or

equal to 2. You can only select two of these variables if you want to have a feasible solution.

Okay? Now let's look at x3, x4, x5, and x6. Okay, if you sum all these coefficients together, once again you get 20; this is greater than 19, so this is also a cover, and this is the cover inequality that results from it, okay. The sum of these four variables has to be smaller or equal to 3, okay.

So essentially you start from an inequality like this with, you know, big coefficients, and then you have a simple cover inequality there, which is limiting the sum over a subset of the variables.

Okay? These are cover cuts, okay?
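To make the two covers above concrete, here is a small Python sketch (mine, not from the lecture) that checks them against the constraint 11x1 + 6x2 + 6x3 + 5x4 + 5x5 + 4x6 + x7 <= 19:

```python
# Constraint: 11*x1 + 6*x2 + 6*x3 + 5*x4 + 5*x5 + 4*x6 + 1*x7 <= 19
a = {1: 11, 2: 6, 3: 6, 4: 5, 5: 5, 6: 4, 7: 1}
b = 19

def is_cover(C):
    # C is a cover when its coefficients sum to more than b:
    # setting all of these variables to 1 violates the constraint.
    return sum(a[j] for j in C) > b

def cover_inequality(C):
    # A cover C gives the valid inequality: sum_{j in C} x_j <= |C| - 1.
    return sorted(C), len(C) - 1

print(is_cover({1, 2, 3}))             # 11 + 6 + 6 = 23 > 19 -> True
print(cover_inequality({1, 2, 3}))     # x1 + x2 + x3 <= 2
print(is_cover({3, 4, 5, 6}))          # 6 + 5 + 5 + 4 = 20 > 19 -> True
print(cover_inequality({3, 4, 5, 6}))  # x3 + x4 + x5 + x6 <= 3
```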

Now, you can extend them, you can make them stronger, and the key idea is that if you have a cover cut, you can extend it by taking other variables whose coefficients are greater than or equal to all the coefficients of the cover. Okay?

So this is what this formula is explaining.

So now, instead of taking the sum over the cover, you have this extended set, okay? You extend the cover C, okay, and what you do is you take C but also all the other variables, you know, whose coefficients are greater than or equal to all the coefficients of C. Okay, so by the definition of these variables, the cut will be stronger.

Let me show you an example, it is easier on the example.

So you see essentially the, the cover cut that we have seen.

We had the cut which says x3, x4, x5, and x6 has to be smaller or equal to 3. You know, you see that the coefficients of x1 and x2 are greater than all these coefficients, so you make the cut stronger by adding x1 and x2 to it. If their coefficients were smaller, you know, it would not work. But here the coefficients are always at least as large as the coefficients of the variables in the cover, okay?
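As a quick sketch of that extension rule (my code, not the lecture's): take the cover C = {3, 4, 5, 6} from before and add every variable whose coefficient is at least the largest coefficient in C, keeping the same right-hand side:

```python
# Same constraint as before: coefficients of x1..x7, right-hand side 19.
a = {1: 11, 2: 6, 3: 6, 4: 5, 5: 5, 6: 4, 7: 1}

def extend_cover(C):
    # Extended cover E(C): C plus every variable whose coefficient is
    # >= the largest coefficient appearing in C. The inequality
    # sum_{j in E(C)} x_j <= |C| - 1 keeps the original right-hand side.
    amax = max(a[j] for j in C)
    return sorted(set(C) | {j for j in a if a[j] >= amax})

C = {3, 4, 5, 6}
print(extend_cover(C), "<=", len(C) - 1)  # [1, 2, 3, 4, 5, 6] <= 3
```

Here x1 (coefficient 11) and x2 (coefficient 6) both qualify, which matches the stronger cut described above.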

So that's the stronger, extended cover. Okay, so this is the idea of branch and cut, okay? So branch and cut, you know, consists of first modeling the problem as a MIP. Okay. Then you solve the linear relaxation, and if the linear relaxation is giving you an integral solution, you're finished, okay, you have an optimal solution.

Otherwise what you do is, you basically find the polyhedral cut, okay, which cuts

the optimal solution to the linear relaxation.

That's the key step, okay? And obviously you would like to have a

facet if it's possible. And so if you find such a beautiful

thing, right? So what you're going to do is, you go

back to step 2, you solve the linear relaxation, it's stronger.

Why is it stronger? Because you cut the linear, the optimal

solution to the linear relaxation, and you iterate this stuff.

Okay, and at some point you can't find any of these cuts anymore, because you've found all of them or it's too tough to actually find them, okay. And then what you do is branch, and then you can repeat these steps; or you can at some point decide that it's not even worth generating more cuts, okay.

But this is the basic framework for branch and cut okay.
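The loop just described can be sketched as control flow. The pieces below are stubs with canned answers, standing in for a real LP solver, cut generator, and branching routine, just to show the shape of the iteration:

```python
# Branch-and-cut skeleton: solve the relaxation, separate a cut, re-solve,
# and branch once no more useful cuts are found.
def branch_and_cut(solve_lp, is_integral, find_cut, branch):
    x = solve_lp()                 # solve the linear relaxation
    while True:
        if is_integral(x):
            return x               # integral LP optimum: provably optimal
        cut = find_cut(x)          # separation: a cut violated by x
        if cut is None:
            return branch(x)       # no cut found (or not worth it): branch
        x = solve_lp()             # re-solve the strengthened relaxation

# Demo with canned answers: the first LP optimum is fractional, one cut is
# generated, and the re-solved relaxation comes back integral.
answers = iter([[0.5, 1.0], [1.0, 1.0]])
x = branch_and_cut(
    solve_lp=lambda: next(answers),
    is_integral=lambda x: all(v == int(v) for v in x),
    find_cut=lambda x: "some cover cut" if x[0] != int(x[0]) else None,
    branch=lambda x: x,
)
print(x)  # [1.0, 1.0]
```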

Now the key point here, okay, is that we have to generate these polyhedral cuts that are going to remove the optimal solution of the linear relaxation, okay? So this is what we need to do, okay?

So we are not going to do what I did in the previous lecture, which is just

listing all the possible cuts we could find.

Okay, what we want to do is, at some point, we look at this linear relaxation,

and you say, wow, okay, so I want to cut it.

Okay? So to give you an analogy, you know, you

have to think about, you know, feeding babies, right?

So so there are two types of parents, right?

So when you have, you know, a new kid.

So what you do, you have the parents that read the books and do everything by the

rule. Okay.

It's written in the book that this kid has to basically you know, be eating

every three hours. So this poor kid is sleeping but you

know, three hours has gone so they wake up this kid and feed the kid.

Right. And so then try to put the kid to sleep

and that never works of course. Okay.

So these are, you know, these are you know, essentially generating all the

cuts. Okay?

And then there are the cruel parents, right?

So I'm not saying this is better or worse, right.

But the other parents are saying: this kid is, you know, sleeping, you know, beautifully; I'm going to let this kid sleep, and then when the kid, you know, wakes up and yells, you don't know why, but you basically feed the kid. That's demand feeding.

That's what we are doing. We generate these cuts by demands.

Okay. So this is what we are going to do, and that's called the separation problem, okay. So what you know is that you have this optimal solution, okay, to the linear relaxation, and what you want to do is look for a particular cut, a particular class of cuts, that's going to remove the optimal solution to this linear relaxation, okay, so this is what we do for cover cuts, okay.

So we have this optimal solution to the linear relaxation, and we are wondering: can I find a cover cut that's going to remove this optimal solution of the linear relaxation, okay? So we know the cover inequality, okay? It is saying that this subset of variables, they can't all be one, okay, so their sum should be smaller than or equal to the cardinality of the cover minus 1. And that can be rewritten as: at least one of them must be 0, so you sum one minus the variables, okay. Each term is 1 if the variable is 0, and 0 if the variable is 1, okay. And so what you're basically saying here is that the number of variables which are 0 has to be greater than or equal to 1, okay. And so essentially finding a cover cut, okay, will require two conditions. Okay? So one of them, you see, is to find a cover; that's one of the two conditions that you have. Okay?

So when you take the sum of the elements in C, okay, the elements in the cover, the sum of the coefficients has to be greater than the right-hand side of the constraint.

Okay? And then we want to be sure, we want to have the property that in the linear relaxation, the sum of one minus these values is smaller than one, so that the relaxation solution is basically violating the cover cut, okay? So essentially what we want to find is a cover cut that is violated by the linear relaxation, okay; it is a cut, okay, a cover cut, okay?

So we want these two properties. If you find that essentially what we're

doing is generating these cuts on demand. Okay?

So you say ooh, I have this linear relaxation.

Okay? I want to find a cover cut, and that

cover cut has to violate this, has to be violated.

That's what you are trying to do. Okay?

So essentially, this is, you know, we have two constraints here, okay, so we could view that as a feasibility problem. But we can also view that as an optimization problem, because that's what linear programming does, right, or what your mixed integer programming does.

So what we do here is minimize this expression. Okay, that's the expression that we want to be smaller than one. And then we have the definition of a cover cut, so essentially what you do is you look at these variables, okay, the indices j in this particular case, okay. So you look at all of them; these are essentially, you know, the indices of the variables inside the constraint; you want to select a subset of them which gives a violated cut and minimizes this function, okay. And so now, if the value of this function is smaller than 1, you have a violated cover cut. Okay, and all the elements which are equal to 1 form the cover. So essentially if you solve this linear program, okay, this mixed integer program, okay, you have a cover cut: you find a cover cut.

Now you tell me, whoa, whoa, whoa: you guys are trying to solve a mixed integer program, and you want to solve another mixed integer program to solve that mixed integer program, and you want to do that repeatedly; you know, what are you doing?

Okay, and so let me address that question in a moment.

Okay, so but first let me give you an example okay.

So what you see here is a big constraint, I would say bigger than the other one that I showed you before; the right-hand side is 178, and you see the coefficients, you know, ranging from 45 to 125. Okay. Now, assume that the linear relaxation is giving you this optimal solution, okay. So you see that the 1st variable is 0, the 2nd one is 0, the 3rd one is 3/4, the 4th one is 1, the 5th is 1/2, and the next one is 0, okay? So how do you generate the cut?

Okay, so what is the big MIP we have to solve to find the cover cut? This is the MIP actually solving the separation problem, okay?

You minimize that expression; why is there a 1? Because essentially you take 1 minus the value in the linear relaxation, which is 0, okay? So that's why you have that first 1, and then you have the second 1 for the same reason. Then you have 1/4 for the third one. Why? Because you see essentially 3/4 here, so 1 minus 3/4 is 1/4, and that's how you get this objective function. That's essentially what we saw on the previous slide. And then, of course, you have to have a cover. Okay.

And so you want, basically, that the choice of the 0/1 values for these z's is such that, when multiplied by the coefficients, the sum is greater than 178. Okay? So essentially this is the MIP that we have to solve for finding this cut. So this is the separation problem.
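Here is that separation MIP solved by brute force. The slide's exact coefficients aren't in the transcript, so the numbers below are placeholders in the stated range (45 to 125, right-hand side 178), with the LP solution x* = (0, 0, 3/4, 1, 1/2, 0):

```python
from itertools import product

# Hypothetical knapsack-style constraint: coefficients in [45, 125], b = 178.
a = [45, 46, 79, 54, 53, 125]
b = 178
xstar = [0, 0, 0.75, 1, 0.5, 0]   # optimal solution of the linear relaxation

# Separation problem: minimize sum_j (1 - x*_j) * z_j
# subject to sum_j a_j * z_j > b, with z_j in {0, 1}.
best, best_val = None, float("inf")
for z in product([0, 1], repeat=len(a)):
    if sum(aj * zj for aj, zj in zip(a, z)) > b:      # z selects a cover
        val = sum((1 - xj) * zj for xj, zj in zip(xstar, z))
        if val < best_val:
            best, best_val = z, val

cover = [j + 1 for j, zj in enumerate(best) if zj]
print(cover, best_val)  # a violated cover cut exists exactly when best_val < 1
```

With these made-up numbers the optimum picks x3, x4, x5 with objective 0.75 < 1, so the cover cut x3 + x4 + x5 <= 2 cuts off x*, whose left-hand side is 0.75 + 1 + 0.5 = 2.25.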

Okay, if we find a solution to this, we will have found a cut which essentially separates the solution to the linear relaxation from the convex hull. Okay?

So this is, you know, I told you, this beautiful mixed integer program, and so the question that I'm asking you is, you know, does it remind you of something? Okay? It's minimizing a linear sum of 0/1 variables subject to a constraint that a weighted sum of these variables has to be greater than something.

Okay. Well, look, if I do a variable transformation, okay: if I write zj as one minus yj, okay, what do I get?

Well, this minimum is going to become a maximum. Okay. So we're going to maximize a sum of the variables. And this inequality here, you know, which is a greater-than, is going to turn into an inequality which is a smaller-than: a sum of variables with coefficients, smaller than something. So what we have there is a knapsack problem.

Okay? So it's like, you know, we have closed the loop, in a sense. We started with the knapsack, and now we are generating these cuts in branch and cut using a knapsack algorithm here.

Okay? So this is essentially the way you can do branch and cut, okay. You basically solve the linear relaxation, and then you have various classes of cuts, and then you say, ooh, can I generate a cut that's going to separate the optimal solution to the linear relaxation? And you solve this separation problem and find these cuts on demand. So, before we move to the TSP, which is going to be kind of, you know, notation-heavy, I want to go into a little bit of history here.

Okay. And we're going to talk about the, the

Seven Bridges of Konigsberg. Okay.

And that is the city where, you know, the great mathematician Euler was living. And this is a very interesting city because, as the title indicates, it has seven bridges.

Right, so let's zoom. You see the river, you know, in blue,

okay. And you see all these bridges in red, I

assume. Okay?

And so what Euler asked, you know, I think this is what he was spending his

free time doing. He said, can I find a tour which would

go, you know, which would walk over every one of these seven bridges exactly once?

Okay? So this is, for instance, the tour that you see here, okay, which is basically going over five of the bridges. Okay, we can cross five of them exactly once; can we cross all seven exactly once?

Okay? And so what Euler did, as one of the founders of graph theory, is basically represent that with this beautiful picture. Okay, right? And then he looked at that picture and said, this is really a graph.

Okay. So, I can associate a node with every one of these land masses. Okay? And now the edges between these vertices are going to be the bridges, okay?

So you see that here, you know, I have 2 bridges between these 2 land masses. Okay, and so the question was: can I have a tour that visits every one of the edges exactly once, okay? So Euler looked at this problem and said no, I can prove that this is impossible, you don't have to keep trying this thing forever.

Okay, and the way you actually prove that is by looking at every one of these vertices. Okay, usually when you walk, you go from one vertex to another, and then on to another one. So, as soon as you enter a vertex, you have to leave it, right? So that basically means that if you go over all the bridges, okay, you have to make sure that the degree of every vertex is even. Otherwise you will come in and there will be no bridge to get out of it. Right?

And so he could prove that in this particular case you could never walk exactly once over each of these seven bridges, because all these vertices here have, you know, an odd degree.
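Euler's parity argument is easy to check in code. A small sketch, using my own labels A, B, C, D for the four land masses (the lecture doesn't name them):

```python
# The seven bridges of Konigsberg as a multigraph on four land masses.
bridges = [("A", "B"), ("A", "B"), ("A", "C"), ("A", "C"),
           ("A", "D"), ("B", "D"), ("C", "D")]

degree = {}
for u, v in bridges:
    degree[u] = degree.get(u, 0) + 1
    degree[v] = degree.get(v, 0) + 1

# A closed walk using every bridge exactly once needs every vertex to have
# even degree: each visit consumes one entering and one leaving bridge.
print(degree)                                    # {'A': 5, 'B': 3, 'C': 3, 'D': 3}
print(all(d % 2 == 0 for d in degree.values()))  # False -> no such tour exists
```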

Okay? So in a sense he invented kind of these

degree constraints. And they are going to be very useful for

our next topic. Which is the traveling salesman problem.

Now, the traveling salesman problem can be seen as, you know, kind of the dual; well, it's not really dual in a mathematical sense, but it's basically the opposite of the Euler tour problem. What we want to do here is to visit, you know, all of the vertices exactly once, not the edges. Okay?

And there is a big difference between the Euler tour and the Traveling Salesman Problem: one is easy to solve, the other one is very complicated to solve. Okay?

So what we do in the TSP is we want to find the tour which visits every one of these vertices exactly once. Okay?

And so the first thing that we have to do now is to try to model this as a MIP.

Right? So when we want to do branch and cut, what you do is you model this thing as a MIP. Okay?

variable, the constraints, the objective function.

Okay? Now there are many models to do that,

right? So, and we're going to use one which is

very, very good from a computational standpoint.

But you see we've got to build it step by step.

Okay? It's going to be kind of nice.

So, the decision variables here: I'm going to be deciding whether an edge is part of the tour or not. So we're going to look at every edge. Okay? And we'll associate a decision variable, a binary decision variable, depending on whether that particular edge is part of an optimal tour or not. Okay.

And so the constraints are going to be these degree constraints. We know that if we go to a vertex, we have to get out of the vertex, okay. And since we visit the vertex exactly once, okay, it's not just that the degree has to be even; it has to be exactly two: you want to go into the vertex and you want to get out. Okay. So essentially, if you look at a vertex, exactly two of its adjacent edges have to be one.

Okay? So this is the first model. I'm going to kill you with notation now, but this is essentially the basis of the first, you know, MIP model. Okay?

So I told you I'm going to kill you with these notations, so I'm going to try to go slowly, so that you can memorize everything, okay? But essentially the decision variables are going to be very easy. You know, for every edge e, you have xe, which is a decision variable, which is zero if you don't take the edge in an optimal solution and one if you take it. Okay?

Now, V is always going to be the set of vertices, and E is always going to be the set of edges. Okay?

And then the next three notations are important. Okay? So I'm going to use delta of v, okay, as the set of edges which are adjacent to vertex v. So I'll look at the vertex, and then I'll look at all the edges which have v as one of their endpoints; that's delta of v. Okay. So when you see delta of v, it's the edges adjacent to v. Okay. So far so good?

Okay. Delta of v: the edges adjacent to v. Now, delta of S, for a set of vertices S, is a little bit more complicated. Okay, it is the set of edges which have exactly one of their endpoints inside S, okay. So essentially they are adjacent to the set S: you have a set S of vertices, and these are the edges between a vertex of S and a vertex which is not in S. Okay, so that's delta of S. So: delta of v, the set of edges adjacent to v; delta of S, the set of edges adjacent to a set of vertices S, meeting exactly one of them, okay.

And then we're going to have gamma of S, which is essentially the set of edges whose endpoints are both inside S, okay? So this is like: you look at this set of vertices, and you look at the edges in between them. You don't go outside, okay?

So these are the only notations that we're going to use. And then one more: we're going to write x of a set of edges to denote the sum of the variables associated with every one of these edges, okay. So when I say x of {e1, e2, e3, ...}, that is basically the sum of x for the first edge, plus x for the second edge, plus x for the third edge, and so on. Okay, so once we have this notation, this is how we can write a first MIP model for the TSP.
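The three notations are easy to pin down in code. A tiny sketch on a made-up 4-vertex graph (my example, not the lecture's):

```python
# Toy graph: vertices 1..4 and five edges.
E = [(1, 2), (1, 3), (2, 3), (2, 4), (3, 4)]

def delta_v(v):
    # delta(v): edges having v as one of their endpoints.
    return [e for e in E if v in e]

def delta_S(S):
    # delta(S): edges with exactly one endpoint inside S.
    return [(u, v) for u, v in E if (u in S) != (v in S)]

def gamma_S(S):
    # gamma(S): edges with both endpoints inside S.
    return [(u, v) for u, v in E if u in S and v in S]

print(delta_v(2))       # [(1, 2), (2, 3), (2, 4)]
print(delta_S({1, 2}))  # [(1, 3), (2, 3), (2, 4)]
print(gamma_S({1, 2}))  # [(1, 2)]
```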

Okay, so what we do is we minimize the length of the tour. So we take the cost of every edge and multiply it by the binary variable telling you whether you take the edge or not. Okay, so that's minimizing the cost.

What you see over there is basically the flow conservation, or the degree constraints, which are basically telling you that every time you go to a vertex you have to get out. Okay. So this is basically telling you that if you take the set of edges adjacent to the vertex v, okay, delta of v, the sum of the variables associated with these edges will be equal to 2. Okay. Take the adjacent edges, take the sum of the binary variables associated with these edges, okay; it has to be equal to two.

And then obviously, you know, the variables have to be 0 or 1, okay. So you take a traveling salesman problem, use that MIP formulation, and this is one of the things that can happen to you: okay, you are going to get a solution like this. Okay, so this is usually not a tour; what you get is a bunch of subtours, okay, tiny tours, okay.

So you have this one, okay, so you have this one.

You have this vertex going into that vertex going into that vertex and then

going back there, okay. So instead of going on to something else, it's basically going back to the first vertex of this subtour. And then you have another subtour, a third subtour, and so on. And obviously, you know, this is very

clever, right? So because you don't have to travel these

long distances between these things. Okay?

So that's terrible. We want one single tour.
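You can see this effect on a tiny made-up instance: enumerate every 0/1 edge selection that satisfies only the degree-2 constraints, and the cheapest one splits into two subtours. This is a brute-force sketch for illustration, not how a real solver works:

```python
from itertools import combinations, product

# Two clusters of three nearby cities, far apart from each other.
pts = {0: (0, 0), 1: (1, 0), 2: (0, 1), 3: (10, 0), 4: (11, 0), 5: (10, 1)}
edges = list(combinations(pts, 2))
cost = {(u, v): ((pts[u][0] - pts[v][0]) ** 2 +
                 (pts[u][1] - pts[v][1]) ** 2) ** 0.5 for u, v in edges}

best, best_cost = None, float("inf")
for x in product([0, 1], repeat=len(edges)):
    chosen = [e for e, xe in zip(edges, x) if xe]
    deg = {v: 0 for v in pts}
    for u, v in chosen:
        deg[u] += 1
        deg[v] += 1
    if all(d == 2 for d in deg.values()):   # only the degree constraints
        c = sum(cost[e] for e in chosen)
        if c < best_cost:
            best, best_cost = chosen, c

# The optimum is two triangles: a subtour on {0,1,2} and one on {3,4,5},
# because that avoids paying for the long edges between the clusters.
print(sorted(best))
```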

So what we want to do is we want to look at a particular subtour like this, and we want to say, ooh, can I forbid this subtour? Okay? So if you look at it, it's three vertices, and you ask yourself the question: how many edges can actually be selected between these vertices, okay? And obviously you can't have three, otherwise you have a subtour, okay? So the maximum number of edges here for n vertices is going to be n minus 1, right? So in this particular case it's going to be two, okay? And these are called the subtour constraints. Okay? And so what I'm showing you here is a better formulation, you know, for the TSP, where we add these subtour constraints.

Okay? So what we are basically saying here is that if you take a particular set of vertices S, okay, and then you take all the edges, you know, which are inside that set, okay, so that's gamma of S; remember, gamma of S is all the edges between vertices of S. You know that the sum of these edges, the sum of the Boolean variables associated with these edges, has to be, you know, smaller than or equal to the cardinality of S, the set of vertices, minus one. Okay, so you can't stay inside; you have to get out of there.

Okay? So that's what this is basically saying: you can have, you know, up to cardinality of S minus one edges, but you have to be able to exit. Okay? So that's what you have there. Okay?

So these are called the subtour constraints, okay? So you look at every subset and you place these particular constraints.
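A quick sketch of checking one of these constraints, with my own toy numbers: given candidate values for the edge variables, the subtour inequality for a set S says x(gamma(S)) <= |S| - 1:

```python
# Hypothetical candidate solution: a little subtour on the vertices {0, 1, 2}.
x = {(0, 1): 1, (0, 2): 1, (1, 2): 1, (0, 3): 0, (1, 3): 0, (2, 3): 0}

def subtour_violated(S):
    # x(gamma(S)): total value on edges with both endpoints inside S.
    inside = sum(xe for (u, v), xe in x.items() if u in S and v in S)
    return inside > len(S) - 1      # violated when it exceeds |S| - 1

print(subtour_violated({0, 1, 2}))  # 3 > 2 -> True: cut off this solution
print(subtour_violated({0, 1, 3}))  # 1 > 2 is false -> this S is fine
```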

Okay. But what is the issue with these subtour constraints? Well, once again, there are exponentially many of them. So you don't want to write them all down, right? Because now it's going to get kind of long, okay. So what you want to do is exactly what we did with the cover cuts: you want to generate them on demand.

Many of them you will never generate. Why? Because, you know, if you do a tour of the US, you never, you know, directly connect Orlando and Seattle and then San Francisco, right? So generating a subtour constraint for that set is completely useless; the relaxation is never going to do that. Okay?

So what you want to do is look at these subtour constraints and see which ones are violated by the linear relaxation, okay?

So you want to generate them on demand. Okay, and to do that you need a separation problem. Okay, so how do you do the separation problem? I'm going to give you an intuition, I

So we can re-express the constraint using delta of S, the set of edges adjacent to the set S. And this is what you see: I take a set S, and I look at the edges which have exactly one of their endpoints inside S, and what I'm basically saying here is that the sum of the edges where one of the endpoints is in S and the other one is outside has to be greater or

Okay? Which means, oh, I have to get in and

then I have to get out. Okay?

So that's how you can re-express it, and once you re-express it that way, the separation problem becomes a really nice combinatorial problem that can be solved in polynomial time. The key idea here is that you will build

a graph. Okay.

With the same set of vertices, the same set of edges.

Okay. And for every one of these edges, you are going to assign a weight: the weight of the edge is going to be its value in the linear relaxation. Okay. So basically you have the same graph, but now the edges get a weight, which is essentially the value of the decision variable of that edge inside the linear relaxation. And now, to separate, to find a subtour constraint, okay, that is violated, what you're going to do is try to find a min cut in that graph. That means cutting the graph into two parts, okay, such that the cut has minimal weight, okay: you minimize the total weight of the edges between these two subsets, okay?

And then, if the weight of the cut is smaller than two, you have a violated subtour constraint, right? So you have identified the subtour constraint for that subset of vertices, and you can find it and add it to the linear relaxation.

Okay, and finding these cuts, okay, finding these cuts is polynomial time. You know, the simplest algorithm solves a bunch of max-flow/min-cut problems and generates the cuts from them, but there are better algorithms to do that in practice. Okay, so now you know how to separate these subtour constraints, so you can do this during branching: you add them until you can't actually find any subtour constraint which is violated.
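Here is the separation idea as a brute-force sketch: weight each edge by its LP value and look for a cut of weight below 2. Real codes use max-flow/min-cut or specialized min-cut algorithms instead of enumerating subsets, and the LP values below are hypothetical, the two-subtour situation from before:

```python
from itertools import combinations

# Edge weights = values of the edge variables in the linear relaxation.
V = [0, 1, 2, 3, 4, 5]
w = {(0, 1): 1, (0, 2): 1, (1, 2): 1,     # subtour on {0, 1, 2}
     (3, 4): 1, (3, 5): 1, (4, 5): 1}     # subtour on {3, 4, 5}

def cut_weight(S):
    # Total LP weight of edges with exactly one endpoint in S: x(delta(S)).
    return sum(we for (u, v), we in w.items() if (u in S) != (v in S))

best_S, best_w = None, float("inf")
for k in range(1, len(V)):                 # all nonempty proper subsets
    for S in combinations(V, k):
        cw = cut_weight(set(S))
        if cw < best_w:
            best_S, best_w = set(S), cw

# If the minimum cut weight is below 2, the constraint x(delta(S)) >= 2
# is violated for that S, and we add the corresponding subtour cut.
print(best_S, best_w)
```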

Okay. Does that mean that at that point you

have a solution to the TSP? No.

Okay. So here is a solution, okay, which satisfies all the subtour constraints and all the degree constraints. Okay. You can't imagine how long it took me to generate this. Okay. And does it violate some of the subtour constraints? Well, no, it doesn't violate any subtour constraint, but it is a fractional solution. Okay?

And there are actually cycles. Okay? So what you see here, look at this part: you get a cycle here. Okay? So all the values are one half. When you sum them, okay, the sum is smaller or equal to 2. It's 1.5, so it's smaller or equal to 2, which is fine: you're not violating any of the subtour constraints, but still you have a cycle, and you have fractional values there.

So what does that mean? That means that, you know, the convex hull of the TSP is not characterized by the subtour formulation that I've shown you. We need more of these cuts.

Okay, and so, you know, it's a whole industry to actually generate very good cuts for the TSP, okay. And there are some beautiful results in that space. And so I'm going to give you one class, okay, which is very useful in practice, which is called the comb cut; comb, you know, like combing.

Okay, and so the basic intuition here is that if you look at the tour and take, you know, kind of a line across it, okay, and wonder how many of the tour's edges are intersecting it, you are going to intersect an even number of them, okay? Whatever you do, it's always going to be an even number, okay? So now you can start reasoning about these crossings, and that's what comb cuts are doing. You know, people in optimization are really creative in finding good cuts. Okay, so what is a comb cut?

Okay, so this is something that has a handle and a bunch of teeth, okay. And then you have nodes, where you have edges which are purely between elements of the handle, and then edges which are purely inside the teeth, and then you have edges between the handle and the teeth. And the key idea is that what the comb is going to do is bound the number of edges that you can have purely inside the handle, because that