The reason is that a little randomization, which creates a few long-range links, is not enough to destroy an average quantity. Therefore you still maintain the large clustering coefficient of the regular graph.

And yet it suffices to shorten some path between any two nodes, and that is enough to reduce the shortest distance; and L is only about the shortest distance.

Most of the paths are still long, but that does not matter.
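This effect can be checked numerically. Below is a minimal pure-Python sketch of Watts-Strogatz-style rewiring; the ring size, neighbour count, and rewiring probability are illustrative choices, not values from the lecture. The point is that the clustering coefficient (an average quantity) barely moves, while the median shortest path collapses:

```python
import random
from collections import deque

def ring_lattice(n, k):
    """Ring of n nodes, each connected to its k nearest neighbours on each side."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(1, k + 1):
            adj[i].add((i + j) % n)
            adj[(i + j) % n].add(i)
    return adj

def rewire(adj, p, seed=1):
    """With probability p, replace an edge's far endpoint with a random node,
    creating a long-range shortcut, as in the Watts-Strogatz construction."""
    rng = random.Random(seed)
    n = len(adj)
    for u, v in [(u, v) for u in adj for v in adj[u] if u < v]:
        if rng.random() < p:
            w = rng.randrange(n)
            if w != u and w not in adj[u]:
                adj[u].discard(v); adj[v].discard(u)
                adj[u].add(w); adj[w].add(u)
    return adj

def avg_clustering(adj):
    """Average over nodes of (links among neighbours) / (possible links)."""
    total = 0.0
    for u, nbrs in adj.items():
        d = len(nbrs)
        if d < 2:
            continue
        links = sum(1 for v in nbrs for w in nbrs if v < w and w in adj[v])
        total += 2.0 * links / (d * (d - 1))
    return total / len(adj)

def median_shortest_path(adj):
    """Median over node pairs of the BFS shortest-path length."""
    dists = []
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        dists += [d for t, d in dist.items() if t > s]
    dists.sort()
    return dists[len(dists) // 2]

regular = ring_lattice(200, 4)
C_reg, L_reg = avg_clustering(regular), median_shortest_path(regular)
rewired = rewire(ring_lattice(200, 4), p=0.1)
C_rw, L_rw = avg_clustering(rewired), median_shortest_path(rewired)
# C_rw stays close to C_reg, while L_rw drops well below L_reg.
```

A handful of shortcuts is enough to pull the median shortest path down, while the locally dense triangles that drive the clustering average are mostly untouched.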

We now have some long-range shortcut links, going, for example, from this node to that one. So instead of hopping all the way locally, I can take one shortcut that puts me very close to the destination.

So I don't mind if the other, long paths are still there, because L is only talking about the median of the shortest paths. You may therefore wonder: since the Watts-Strogatz model works to explain the structural small world, why not define L as the median of the average distances between node pairs, rather than the median of the shortest distances?

That would be a fair definition, fair in the sense that L and C would then stand on the same footing. Well, it turns out that we don't need to.

Why not? Because we don't need all the paths between two nodes, or their average, to be short. When we route, as long as there exists a short path, a smart enough routing method is going to pick out that path. The functionality implicitly modeled here is routing, and all routing needs is some short path.

But that means beneath this definition of L lies a very basic assumption: you can actually have a routing algorithm that finds such a short path. Maybe not the shortest path, but a very short one.

Now, we just talked about a greedy routing method on a social network, which we called greedy social search, based on some composite distance metric. Suppose we run this kind of greedy routing method on a social graph: what is the expected path length? Let's denote it by little l.

Big L is talking about the existence of short paths. Little l is talking about the ability to route according to a greedy method and end up with a short path. The first is an existence result; the second is a computability, or searchability, or navigability result.

It says that local information, used in a greedy fashion, can lead you to some short path. If this little l is also small, maybe not exactly as small as big L, then we are okay, because it means we do not need to define L as the median of average distances: those short paths will be discovered by a practical, greedy, distributed routing method. The question is, is that true?

Well, it turns out to be true, as observed in many cases.
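The gap between big L and little l can be illustrated on a toy graph. The sketch below runs greedy routing on a rewired ring, where each node knows only its neighbours' ring positions; this stands in for the composite distance scores of greedy social search, and all parameters are illustrative. Greedy hops (little l) can never beat BFS shortest paths (big L), but on this graph they usually succeed and stay short:

```python
import random
from collections import deque

def small_world(n, k, p, seed=1):
    """Ring lattice with k neighbours per side, then Watts-Strogatz-style
    rewiring: each edge becomes a random long-range shortcut with prob. p."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(1, k + 1):
            adj[i].add((i + j) % n); adj[(i + j) % n].add(i)
    rng = random.Random(seed)
    for u, v in [(u, v) for u in adj for v in adj[u] if u < v]:
        if rng.random() < p:
            w = rng.randrange(n)
            if w != u and w not in adj[u]:
                adj[u].discard(v); adj[v].discard(u)
                adj[u].add(w); adj[w].add(u)
    return adj

def ring_dist(a, b, n):
    """'Geographic' distance on the ring: the only information greedy uses."""
    d = abs(a - b)
    return min(d, n - d)

def greedy_route(adj, n, s, t):
    """Hop to whichever neighbour is closest to the target on the ring.
    Returns the hop count, or None if stuck at a local minimum."""
    cur, hops = s, 0
    while cur != t:
        nxt = min(adj[cur], key=lambda v: ring_dist(v, t, n))
        if ring_dist(nxt, t, n) >= ring_dist(cur, t, n):
            return None            # no neighbour makes progress: stuck
        cur, hops = nxt, hops + 1
    return hops

def bfs_dist(adj, s, t):
    """True shortest-path length, for comparison with the greedy result."""
    dist, q = {s: 0}, deque([s])
    while q:
        u = q.popleft()
        if u == t:
            return dist[u]
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return None

n = 200
adj = small_world(n, k=4, p=0.1)
rng = random.Random(2)
pairs = [(rng.randrange(n), rng.randrange(n)) for _ in range(300)]
results = [(greedy_route(adj, n, s, t), bfs_dist(adj, s, t))
           for s, t in pairs if s != t]
# For every pair where greedy succeeds, its hop count is at least the
# BFS distance (little l >= big L), yet it stays short on average.
```

Note that whether greedy routing stays short in general depends on how the shortcuts are distributed, which is exactly the reverse-engineering question taken up next.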

Now the question is: how can we reverse-engineer that searchability result? In other words, how can we construct an explanatory model that generates an algorithmic small world?