0:01

Let's examine some more associative operations.

We have seen the definition of associativity,

as well as the consequence that for

an associative operation, two expressions with the same list of operands will

evaluate to the same result regardless of the parentheses that we insert.
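(Not from the lecture: a small Scala sketch of this observation, using integer addition as the associative operation.)

```scala
// Associativity: any parenthesization of the same operand list
// yields the same result.
val (a, b, c, d) = (1, 2, 3, 4)

val left  = ((a + b) + c) + d   // fully left-parenthesized
val right = a + (b + (c + d))   // fully right-parenthesized
val mixed = (a + b) + (c + d)   // balanced, as a parallel reduce might group it

assert(left == right && right == mixed)  // all equal to 10
```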

0:36

But to make these observations useful,

we need to have examples of relevant associative operations.

When we discuss associative operations,

it's important to differentiate associativity from commutativity.

0:51

Commutativity just means that for a binary operation,

we can swap the order of the two operands.

Now, while there are operations that are both associative and

commutative, in general these two properties are independent.

In fact, there are operations that are associative but not commutative, as well as

operations that are commutative but not associative.

And keep in mind that for the correctness of reduce,

what we need is precisely associativity.

Here are some examples of operations that are both associative and commutative.

1:28

Many of them come from operations on numbers: addition and

multiplication of mathematical integers, which we can faithfully represent

as BigInts, as well as of exact rational numbers, which we can represent,

for example, as pairs of big integers, are associative and commutative operations.

Moreover, if we perform addition and

multiplication modulo a positive integer, such as 2 to the 32,

which includes the usual arithmetic operations on the

Int and Long data types in Scala,

we also obtain operations that are both associative and commutative.
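(A quick sketch, not from the lecture: Scala's Int addition is addition modulo 2 to the 32, so associativity and commutativity survive overflow.)

```scala
// Int arithmetic is modulo 2^32: overflow wraps around,
// but associativity and commutativity still hold.
val x = Int.MaxValue
val y = 1
val z = -5

assert((x + y) + z == x + (y + z))  // same result despite overflow in x + y
assert(x + y == y + x)              // commutativity also holds
```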

Another important class of operations that are associative and commutative, and

have many other properties, are set union and intersection.

Symmetric difference of sets is also an example of such an operation.
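(A small sketch with Scala's immutable sets, not from the lecture; symmetric difference is expressed here via diff and union.)

```scala
val s1 = Set(1, 2)
val s2 = Set(2, 3)
val s3 = Set(3, 4)

// Union is associative and commutative...
assert((s1 union s2) union s3 == s1 union (s2 union s3))
assert(s1 union s2 == s2 union s1)

// ...and so are intersection and symmetric difference.
assert((s1 intersect s2) == (s2 intersect s1))
assert((s1 diff s2) union (s2 diff s1) == (s2 diff s1) union (s1 diff s2))
```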

When we manipulate collections, often the multiplicity of elements

is also important, that is, whether some element appears multiple times.

We call such a view of collections bags or multisets.

2:45

When we perform an operation such as union of bags, preserving the multiplicity

of elements, it turns out to be also both commutative and associative.
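(Not from the lecture: a sketch representing a bag as a Map from elements to multiplicities, with a hypothetical bagUnion helper.)

```scala
// A bag (multiset) as a Map from element to its multiplicity.
// Bag union adds multiplicities; it is associative and commutative.
def bagUnion(a: Map[Int, Int], b: Map[Int, Int]): Map[Int, Int] =
  (a.keySet ++ b.keySet)
    .map(k => k -> (a.getOrElse(k, 0) + b.getOrElse(k, 0)))
    .toMap

val b1 = Map(1 -> 2, 2 -> 1)
val b2 = Map(2 -> 3)
val b3 = Map(3 -> 1)

assert(bagUnion(b1, b2) == bagUnion(b2, b1))
assert(bagUnion(bagUnion(b1, b2), b3) == bagUnion(b1, bagUnion(b2, b3)))
```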

In logic, Boolean operations such as conjunction, disjunction, or

exclusive or are also both associative and commutative.

Furthermore, structured objects such as

polynomials also have commutative and associative addition and multiplication.

Addition of vectors is furthermore associative and commutative, and

so is addition of matrices of some fixed dimension.

3:33

that computes the sum of array elements raised to the power p.

Which combination of operations does this expression correspond to?

Now that we've seen operations such as map and reduce,

we can express the value of the above expression as taking the array

a, then doing map on this array with the operation

that takes each value and raises it to the power p, and then reducing the result with the plus operation.
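(A sketch of this combination, not the lecture's own code: the array a and the power p are from the discussion; math.pow works on Doubles, so a Double array is assumed.)

```scala
// Sum of array elements raised to the power p,
// expressed as a map followed by a reduce with +.
def sumOfPowers(a: Array[Double], p: Double): Double =
  a.map(x => math.pow(x, p)).reduce(_ + _)
```

For example, sumOfPowers(Array(1.0, 2.0, 3.0), 2.0) is 1 + 4 + 9 = 14.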

4:18

This is justified because plus is in fact associative. You may recall

from our implementation of array norm that we only had one traversal of the array.

And this is because this map can be, in fact, combined with reduce.

Because map simply applies a given function to every array element.

5:34

These are two distinct lists that happen to have the same length but

the order of elements is not the same.

Of course, the related example is concatenation of strings which can

in fact be viewed to some extent as lists of characters.
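(A quick sketch, not from the lecture: concatenation is associative, so reduce and reduceLeft agree, but it is clearly not commutative.)

```scala
// Concatenation is associative...
val xs = List("a", "b", "c")
assert(xs.reduce(_ ++ _) == xs.reduceLeft(_ ++ _))  // both give "abc"

// ...but not commutative: swapping the operands changes the result.
assert("ab" ++ "cd" != "cd" ++ "ab")  // "abcd" vs "cdab"
```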

6:06

In general of course, it may not be the case that B

times A is even well defined when A times B is well defined.

Analogous situation arises with composition of relations and

composition of functions.

Composition of relations collects all pairs of elements a and

c such that there exists an element b where the pair

ab belongs to the first relation and bc to the second relation.

6:35

And composition of functions can be viewed as a special case of relation composition, with

a different convention on the ordering of the operands.

For all these operations, because they are associative, reduce gives the same

result as, for example, reduceLeft, even though they are not commutative.
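(Not from the lecture: a sketch of function composition in Scala, using andThen; it is associative but in general not commutative.)

```scala
val inc: Int => Int = _ + 1
val dbl: Int => Int = _ * 2

// Associative: how we group the compositions does not matter.
val f1 = (inc andThen dbl) andThen inc
val f2 = inc andThen (dbl andThen inc)
assert(f1(3) == f2(3))  // both give 9

// Not commutative: the order of the operands does matter.
assert((inc andThen dbl)(3) != (dbl andThen inc)(3))  // 8 vs 7
```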

As a warning, there are operations that are commutative but

not associative. Here is one example.

7:08

It is easy to see that x and y play a symmetric role.

So f(x,y) is x squared + y squared whereas f(y,x) is y squared + x squared.

On the other hand, if we compute the two sides of the associativity law,

on the left side we get the square of x squared + y squared, plus z squared.

And on the other side, we have x squared + the square of y squared + z squared.

And these two expressions are easily seen to be distinct for many values of x,

y, and z, because for example, the degree of x

7:48

on the left hand side is 4, whereas it is 2 on the right hand side.
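(This example can be checked directly; a small Scala sketch with Int arguments, not from the lecture.)

```scala
// f(x, y) = x^2 + y^2 is commutative but not associative.
def f(x: Int, y: Int): Int = x * x + y * y

assert(f(2, 3) == f(3, 2))              // commutative: 13 both ways
assert(f(f(1, 2), 3) != f(1, f(2, 3)))  // (1+4)^2 + 9 = 34  vs  1 + (4+9)^2 = 170
```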

So because of such examples, it is important to keep in mind that proving

commutativity alone will not be sufficient to imply associativity, or

that reduce gives the same result regardless of the order of operations.

In general, if f is a commutative operation and

h1 and h2 are two functions of one argument,

then the function g defined by applying h1

to each of the arguments, then applying f and

then applying h2 to the result remains commutative.

Indeed, g(x,y) is equal to h2(f(h1(x), h1(y))).

But if we swap the arguments of f,

which is commutative, we obtain h2(f(h1(y), h1(x))), and

this is in fact g(y,x), so g is commutative.
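(A sketch of this construction, not from the lecture: f is +, and h1 and h2 are arbitrary unary functions, here squaring and negation, chosen only for illustration.)

```scala
// f is associative and commutative; h1 and h2 are arbitrary unary functions.
def f(x: Int, y: Int): Int = x + y
def h1(x: Int): Int = x * x
def h2(x: Int): Int = -x

// g(x, y) = h2(f(h1(x), h1(y))) is commutative...
def g(x: Int, y: Int): Int = h2(f(h1(x), h1(y)))
assert(g(2, 3) == g(3, 2))  // both -13

// ...but g need not be associative, even though f is.
assert(g(g(1, 2), 3) != g(1, g(2, 3)))  // -34 vs -170
```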

As we have seen in the previous example however,

g loses the property of associativity even when f did have it.

The previous example was an instance of this phenomenon, where

9:06

h1 was the squaring function and h2 was the identity function.

So when combining and optimizing reduce and map invocations,

it is important to keep in mind that applying such functions

to the arguments of an associative operation may lose associativity.

10:08

x is a very large positive value, and mx is a very large negative number.

Now, x + mx will be 0 and when we add e,

we will get this very small positive number e.

On the other hand, what happens when we compute things in different order,

with different parenthesis?

Well, mx + e unfortunately cannot be represented precisely.

And because we do approximation, the value of e will essentially be lost.

So this expression will evaluate to mx, and when we add x and

mx, the result will be 0.

And this is why we obtain a violation of the associativity law.
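(A concrete Scala sketch of this violation; the values x = 1e100, mx = -1e100 and e = 1.0 are assumptions matching the pattern described.)

```scala
val x  = 1e100   // very large positive value
val mx = -1e100  // very large negative value
val e  = 1.0     // small positive value

// (x + mx) + e: the large values cancel first, so e survives.
val left  = (x + mx) + e   // 1.0
// x + (mx + e): e is absorbed into mx by rounding and is lost.
val right = x + (mx + e)   // 0.0

assert(left != right)  // associativity is violated
```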

11:30

Then e times x is in fact 1,

and when we multiply it by x the result is x, which is a very large number.

On the other hand, x times x is not representable at all,

because it's too large, so the result is infinity.
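(And the multiplication case, sketched with the assumed values x = 1e200 and e its reciprocal.)

```scala
val x = 1e200    // very large, but still representable as a Double
val e = 1.0 / x  // its (approximate) reciprocal

val left  = (e * x) * x  // e * x is approximately 1, so this is about x: finite
val right = e * (x * x)  // x * x overflows to Infinity, and stays Infinity

assert(!left.isInfinity)
assert(right.isInfinity)  // a finite positive number times Infinity is Infinity
```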

11:57

Note that the previous operations were in fact commutative.

So it is interesting to think why it was possible

to make such approximate operations commutative yet

the designers did not succeed in making them associative.

Well, it turns out to be much easier to make an operation commutative.

Suppose that we have some operation g that is not necessarily

commutative, but we are happy with either of the answers g(y,x) or g(x,y).

Then all we need to do is pick some total order on the values from our domain,

such as x and y, that allows us to compare two values,

so that we always get the answer that they are equal or

that one of them is less than the other.

We can do that, for example, by comparing raw bit representations of

these values, although for every domain there are better ways to do that.

We can then define a function f such that it computes

that one of the values g(x,y) or g(y,x) for

which the first argument is less than the second.

It is straightforward to verify then that such a slightly modified

operation f is in fact commutative, even if g was not commutative.

Of course, when the two arguments are equal, then it doesn't matter

whether we compute g(y,x) or g(x,y); the result is equal to, for example, g(x,x).

And if we have that y is less than x in our ordering, then

the left side, f(x,y), is g(y,x),

because we execute the else branch,

whereas when we compute f(y,x),

we will execute the true branch,

which will lead to exactly the same invocation of g.

So regardless of what the value of g is, we will obtain the same value for

f(y,x) and f(x,y).
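(The construction can be sketched in Scala as follows, for an arbitrary binary operation g on a type with a total order; g here is a hypothetical non-commutative operation chosen for illustration, and Ints are compared with the usual order.)

```scala
// g: an arbitrary, possibly non-commutative operation.
def g(x: Int, y: Int): Int = 2 * x + y

// f orders its arguments before calling g, and is therefore commutative:
// both f(x, y) and f(y, x) lead to the same invocation of g.
def f(x: Int, y: Int): Int =
  if (x < y) g(x, y) else g(y, x)

assert(g(1, 5) != g(5, 1))  // g is not commutative: 7 vs 11
assert(f(1, 5) == f(5, 1))  // f is: both evaluate g(1, 5), which is 7
```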

So there are some ways of patching functions to make them commutative,

but we do not have such a trick for making functions associative.