Sometimes, television can do a better job of reflecting hard truths than real life.
Let's watch some video clips from the CBS series The Good Wife.
This is a law-oriented series set in Chicago.
What we want to look at is a company they feature called Chumhum,
which is a large information technology company,
sort of Google-like in terms of what it does,
and specifically a maps product that it has put out.
Let's watch.
We used to book up weeks in advance.
Our venison potpie was written up in The Trib.
We were Zagat rated, then people just stopped coming.
It's hard not to take it personally when it's your baby.
And why is Chumhum responsible?
Their maps program, Chummy Maps.
It's supposed to help users stay safe by giving them
directions to avoid the dangerous parts of the city.
Green, safe; yellow, so-so; and red means stay away.
When the filter's on, driving directions avoid the red areas.
The filter is always on by default.
You have to manually turn it off.
So, you lost your foot traffic.
And this. Watch when I toggle the filter on and off.
Doesn't show businesses in the red parts of the map.
When this came out it was like I didn't exist.
It says this is an unsafe area.
Yes. A red zone.
Doesn't feel unsafe.
Yes, except for one thing.
Too many people of color.
What we just saw is a case where there is
a real-life impact on this particular restaurant owner, on account of
something that Chumhum created in its product as a service to its user community.
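To make the mechanism described in the clip concrete, here is a minimal sketch, in Python, of a default-on "safe filter" of this kind: each area carries a color derived from some safety score, and businesses in red areas disappear unless the user manually turns the filter off. The thresholds, names, and scores are invented for illustration; this is not Chumhum's (or any real product's) implementation.

```python
from dataclasses import dataclass

@dataclass
class Area:
    name: str
    safety_score: float  # 0.0 (unsafe) .. 1.0 (safe), however it is derived

def color(area: Area) -> str:
    """Map a safety score to the green/yellow/red scheme from the clip."""
    if area.safety_score >= 0.7:
        return "green"
    if area.safety_score >= 0.4:
        return "yellow"
    return "red"

def visible_businesses(businesses, areas, safe_filter=True):
    """With the filter on (the default), businesses in red areas vanish."""
    area_by_name = {a.name: a for a in areas}
    return [
        (biz, area_name)
        for biz, area_name in businesses
        if not (safe_filter and color(area_by_name[area_name]) == "red")
    ]

areas = [Area("north_side", 0.9), Area("south_side", 0.3)]
businesses = [("Feldman's Bistro", "south_side"), ("Cafe Loop", "north_side")]

print(visible_businesses(businesses, areas))                     # Cafe Loop only
print(visible_businesses(businesses, areas, safe_filter=False))  # both appear
```

Notice that nothing in this sketch mentions race; the whole question is where safety_score comes from, which is what the next clip turns to.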
Let's look at the next clip to see what Chumhum was thinking when they did this.
Our maps are not racist.
It's run by an algorithm and math is not racist.
Chumhum is very proud of its diversity and openness.
And yet, their safe filter deters people
from patronizing businesses in African-American neighborhoods.
Chumhum is not responsible for the actions of those who respond to its software.
It is Ms. Feldman's choice to open
a four-star restaurant in a marginal neighborhood, not ours.
It is a marginal neighborhood.
It's a black neighborhood.
It's both, which really isn't that uncommon.
What?
He's not saying anything that unusual.
Look at crime rates, look at home values.
And look at the biracial lawyer pimping for a racist system.
Excuse me, are you actually [inaudible].
Let's not make this personal. Wait, wait, wait, wait, wait, hey, wait.
Chumhum will admit no wrongdoing, but in the spirit of making this world a better place,
Mr. Harman is willing to offer Ms. Feldman
$50,000 to open a new restaurant anywhere she wants.
Consider it a gesture.
Which gesture is that?
The finger? Chumhum is worth $300 billion.
I know how much my company is worth. Thank you.
Is that because your stock dropped 22% since you became COO?
Okay, you don't need to go there. Why do you need to go there?
Ah, seriously? [inaudible] You know what? You know what? No. We're done.
Your guy's a piece of work.
Unfortunately for you, the law's on his side.
We'll see.
You're suing Chumhum for tortious interference with prospective economic advantage?
Yes, Your Honor. Chumhum's safe filter creates
a racist geofencing effect that
disadvantages business owners in African-American neighborhoods.
Your Honor, tortious interference requires proof of
either purposeful or knowing interference. And there is none.
We believe there is, Your Honor.
I'd like to call a witness to show that the intent requirement is not met.
Mr. Harman, is it possible for Chumhum's safe filter to have a racist intent?
No. The filter is powered by two things and two things
only: objective third-party statistics,
crime rates for example,
and user-generated feedback.
Like comments and reviews.
Yes, exactly.
Which is entirely in the user's hands.
I mean, there's no way that anyone at Chumhum could use
the same filter to purposefully interfere with plaintiff's business.
No, not even if we wanted to.
Thank you.
Mr. Harman, is it true that Chummy Maps is only available on COS,
Chumhum's own operating system?
Yes.
This is a recent study done by the Internet Research Foundation.
Could you read what it says, right there?
COS users are 71% Caucasian.
And what percentage of COS users are African-American? Does it say?
Eleven percent.
And do those numbers match your own market research?
Chumhum is not to blame for socioeconomic disparity.
So, we saw here that the decisions this algorithm from Chumhum makes
have a real social impact.
The question is, what kinds of inputs are reasonable for Chummy Maps to consider?
If it is considering,
let's say, only crime statistics,
and just objectively saying that these are neighborhoods with higher crime,
is that okay, or is that not okay?
I don't know the answer.
This is a question of having an overall social consensus on what is acceptable.
Potentially, let's say that this is considered reasonable.
There is then a different question, which is:
is it okay for Chummy Maps to take as input what people think is a safe neighborhood,
as opposed to starting from actual crime statistics?
And now this is a subjective assessment, "I feel safe in this neighborhood,"
which depends very much on who the speaker is.
And the problem that we saw in this episode
is that the subjective assessments Chummy Maps aggregates
come from a highly skewed user population,
with the majority of the assessments coming from Caucasians,
and therefore reflect whatever racial biases there might be
in personal judgments of what counts as a safe neighborhood.
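To see how a skewed rater population can tilt a seemingly neutral average, here is a small illustration in Python. The ratings, group labels, and numbers are made up to echo the 71%/11% user split cited in the clip; this is not how any real product computes its scores.

```python
from collections import defaultdict

# Hypothetical user-generated "safety" ratings: (rater_group, neighborhood, score 1-5).
# The mix is invented to echo the skewed user base from the clip:
# most raters come from one group, and few are residents of the rated area.
ratings = [
    ("majority_user", "north_side", 5),
    ("majority_user", "north_side", 4),
    ("majority_user", "south_side", 2),
    ("majority_user", "south_side", 1),
    ("majority_user", "south_side", 2),
    ("majority_user", "south_side", 2),
    ("resident", "south_side", 4),  # the lone resident rating is drowned out
]

def average_safety(ratings):
    """Naive aggregation: every rating counts equally, so whichever
    group dominates the user base dominates the resulting score."""
    totals = defaultdict(lambda: [0, 0])  # neighborhood -> [sum, count]
    for _, neighborhood, score in ratings:
        totals[neighborhood][0] += score
        totals[neighborhood][1] += 1
    return {n: s / c for n, (s, c) in totals.items()}

print(average_safety(ratings))
# {'north_side': 4.5, 'south_side': 2.2} -> south_side gets painted "red",
# even though the only rating from a resident in this sample was a 4.
```

The arithmetic here is perfectly neutral; the skew lives entirely in who gets to supply the inputs, which is exactly the point the cross-examination about COS demographics was making.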
This kind of hidden racism in algorithms can show up in many other places.
The next clip is a riff on
an actual incident that took place with Google's image recognition a few years ago.
When he typed in his name, Jamal,
the autocomplete offered to finish the sentence with
"Jamal stole my car" or "Jamal arrest record."
Okay, this is not good, but it's non-responsive.
Really? How do you figure?
It's customer-based. I mean,
the autocomplete is based on the most-used search requests.
So, obviously, more people who typed in
Jamal were interested in arrest reports than anything else.
So, Chumhum's not racist; its users are?
Well, one could argue that it's racist to override the results.
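Pausing the clip for a moment: the mechanism the lawyer just described, surfacing the most-used search requests, really is this simple. Here is a toy sketch in Python with invented query counts; it is not Chumhum's or any real search engine's implementation.

```python
from collections import Counter

# Invented query log: the raw counts, not any editorial judgment,
# determine what gets suggested to every user.
query_log = Counter({
    "jamal stole my car": 920,
    "jamal arrest record": 870,
    "jamal smith architect": 40,
})

def autocomplete(prefix, log, k=2):
    """Return the k most frequent logged queries that start with the prefix."""
    matches = [(q, n) for q, n in log.items() if q.startswith(prefix)]
    return [q for q, _ in sorted(matches, key=lambda m: -m[1])[:k]]

print(autocomplete("jamal", query_log))
# ['jamal stole my car', 'jamal arrest record'] -- the algorithm only
# counts, but the counts encode the users' biases.
```

Whether overriding those counts is itself racist, as the clip suggests one could argue, is a policy choice layered on top of the counting. Back to the clip.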
Okay, we need to deal with the animal incident.
I don't like the sound of that.
Chumhum has photo-identifying software.
It sorts your photos into categories:
camping trip, Disneyland, graduation.
One of the customers pointed out that it had tagged photos of her friends.