[MUSIC] So this is an example that begins with a barrier. We talk about barriers a fair amount, and here the payoff is going to be really big. This is a site that was put together as part of the marketing strategy for a video game. It's a video game that takes place here in Chicago, and part of the conceit of the game is that there is all of this data out there about us as individuals that could be hacked into, tapped into, and exploited in some way or another.
>> So, with that in mind, you said a moment ago we've talked about barriers. What is the barrier?
>> The barrier here is that they're asking you to log in with Facebook. This can be very off-putting, because people are nervous about giving you access to Facebook. Facebook data is very private, personal data about you and about your network.
>> Even though Facebook is public, I can't see much of what you put on Facebook unless you let me see it, or you decide that that part of your Facebook account is open.
>> Right, but all of that data-
>> Exists.
>> Exists, and Facebook has access to that data. So by enabling this, I'm enabling this website to access that data as well.
>> All of it, my entire Facebook account.
>> Your entire Facebook account, all of the data that's associated with you. And here, they're using this in a powerful way, because this is designed to show you exactly how much is known about you.
>> Okay.
>> So here, the ask is pretty big, but the payoff is also pretty big, because it's meant to be illustrative. It's meant to show you exactly how exposed you might be.
>> So, let's go.
>> Let's do it. And it's loading all of my personal information here.
>> From Facebook?
>> From Facebook.
>> Okay.
>> And it starts to give you this view of who you are. It pulls in photographs of you from various moments in time, and it also begins to show you your network: people who interact with you more than you interact with them, and people who tag you in posts, who are considered to be liabilities.
People whose pages I visit often, but who don't necessarily reciprocate, could be my obsessions.
>> So these terms, stalkers and liabilities and scapegoats, which could be quite negative, those are not your terms, those are the program's terms?
>> Yes. And it's mirroring the game itself. It's giving you the point of view of an assassin, or someone who is looking to exploit you and your network via your personal information.
>> And we assume that assassin in the game has categorized the people that you actually love or care about into different categories. So this is imaginary, but suddenly someone you love is a stalker and someone else is a liability, and so on.
>> Yes.
>> Got it.
>> Right. So people who I interact with more than they interact with me, whom I would perhaps be obsessed with while they don't reciprocate. People who are considered to be liabilities are people who tag me in posts frequently, not necessarily with my permission. Coming on down, it has [LAUGH] an analysis of the words that you use. My most popular word is "baby." I have two of them; I apparently talk about them a lot. And what's really interesting is when it starts to get into when and where you post, so we know when you're vulnerable. It says you're most active on Monday evening from 7 to 8 PM. This is after I put my kids to bed; I'm probably at home, sitting on my couch, logging on to Facebook. So it's meant to show you that, and I think it was eye-opening for me when I first did this.
>> I'll bet.
>> Because I thought, oh wow, I didn't realize that all of this information was out there, and that every time I give a game or some other program access to my Facebook information, this is part of the package.
>> And then we know where to find you.
>> Yes, you know where to find me. This is the least accurate part, because of my settings, because this is in Georgia. This is a strip mall.
>> So this is a strip mall in Atlanta, Georgia, where you grew up.
>> Yes.
>> Not in the northern part of the US, in Evanston, Illinois, where you live.
>> Exactly.
>> Got it.
>> And it does give that only a 2% probability, and I never check in, so a lot of the geolocation data is not valid. But as we come down here, it knows my age and my occupation, and it has an estimated salary.
>> So I assume, since you're a professor at a university, it's taken some public data about professors and guessed.
>> Yes, I would hope so.
>> Mm-hm.
>> I don't have deep knowledge of exactly what type of analysis they are doing, but I do know that all of this data is coming from Facebook. So Facebook is likely the company that is doing the analysis on this.
>> So they have an algorithm that says, this is what you might be making.
>> Right.
>> And then there's this whole area about, "we know your secrets."
>> Yes, we know your secrets. They're plugging in different words that might be associated with me. And coming down here, finally, it relates back to the brand. It's saying, we know who you are, and here is the purpose behind this. And what was interesting about this piece is that this part of the story is all the way down here at the bottom. The reason that people were sharing this is the intrigue of the story, which is based around the story in the video game, but it's not really about the video game.
>> So, my payoff is that I get to see what a computer, and the Internet, and algorithms could create about me. That's my payoff.
>> Well, the payoff is showing you what information exists about you.
>> Yeah, and their payoff is that I see this anchor at the end which says, hey, the world out there is like this. Play Watch Dogs and you'll learn more, you'll have more fun, you can enter that world.
>> Yes.
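The analyses described in this exchange, a most-used-word count and a "when you're vulnerable" estimate from post timestamps, are simple frequency counts once the post data is in hand. Here is a minimal sketch over invented toy data; the stopword list and post contents are made up for illustration, and this is not the site's actual implementation:

```python
from collections import Counter
from datetime import datetime

# Toy stand-in for a user's post history: (timestamp, text) pairs.
# Real data would come from the user's account; these values are invented.
posts = [
    ("2013-05-06 19:12", "baby took her first steps today"),
    ("2013-05-13 19:45", "both kids asleep, finally on the couch"),
    ("2013-05-20 19:30", "baby photos from the weekend"),
    ("2013-05-21 08:05", "coffee before class"),
]

# Words too common to be interesting (a tiny illustrative list).
STOPWORDS = {"the", "a", "her", "on", "from", "before", "took", "both"}

parsed = [(datetime.strptime(ts, "%Y-%m-%d %H:%M"), text) for ts, text in posts]

# "Your most popular word": count every non-stopword across all posts.
words = Counter(
    w for _, text in parsed for w in text.lower().split() if w not in STOPWORDS
)
top_word = words.most_common(1)[0][0]

# "When you're vulnerable": count posts per (weekday, hour) slot.
slots = Counter((dt.strftime("%A"), dt.hour) for dt, _ in parsed)
top_day, top_hour = slots.most_common(1)[0][0]

print(top_word)             # "baby"
print(top_day, top_hour)    # Monday 19, i.e. Monday evening around 7 PM
```

With this toy history the output mirrors the transcript: the most-used word is "baby" and the most active slot is Monday at 7 PM, which is all the "we know when you're vulnerable" line amounts to.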
>> And it's really a generalized lesson, which is that, increasingly, all kinds of organizations will trade your data for something you want. [MUSIC]
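The "log in with Facebook" barrier discussed at the start of this example is, mechanically, an OAuth permission request: the button sends the visitor to Facebook's authorization dialog with a list of permission scopes the site is asking for, and the user grants or refuses them as a package. A minimal sketch of how such a dialog URL is assembled; the app ID and redirect URI below are placeholders, and the scope names are illustrative examples rather than the game site's actual request:

```python
from urllib.parse import urlencode

# Placeholder credentials -- not real values.
APP_ID = "1234567890"
REDIRECT_URI = "https://example.com/auth/callback"

def facebook_login_url(scopes):
    """Build an OAuth authorization-dialog URL for a 'Log in with Facebook'
    button. Each entry in `scopes` is one kind of data the site is asking
    the user to grant access to."""
    params = {
        "client_id": APP_ID,
        "redirect_uri": REDIRECT_URI,
        "response_type": "code",
        "scope": ",".join(scopes),
    }
    return "https://www.facebook.com/dialog/oauth?" + urlencode(params)

# The longer the scope list, the bigger the ask -- and, as in this example,
# potentially the bigger the payoff the site must offer in return.
url = facebook_login_url(
    ["public_profile", "user_posts", "user_photos", "user_friends"]
)
print(url)
```

The barrier-versus-payoff trade the speakers describe lives entirely in that `scope` parameter: it is the one place where the size of the ask is spelled out to the user before they click "allow."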