One of the biggest issues you're going to deal with if you're going to start doing predictive analytics is the organizational issues. These can be politics. These can be technology. But ultimately, remember what we're doing. We're taking data, big data if you have it, and we're estimating some models. That requires technology. And then, based on those models, I'm going to start changing my organization. Guess what: some people are going to win, and some people are going to lose. You may find out that an intuition or a hypothesis you had is completely wrong. The person who has built their career on that intuition is not going to be very happy about this. So here are some of the issues you need to think about if you're going to start doing predictive analytics.

Okay, some of the common reasons why companies have trouble doing this. The first is insufficient model development. You can do data mining, and it has a lot of benefits, but ultimately what you want to know is: what leads to what? And what actions can I actually take as a company? So even if you do the data mining, you need to start peeling back the onion, which means you need to think about what the linkages are. Given my strategy, what do I think the causal business model is? How am I going to implement this? Well, I think A is going to lead to B, which is going to lead to C. You need to lay that out, right?

The other problem I've seen in a lot of companies is that they ask me, okay, what are best practices? Or they say, I saw this benchmarking model, or this generic measurement framework like the balanced scorecard, so I'm going to use that to pick my performance measures. Well, there are problems with that. Again, strategic advantage means you're doing something different from your competitors. So the last thing I want to do is benchmark myself and do exactly what they're doing.
The other problem with generic measurement frameworks is that they're generic: they say everybody should be doing this. What you want to do instead is tailor both the measures that you track and the analyses you do to the strategy of your company. If I think I'm going to compete on a different dimension, I do not want to do what the other guy did. So there's going to be a lot of up-front work, starting with your strategy, to decide: here are the analytic models I want to estimate. It's not purely statistics. And based on that, what are the actions that we as a company can take?

Another problem is the measures themselves. Measures can be good; measures can be really bad. A couple of reasons you might have bad measures come down to what are called psychometric properties. One, does the measure really pick up what you claim it's picking up? You're trying to estimate some construct, and you've got to find some way of measuring this intangible thing. Is that measure really capturing the intangible you care about? The other is: is the measure influenced by so many other things that it bounces up and down all over the place, so that I have no idea whether it means you're doing well or not?

This really becomes a problem when you start using surveys. One issue is that you may have too few questions. Like, how satisfied are you, a little or a lot, and that's the only question I ask you? That doesn't help me much afterwards, because even if I find a relationship, I don't know which dimensions of satisfaction you answered about, and I don't know what to do as a manager. Another issue with scales is too few scale points: you're using one to three, so all I learn is not satisfied, very satisfied, or in the middle. That doesn't tell you a lot. Or you use what's called a top-box measure. Say your satisfaction scale goes from one to five.
What percentage of the people are at five? That may be fine, as we saw in some of the examples. In other examples it might not be. You need to do the analysis of whether moving everybody to the top box even makes sense.

What you would really like are measures with a good signal-to-noise ratio. Signal means the measure is responsive to managerial actions: when managers take an action I like, it goes up; when they take an action that's not good, it goes down. Noise means it's not affected by all kinds of other stuff outside your company's control, just bouncing up and down. If you can pick or analyze measures that have high signal (they respond to the actions we take) and low noise (they're not driven by things outside our control), that's going to help you when you do the statistics, because it's more likely you're going to find the relationships that are really there.

Some other reasons people have trouble doing these analyses: you're measuring the wrong attributes. Again, take customer satisfaction. You can ask me whether I like the facility. I could answer that, but that question may have nothing to do with whether I'm going to come back or not. So when you do the analysis, with these measures, you have to make sure you're measuring the things that actually impact behavior. Again, that's peeling back the onion.

Some other problems are really organizational. The issues above are about whether your measures are good or bad. But some of it is the fact that it's not that we don't do analytics in companies. We do analytics, some more rigorous than others, but we do it in little pockets, or, as we call them, islands of analysis or strategy silos. Everybody's doing their own little analysis: the marketing people, the strategy people, the operations people. Nobody ever talks to each other.
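To make the signal-to-noise point concrete, here is a small simulation sketch using made-up numbers. The same true relationship between a managerial action and a measure is easy to detect when the measure is low-noise, and nearly washed out when the measure is dominated by factors outside the company's control. The variable names and noise levels are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# A managerial action that truly drives the underlying outcome.
action = rng.normal(size=n)
true_effect = 1.0 * action

# Two measures of the same underlying signal:
low_noise_measure = true_effect + rng.normal(scale=0.5, size=n)    # high signal-to-noise
high_noise_measure = true_effect + rng.normal(scale=5.0, size=n)   # low signal-to-noise

r_low = np.corrcoef(action, low_noise_measure)[0, 1]
r_high = np.corrcoef(action, high_noise_measure)[0, 1]

print(f"correlation with low-noise measure:  {r_low:.2f}")   # strong
print(f"correlation with high-noise measure: {r_high:.2f}")  # much weaker
```

With the low-noise measure, the action-outcome link shows up clearly; with the noisy measure, the very same link is hard to distinguish from random variation, which is exactly why noisy measures make the statistics fail to find relationships that are really there.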
You're all making action plans based on your own analyses, and nobody looks at how they're interdependent, or even at how an analysis in one part of the department impacts the analysis done by another part. Strategy is about tying all this together, so you need to think about how you can do that.

The other thing is you need to figure out what your intuition is in the first place. You may not realize it, but you do have hypotheses about what's going to make your company work. That's called strategy. You do have intuition that if this happens, something else is going to happen. But in a lot of cases we never actually write that down. Well, to do the analytics and figure out which model to estimate, you should write down: what is the intuition? What do we think is driving success? What are our hypotheses? Spend the time up front figuring out what relationships you want to test before you go off and start doing the statistics.

Another problem, as companies put it, is: lots of data, no information. Data is really not the problem. Think about most companies: we've got more data than we know what to do with. It's getting even more extreme now, because storage costs are so low that we save everything. But what you need are the resources and the appropriate skill sets to actually analyze it, to turn data into information. Skill sets does not just mean great statisticians. Yes, you need those. But you also need the business people who can tell the statisticians, here's what we need to estimate, here's what we need to know. Because ultimately, the predictive analytics need to be turned into action plans, and that requires some mix of business skills and statistical skills. The other thing is you need to dedicate resources to this. It's not as if most of us have spare time in our day to actually do analytics.
Where are the resources going to come from? To get them, ideally what you'd like to do up front is what I call a proof of concept. Pick off a small analytics project, do it, and show that it works. Pick an area where you're pretty sure you can learn a lot. Once you do that, you're going to start getting the resources committed. Instead of trying to analyze the whole company in one fell swoop, start out small, learn how to do it, and show proof of concept.

Okay, now we have the other issue, which is technology. You could have the greatest database in the world. We went out and bought Oracle, we bought SAP, and in theory all the data is supposed to be in this relational database. In theory. In practice, though, how you can actually access the data depends on how you coded it. Here's the problem I've had in companies. You get in there and they say, we want to link employees to customers to whether the client decided to keep our contract. Fine. You've got data on the client: did they renew or not? You've got customer satisfaction data. You've got data on employees. Well, it turns out the employee data was never coded at the client level, so I can't match it up. The customer satisfaction survey covered the whole company, but you have 14 different projects with that client, and the survey can't tell them apart.

You need to think in advance about how you're going to code the data before it goes into the database, and think forward: four or five years from now, I may want to do analytics. Make the data as granular as possible when you put it into the database. You can always roll things up; you can never roll them back down. The other thing you should check is, when somebody reports something like defect rates, is everyone defining it the same way? We've gone into companies where, and here's an example, there were two different auto plants in the same company.
Each one defined defects differently. So trying to match those up, when they didn't even define defects the same way, is going to be hard to do. That's a problem you'll run into.

Another big problem: remember, ultimately we're trying to predict financial outcomes, and most of your financial data is going to come out of your accounting system. Some of the outcomes you want to predict are not things that naturally come out of an accounting system. A good example is customer profitability. Not how profitable a product is, not how profitable a division is, but for this customer, am I making money or not? That's not naturally the way we gather data in an accounting system. So until you can start gathering financial data that way, it's going to be hard to estimate the model you want. Now, it may be that you say, I want to do this in the future, and you start tracking it that way. But until then, that is a limitation.

Finally, politics is everywhere. You need to worry about things like data fiefdoms. Data is power, and different parts of companies don't like giving it up. How can you get different functions to start sharing data, to actually do this analytics, linking non-financials to financials? I need the finance people, the marketing people, and the operations people to share data, and that's not always easy. We've had companies where we had to go to the very highest levels of the C-suite to get different functions to share the data.

The other issue is: what do you do if your intuition doesn't appear to be true? Here's somebody who has made their career on an intuition, and now you tell them, hey, [LAUGH] maybe that intuition's not right. A lot of people don't want to know the answer. I've staked my career on this relationship; it had better be there. All you can do is show me either that I'm right, which I already know.
Or you're going to tell me I'm wrong, which I don't want to know. I do not want to know the answer. So you've got to worry a bit, up front, about the politics and the organizational power issues. Is power going to shift when you start saying the money should be invested here rather than there? What's going to happen? Given those issues, this can be a little tricky. But ultimately it doesn't matter: this is where things are going. We're going to start doing analytics, like it or not. You just need to think about the politics up front.

So, to conclude, here are the key questions you want to ask. First, what's the firm's business model in the first place? Forget the analytics; start with your strategic plan. Ask yourself how, specifically, this company or this business unit is expected to create value for the organization. Lay out that causal business model: how is A expected to lead to B, and B to C? Once you've done that, you can ask: if that's the model I want to test, what data do we currently have available to test these value propositions? As I said, for most companies data is not the problem; information is. My recommendation is, don't reinvent the wheel, and don't start tracking new data. Try to find data you already have in the organization that's close enough to what you need, and start testing these relationships; there's probably something there already. Then ask yourself, what is the desired economic outcome you care about? It doesn't have to be profits. It could be revenues, or revenue growth. It could be, if you look at contracts, did we win the contract or not? It could be retention rates. What are the economic outcomes? Based on that, you can start gathering the data and doing the analysis.
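To make the A-leads-to-B-leads-to-C idea concrete, here is a minimal sketch of testing each link of a hypothesized causal chain with simple regressions. The chain (employee engagement to customer satisfaction to revenue), the variable names, and the simulated data are all illustrative assumptions, not a prescription; in practice each variable would come from your own measures.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated data for a hypothetical chain: engagement -> satisfaction -> revenue.
engagement = rng.normal(size=n)
satisfaction = 0.6 * engagement + rng.normal(scale=0.8, size=n)
revenue = 0.5 * satisfaction + rng.normal(scale=0.8, size=n)

def ols_slope(x, y):
    """Slope of a simple least-squares regression of y on x."""
    xc = x - x.mean()
    return (xc @ (y - y.mean())) / (xc @ xc)

# Test each hypothesized link in the chain separately.
b_link1 = ols_slope(engagement, satisfaction)  # A -> B
b_link2 = ols_slope(satisfaction, revenue)     # B -> C

print(f"engagement -> satisfaction slope: {b_link1:.2f}")
print(f"satisfaction -> revenue slope:    {b_link2:.2f}")
```

The point is simply that once you have written the chain down, each arrow becomes a testable relationship; if one of the estimated links is near zero, that part of your intuition is not supported by the data.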
A really big question to ask yourself: what's the appropriate unit of analysis? It may not be the whole firm; in fact, it may be very hard to do this at the firm level. Is it an office? A plant? A region? As you've seen in some of the other videos, it could be customers, where we do customer analytics. It could be a product or service, or a program or initiative. For the analysis you're trying to do, figure out what the appropriate unit of analysis is. It doesn't have to be down at the individual customer, and it doesn't have to be the whole company in total. Depending on the question you want the analytics to answer, that can change.

And finally, you'd like to make this an ongoing process, not: we did it once, here's the answer we got. Strategies change, competitors change, the business world changes. You need to keep doing the analytics to ask, does this relationship still exist? How can you set up an organizational mechanism to ensure this ongoing analysis?

To conclude, let me give you an example of what some companies are doing. They set up quarterly meetings with their high-level executives. In each of those meetings, they lay out some hypotheses they would like tested by the next meeting. The analytics group goes off, tests those hypotheses, and presents the results the next quarter. I guarantee you that when you present the results, a whole bunch of new questions come up, which lead to the next set of hypotheses to test. By doing that, you've got an ongoing mechanism for updating the results and, again, peeling back this onion until you get to the point where you can start answering these questions, and ensuring that predictive analytics becomes embedded in your company, as opposed to becoming a one-off exercise that you do maybe this year, maybe next year. No, this is an ongoing process.
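The unit-of-analysis choice connects back to the earlier advice to store data as granularly as possible: if transactions are coded with keys for customer, office, and region, you can roll them up to whichever unit a given question calls for. Here is a small sketch with made-up data and hypothetical column names.

```python
import pandas as pd

# Hypothetical transaction-level data, coded with keys for several
# candidate units of analysis (customer, office, region).
tx = pd.DataFrame({
    "customer": ["c1", "c1", "c2", "c3", "c3", "c4"],
    "office":   ["o1", "o1", "o1", "o2", "o2", "o2"],
    "region":   ["east", "east", "east", "west", "west", "west"],
    "revenue":  [120.0, 80.0, 200.0, 50.0, 75.0, 300.0],
})

# Because the data were stored at the most granular level, we can roll
# them up to whichever unit of analysis the question calls for.
by_customer = tx.groupby("customer")["revenue"].sum()
by_office = tx.groupby("office")["revenue"].sum()
by_region = tx.groupby("region")["revenue"].sum()

print(by_customer)
print(by_office)
print(by_region)
```

Had the data been stored only as regional totals, the customer-level and office-level views would be unrecoverable: you can always roll up, but you can never roll back down.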
As the world changes faster and faster, you need to set up an ongoing mechanism. And with that, I'd like to thank you for listening. I hope you can take some of the things I've taught here and use them in your companies, because predictive analytics can be an incredibly powerful tool, both for figuring out how your strategy is working and, more importantly, for figuring out where the biggest financial payback is when you start linking these non-financial measures to the financial measures.