We're going to take a few minutes here to talk about embedded interface components. I'd like to consider how human senses align with computer or device interactions, and compare and contrast different types of components. We'll look at some of the strengths, weaknesses, and attributes of these components and talk a little bit about how you select them. Again, this is a foundational topic; it doesn't apply directly to analysis and planning, it's an overall topic. When you're creating embedded devices, you have to really think about what your toolkit is for putting those designs together. Just like with the earlier review of cognitive psychology, it's good to plant this one early and start thinking about these basic considerations and the tools you have for putting together a device with a compelling user experience.

How does a computer or a device see us? There's a fairly famous little drawing of how a computer sees a person. The primary interfaces it provides are touch, which is often just a single finger, sight, and some sound. That's a subset of all of our senses: touch, sight, sound, taste, smell. We also have movement, speech, how we move our entire body, skin responses, and heart rate. In more modern systems, voice is becoming one of the vectors for getting a computer or device to do something we want it to do. You certainly won't see a nose or a mouth on that drawing; smell and taste don't usually come into play for most devices.

When you start to think about individual device inputs, we're talking about things like touch using physical controls or touchscreens; movement and manipulation with a tangible user interface; speech for devices that recognize it; body movement for devices that recognize it; and galvanic skin response, if we're looking for stress or temperature changes.
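As a concrete taste of what handling a physical control involves, here is a minimal sketch of software debouncing for a button or switch, one of the simplest inputs mentioned above. The sample sequence and the `debounce` helper are illustrative assumptions; on real hardware the raw reads would come from a GPIO pin.

```python
# Hypothetical sketch: debouncing a noisy physical button in software.
# The raw sample list simulates GPIO reads so the example is self-contained.

def debounce(samples, stable_count=3):
    """Return the debounced level changes from a stream of raw reads.

    A new level is accepted only after it has been seen `stable_count`
    times in a row, filtering out mechanical contact bounce.
    """
    events = []
    last_stable = samples[0]
    candidate = samples[0]
    run = 0
    for s in samples[1:]:
        if s == candidate:
            run += 1
        else:
            candidate = s
            run = 1
        if run >= stable_count and candidate != last_stable:
            last_stable = candidate
            events.append(candidate)
    return events

# A bouncy press: the line chatters between 0 and 1 before settling
# high, then chatters again before settling low.
raw = [0, 0, 1, 0, 1, 1, 1, 1, 0, 1, 0, 0, 0, 0]
print(debounce(raw))  # one clean press and release: [1, 0]
```

Even for a component as simple as a button, the firmware has to paper over the physical behavior of the part, which is one reason the strengths and weaknesses of each element are worth weighing early.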
There's also a growing look at direct interfaces to pick up thoughts, and at changes in heart rate to detect stress, anxiety, or whether you're sleeping. More traditionally, there are the outputs from devices: seeing, hearing, and sensing vibration through LEDs, sound, force feedback, and so on. Smell is not often done, taste usually doesn't matter, and temperature perhaps could be used.

This is a list of the typical elements for an embedded device interface. A lot of this discussion comes from the Designing Connected Products book that we talked about earlier. Physical controls like switches, buttons, keypads, and maybe capacitive touch on membranes. Different lights, LEDs, and bulbs, varying in color or intensity, or blinking. Different screens, possibly with a touch sensor, which are very nice because of the ability to change them on the fly and do fairly complex interactions. Audio output: speakers, buzzers, tone generators. Voice user interfaces now, like Amazon Alexa, Apple Siri, or custom applications you might come up with. Gestures or whole-body movement with things like the Microsoft Kinect. Tangible user interfaces, where devices react to how close they are to another device or how they're positioned next to it. Tactile output, where the device may actually change how it feels, or it might vibrate. Context-sensitive devices, which might change their behavior based on where they are. Then computer vision, where you're scanning codes or barcodes, or doing something else that involves an image to drive behavior. These are the typical things we have to choose from as we put our interfaces together.

What you have to do, though, is consider the strengths and weaknesses. Again, the Designing Connected Products book that I reference here has a comprehensive look at the relative strengths and weaknesses of each of these interface elements. Typically, you would look at the complexity of the information that you're trying to get across in an interaction.
You'd also consider how hard or easy it is to update that element, how hard it is to use, what limitations users might have, whether there are environmental impacts on trying to use it, and how users perceive and react to interacting with a given element. An example here is physical controls. If you think about a knob, for instance, it's very direct and very fast, fine adjustments are possible, and it's good for people with low vision. But you can't really change it with a firmware update, at least not extensively, it's hard to control externally, and there may be issues with dexterity or with the force required, say, if your user is wearing gloves. These are the things you have to think about as you look at those elements and start to put together your device interface.

The other thing that's common to consider for a device is sensor-based input. You can read through this list; most of these things you'll probably recognize. But it's stunning, really, when you think about the types of sensors that are available for devices, and how those can be combined to come up with an interface and make a device perform a user's tasks. Do give this a read, and think about how you might use various sensors in device designs of your own.

The next question is, what criteria would you use to decide which of these components or sensors to use? Part of that is very dependent on the electrical design of the system. What power do they use? Do they have low-power modes they can be placed in when they're not being used? How accurate or precise are they? What functions do they provide? How prone are they to errors? How much do they cost? How reliable are they? How do they fit the physical limitations of the device you're making? And so on.
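Criteria like these can be ranked and weighted to compare candidates systematically. One common tool for that is a Pugh matrix: each candidate is scored better (+1), same (0), or worse (-1) than a baseline design on each criterion, the scores are weighted, and the totals compared. A minimal sketch, where the candidate names, weights, and scores are all made-up illustrations:

```python
# Sketch of a Pugh matrix comparison. Candidate names, criteria weights,
# and scores are illustrative assumptions, not real parts or data.

weights = {"power": 3, "accuracy": 2, "cost": 2, "reliability": 1}

# Each candidate is scored against a baseline ("datum") design:
# +1 = better than the datum, 0 = the same, -1 = worse.
candidates = {
    "sensor_B": {"power": +1, "accuracy": 0, "cost": -1, "reliability": +1},
    "sensor_C": {"power": -1, "accuracy": +1, "cost": +1, "reliability": 0},
}

def pugh_totals(candidates, weights):
    """Weighted sum of +1/0/-1 scores; the datum totals 0 by definition."""
    return {name: sum(weights[c] * s for c, s in scores.items())
            for name, scores in candidates.items()}

print(pugh_totals(candidates, weights))
# {'sensor_B': 2, 'sensor_C': 1}
```

Here the power criterion carries the most weight, so sensor_B's power advantage outweighs its higher cost. Shifting the weights can flip the ranking, which is exactly the kind of trade-off discussion this tool is meant to provoke.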
Again, you could use these criteria to rank, weigh, and compare your design choices. Later on, when we get into design, we'll look at a tool called a Pugh matrix, where we'll do just that: rank and weigh different criteria to make these hard decisions. That's really all there is for this component discussion. Again, while this is more design-oriented, I do want you to start thinking about it now, because these issues of selection and the criteria behind them will come up throughout user experience design for embedded devices. Thank you.