In this video, we continue discussing object interaction. Previously, we introduced the different levels of user input supported by the VR systems currently on the market. Most high-end VR systems track both the user's hand rotation and hand position (HR and HP), as well as the rotation and position of the pair of controllers held in the hands (CR and CP). These controllers also normally come with some basic 2D user interface (2DUI) elements such as buttons and joysticks.

Another way to look at how these VR systems support user input is to describe their tracking capability in terms of degrees of freedom (DoF). When the position of an object is fully tracked in 3D space, that is, along all of its x, y and z axes, so there is no ambiguity about where the object is, we say it is position-tracked with three degrees of freedom. The same applies to rotation tracking: when the orientation of an object is fully tracked, covering pitch, roll and yaw, we say it is rotation-tracked with three degrees of freedom.

We can also add these two types of degrees of freedom together. When an object's position and orientation are each tracked with three degrees of freedom, we say the object is tracked with six degrees of freedom. Using this terminology, we can describe user input in a high-end VR system as follows: the user's hands are tracked with six degrees of freedom, and they hold two controllers that carry a simple 2DUI and are also tracked with six degrees of freedom. This is an important starting point for designing object interaction in VR, as most tasks in this area rely on the system's ability to track our hands.

Besides the controllers that come with current standard VR systems, there are other motion-tracking devices we can use to track the hands. These include the Polhemus magnetic tracker, markerless systems such as Leap Motion, and several VR gloves currently on the market.
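To make the counting concrete, here is a minimal sketch in Python of a 6DoF pose: three positional axes (x, y, z) plus three rotational axes (pitch, yaw, roll). The class name `Pose6DoF` and the field layout are my own illustrative choices, not part of any particular VR SDK.

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    """A fully tracked pose: 3 position DoF + 3 rotation DoF = 6 DoF."""
    # Position: tracked along all three spatial axes (3 DoF).
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    # Orientation: tracked around all three rotation axes (3 DoF).
    pitch: float = 0.0
    yaw: float = 0.0
    roll: float = 0.0

    @property
    def dof(self) -> int:
        # 3 position axes + 3 rotation axes.
        return 3 + 3

# A high-end VR setup in these terms: two hands and two controllers,
# each tracked with full position and rotation (6 DoF each).
hand_left, hand_right = Pose6DoF(), Pose6DoF()
controller_left, controller_right = Pose6DoF(), Pose6DoF()
```

A system that only tracked rotation (as some early mobile headsets did for the head) would correspond to just the three rotational fields, i.e. 3DoF tracking.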
Both Leap Motion and VR gloves track not only the hands but also detailed finger movements, which enables more expressive ways to interact with objects, much as we do in real life. VR gloves also normally come with built-in force-feedback hardware, which can produce effects similar to the vibration feedback generated by the VIVE controllers.

There are many other tracking and force-feedback devices on the market, and we should see many more coming soon. However, until any of them becomes an integrated part of a popular VR system, it is probably not a good idea to design object interactions that depend on a specific device: you would risk losing the mass market, where the majority of users do not have access to it.

Let's take a look at how user input and system feedback are used in one of the basic object interaction tasks: object selection. When selecting an object, the user should first be able to indicate the object of interest, either by touching it with their hands or, if the object is out of reach, by pointing at it. But this alone is not enough; even in real life, we often touch or point at objects accidentally, without any intention to interact with them. So the user should also be able to confirm their selection, for example by pressing a button or by using their voice.

Finally, just as we expect to feel an object when we grasp it in the real world, the system should provide some kind of feedback, ideally haptic feedback such as vibration. If that is not possible, a change of color or a sound can be used instead.
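The selection task described above can be sketched as a small state flow: indicate (touch if in reach, otherwise point), then confirm (button or voice), then give feedback (haptic if available, otherwise visual or audio). The names `ObjectSelector`, `Feedback`, and the method signatures below are hypothetical, chosen only to illustrate the three steps; they do not come from any real VR API.

```python
from enum import Enum, auto

class Feedback(Enum):
    VIBRATION = auto()  # preferred: haptic feedback
    COLOR = auto()      # fallback: visual highlight
    SOUND = auto()      # fallback: audio cue

class ObjectSelector:
    """Sketch of the selection flow: indicate, then confirm, then feed back."""

    def __init__(self, haptics_available: bool):
        self.haptics_available = haptics_available
        self.indicated = None
        self.selected = None

    def indicate(self, obj, in_reach: bool) -> str:
        # Touch the object if it is within arm's reach,
        # otherwise point at it (e.g. with a ray from the controller).
        self.indicated = obj
        return "touch" if in_reach else "point"

    def confirm(self):
        # Touching or pointing alone is not a selection: the user must
        # confirm explicitly, e.g. by a button press or a voice command.
        if self.indicated is None:
            return None
        self.selected = self.indicated
        # Prefer haptic feedback; fall back to a visual cue if unavailable.
        return Feedback.VIBRATION if self.haptics_available else Feedback.COLOR

sel = ObjectSelector(haptics_available=True)
sel.indicate("teapot", in_reach=False)  # out of reach, so we point
fb = sel.confirm()                      # selection confirmed, haptic feedback
```

Separating indication from confirmation is what prevents the accidental-touch problem: an object can be indicated and abandoned freely, and nothing is selected until the explicit confirm step.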