Welcome back to the second Programming & Simulation lecture. This week I will talk about differential drive, odometry, and infrared sensors. In fact, this week's programming assignment is focused on implementing the robot's sensors and actuators. First, you will need to transform from the unicycle dynamics used in the controllers to the differential drive dynamics of the mobile robot. Second, you will need to use the wheel encoders to keep track of the robot's position and orientation; this is called odometry. And third, you'll need to use the infrared sensors to measure the distance to any obstacles that might be around the robot.

The simulator emulates the QuickBot, and the QuickBot is defined in the Quickbot.m file. This is a class, and it's located in the robot package. Packages in MATLAB are denoted by a plus sign in front of the folder name. This robot has a two-wheel differential drive, meaning that each wheel is independently controlled by its own motor. It also has a wheel encoder for each of the wheels. These wheel encoders are responsible for measuring the distance that each wheel travels, and they have a resolution of 32 ticks per revolution. The robot also has five infrared sensors, and these infrared sensors can detect any obstacles within a range of four to thirty centimeters in their field of view.

Here is a little diagram to explain how the simulation actually works. First of all, there is the simulator, and it has a timer object. At every time step, this timer object calls the execute function inside the supervisor. The supervisor is really the decision maker: it is responsible, first of all, for gathering the sensor information from the robot. It then selects a controller and passes the sensor information, along with an estimate of the robot's position and orientation, to the execute function in the controller.
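To make the flow of one time step concrete, here is a minimal sketch of the supervisor loop just described, together with the standard unicycle-to-differential-drive conversion. The assignment itself is written as MATLAB classes; this is just an illustrative Python sketch, and all of the names (supervisor_execute, read_ir_sensors, set_wheel_speeds, and so on) are hypothetical, not the simulator's actual API.

```python
def supervisor_execute(robot, controller, state_estimate, dt):
    """One time step of the supervisor, as described in the lecture.
    All names here are illustrative, not the simulator's actual API."""
    # 1. Gather sensor information from the robot.
    ir_raw = robot.read_ir_sensors()

    # 2. Pass the sensor information and the current pose estimate to the
    #    selected controller, which returns unicycle inputs (v, w):
    #    linear velocity v and angular velocity w.
    v, w = controller.execute(state_estimate, ir_raw, dt)

    # 3. Convert (v, w) to left/right wheel speeds and send them back.
    vel_l, vel_r = uni_to_diff(v, w, robot.wheel_radius, robot.wheel_base)
    robot.set_wheel_speeds(vel_l, vel_r)

    # 4. The supervisor also updates its pose estimate from the wheel
    #    encoders (odometry) before the next time step.

def uni_to_diff(v, w, R, L):
    """Standard differential-drive transformation, where R is the wheel
    radius and L the distance between the wheels. It inverts
        v = R * (vel_r + vel_l) / 2
        w = R * (vel_r - vel_l) / L
    to get the right and left angular wheel velocities."""
    vel_r = (2 * v + w * L) / (2 * R)
    vel_l = (2 * v - w * L) / (2 * R)
    return vel_l, vel_r
```

For example, pure forward motion (w = 0) gives equal wheel speeds, while pure rotation (v = 0) gives equal and opposite ones.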
This execute function in the controller then computes the appropriate linear and angular velocity of the robot, which is then converted to the left and right wheel speeds and passed back to the robot. The supervisor is also responsible for updating the estimate of the robot's position and orientation, and this whole process is repeated at every time step. The supervisor itself is defined in the QBSupervisor.m file, whereas all of the controllers are found in the controller package.

The first part of this assignment is to convert the linear and angular velocity in the controllers to the left and right angular wheel velocities on the mobile robot. You'll do this in the uni_to_diff function, which is defined in the DifferentialDrive.m file. This function takes in the linear and angular velocity of the robot and returns the left and right wheel speeds. The skeleton code shows you how to retrieve the radius of the wheel, R, and the distance between the two wheels, L, which you need in order to calculate the left and right wheel speeds.

The second part of this assignment is odometry. What you need to do is measure the distance traveled by each wheel and estimate the position and orientation of the robot using those measured distances. You will implement this in the update_odometry function, which is inside the QBSupervisor.m file. Let's take a look at the skeleton code for this function. First, you will retrieve the wheel encoder ticks from the robot; then you will recall the previous wheel encoder ticks, which were stored in the previous iteration of the code. Then you will retrieve the current state estimate, meaning the current estimated position and orientation of the robot. Then you will retrieve the useful constants, such as the radius of the wheel, the distance between the wheels, and
how many radians there are per tick of the encoder. Then your part comes, which is to implement odometry by computing the difference in the x position, the difference in the y position, and the difference in the orientation between now and the last time the odometry was updated. You will then add these deltas to the current estimate of the position and orientation and save that as the new estimate. You will also need to save the current number of ticks on the right and left wheels for the next iteration.

The third part of this assignment is to make sense of the IR distance sensors. In particular, these sensors return voltages in the range from 0.4 to 2.75 V, and those correspond to distances between 4 and 30 centimeters. The relationship between distance and voltage is shown in the graph on the left. Now, to make matters a little bit more complicated, the robot actually returns an integer value in the range of 200 to 1,375. These values simply correspond to the voltage divided by two and then multiplied by 1,000. Since the graph on the previous slide is not linear, we need to use the polyfit function to find a fifth-order polynomial that fits the data. We will then use the coefficients from that fit, together with the polyval function, to convert from the integers to an actual distance in meters. You will do this in the get_ir_distances function. This function is defined in Quickbot.m, and you need to implement two specific parts. First, you need to properly convert from the integers to the voltage, and then you need to hard-code the coefficients that were output by the polyfit function.

To help you test the assignment, we've included a go-to-angle controller. This is a P-regulator, which will simply steer the robot to a specified angle.
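The odometry computation described a moment ago can be sketched as follows. Again, the actual assignment code is MATLAB inside QBSupervisor.m; this is a language-neutral Python sketch of the math, with illustrative names. The 32 ticks per revolution matches the encoder resolution quoted earlier; R and L stand for the wheel radius and the distance between the wheels.

```python
import math

def update_odometry(x, y, theta, ticks_l, ticks_r,
                    prev_ticks_l, prev_ticks_r,
                    R, L, ticks_per_rev=32):
    """Sketch of the odometry update described in the lecture.
    Returns the new pose estimate (x, y, theta)."""
    m_per_tick = 2 * math.pi * R / ticks_per_rev

    # Distance traveled by each wheel since the last update.
    d_l = m_per_tick * (ticks_l - prev_ticks_l)
    d_r = m_per_tick * (ticks_r - prev_ticks_r)

    # Distance traveled by the robot center, and the deltas in
    # x, y, and orientation since the last update.
    d_c = (d_l + d_r) / 2
    dx = d_c * math.cos(theta)
    dy = d_c * math.sin(theta)
    dtheta = (d_r - d_l) / L

    # Add the deltas to the current estimate; the atan2 trick keeps
    # the new orientation in (-pi, pi].
    theta_new = math.atan2(math.sin(theta + dtheta),
                           math.cos(theta + dtheta))
    return x + dx, y + dy, theta_new
```

As a sanity check, one full revolution of both wheels (32 ticks each) moves the robot straight ahead by the wheel circumference, 2*pi*R, while equal and opposite tick counts rotate it in place.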
If you wish to change the angle to which the robot steers, you need to go into the constructor in QBSupervisor.m and change this variable right here: obj.theta_d. I have it set to pi divided by four, so we'd expect the robot to drive off at a 45-degree angle with respect to the x-axis.

So let's see this in action. I'm going to hit play, and what we should see is the robot driving at a 45-degree angle. I'm going to move the graph out of the way, and we're going to follow this robot. As you can see, it's driving at 45 degrees with respect to the x-axis. It's going to just continue driving at this angle until it crashes into the wall. There you go, we found a crash. You'll see in the command window that I've actually been printing out the estimated pose from the odometry at every time step, and you can see that the values I'm getting for the angle are close to pi divided by four. So this controller successfully steers the robot toward that angle.

My tips for this week are, first of all, to make sure you read the section on week two in the manual, which has more details than you find in these slides. Also, use the commented-out fprintf statements that I've included in the code, or maybe add some more of your own. This is really useful for debugging, because, for example, when you're implementing the odometry, you want to print out what you're currently estimating the position of the robot to be. You could easily see that my robot drove at 45 degrees, but I also needed to print the estimates out to the command line to make sure those values were correct. And that's it for this week.
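Finally, here is a sketch of the IR pipeline described earlier: raw integer, to voltage, to distance in meters, using a fifth-order polynomial fit. The assignment hard-codes MATLAB polyfit coefficients inside get_ir_distances; this sketch uses NumPy's equivalents (np.polyfit, np.polyval), and the calibration points below are made up to stand in for the real voltage-to-distance curve shown on the slide, which is what you would actually fit.

```python
import numpy as np

# Hypothetical (voltage, distance-in-meters) calibration samples standing in
# for the real QuickBot IR curve from the slide. Six points with a degree-5
# fit means the polynomial passes through them exactly.
voltages  = np.array([2.75, 2.0, 1.25, 0.8, 0.55, 0.4])
distances = np.array([0.04, 0.06, 0.10, 0.15, 0.22, 0.30])

# Fit a fifth-order polynomial distance = p(voltage), as with MATLAB's
# polyfit; in the assignment you would hard-code these coefficients.
coeffs = np.polyfit(voltages, distances, 5)

def ir_raw_to_distance(raw):
    """Convert the robot's integer reading (200..1375) to meters.
    raw = (voltage / 2) * 1000, so voltage = raw / 500."""
    voltage = raw / 500.0
    return np.polyval(coeffs, voltage)
```

With these stand-in points, a raw reading of 1375 (2.75 V) maps to about 4 cm and a reading of 200 (0.4 V) to about 30 cm, matching the sensor's stated range.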