This is the final lab, in which I integrate everything I've learned this semester to perform planning and execution within a map. The goal is to have the robot navigate through a set of waypoints in a map as quickly and as accurately as possible.
In labs 1 and 3-5, I built a working robot capable of moving and sensing distance and orientation. In lab 2, I implemented debugging capabilities using BLE (Bluetooth). In labs 6-8, I implemented closed-loop control to get the robot to move a certain distance and rotate by a certain angle. In labs 9-11, the robot learned to map its environment and localize itself within it. Using all of these features, the goal of lab 12 is to navigate through a set of waypoints within a map.
The instructions for this lab were very open-ended, so we had to come up with our own strategy for navigating the map. The map itself was the same one used in labs 9-11, and the waypoints to navigate through are shown in the picture below:
The exact coordinates (in feet) of the waypoints are:
The big-picture strategy was as follows:
To achieve this, I defined 4 functions in Arduino (this was mostly a matter of moving code from previous labs into separate functions):
The code for these functions is below:
The next step I took was to test these functions. Here are videos of the robot turning 120 degrees and 330 degrees using the pid_ori(setpoint_ori) function:
As you can see, the orientation control was very accurate.
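For readers who don't want to parse the Arduino screenshots, the core loop inside a controller like pid_ori boils down to something like the following. This is an illustrative Python sketch, not my actual robot code: the gains, the toy yaw model, and the function names here are made up for demonstration.

```python
class PID:
    """Toy PID controller (illustrative gains, not the lab's tuned values)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        # No derivative term on the very first sample.
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv


def simulate_orientation(setpoint_deg, steps=1500, dt=0.01):
    """Drive a toy yaw model (yaw rate = control signal) to the setpoint."""
    pid = PID(kp=2.0, ki=0.1, kd=0.05, dt=dt)
    yaw = 0.0
    for _ in range(steps):
        yaw += pid.update(setpoint_deg, yaw) * dt
    return yaw
```

On the real robot, `measurement` comes from the IMU (for orientation) or the TOF sensor (for position), and the controller output is mapped to motor PWM values instead of updating a simulated state.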
Next, I tested the move_distance(dist) function. Here are videos of the robot moving 50 cm forward, 100 cm forward, and 100 cm backward, as marked by the green tape on the floor:
One observation I made was that the error (actual distance traveled minus dist) grows as dist increases, so I needed to compensate for this later when executing the navigation task. Nonetheless, the position control was also very accurate. There was no need to test pid_pos(setpoint_pos) separately: it gets called inside move_distance(dist), and since move_distance(dist) worked well, I knew pid_pos(setpoint_pos) worked well too. Finally, since I had completed localization for lab 11 just the week before (and it performed reasonably well), I didn't feel the need to test localize() and decided to move on to the next task.
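One simple way to compensate for an error that grows with the commanded distance is to assume the overshoot is roughly linear, fit a scale factor, and shrink the setpoint accordingly. The sketch below illustrates the idea in Python; the numbers in the example are fabricated for illustration and are not my measured data.

```python
def fit_scale(commanded, actual):
    """Least-squares fit of actual ≈ scale * commanded (no intercept)."""
    return sum(c * a for c, a in zip(commanded, actual)) / sum(c * c for c in commanded)


def corrected_setpoint(dist_cm, scale):
    """Command dist_cm / scale so the robot actually travels ~dist_cm."""
    return dist_cm / scale


# Fabricated example numbers (NOT my measurements): robot travels ~6% too far.
scale = fit_scale([50, 100, 150], [53, 106, 159])  # scale ≈ 1.06
```

With a handful of tape-measure trials like the ones in the videos above, a fit like this turns the observed trend into a one-line correction.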
Before using localization to perform navigation, I decided to try navigating the waypoints using PID control only (i.e. using pid_ori and move_distance only). The robot would start facing rightwards. One immediate problem I ran into was that when the robot tried to get from the first waypoint to the second, the wall it was facing was about 3.4 m away. One issue with the TOF sensor on the robot is that it gives very unreliable readings when the distance it's measuring is very far (> 3 m), and these inaccurate readings made PID position control very inaccurate. When I consulted Anya about this issue, she suggested that I rotate the robot 180 degrees so that it faced the wall behind it, which was a lot closer (less than 1 meter). This was a really good idea and it worked. With that in mind, the algorithm for navigation using PID control is the following:
Algorithm translated into C++ code:
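As a language-agnostic companion to the C++ listing, the per-leg geometry works like this: given the current position and the next waypoint, compute the heading to turn to and the straight-line distance to drive, optionally flipped 180 degrees (Anya's trick) so the TOF sensor ranges off the nearer wall. This is illustrative Python with names of my own choosing, not the actual Arduino code.

```python
import math


def leg(current_xy, target_xy, reverse=False):
    """Heading (deg, 0 = facing right, CCW positive) and distance for one leg.

    With reverse=True the robot faces the opposite way and drives backward,
    so the TOF sensor can range off the nearer wall behind the direction of
    travel (the workaround for the long > 3 m leg).
    """
    dx = target_xy[0] - current_xy[0]
    dy = target_xy[1] - current_xy[1]
    heading = math.degrees(math.atan2(dy, dx))
    dist = math.hypot(dx, dy)
    if reverse:
        heading = (heading + 360) % 360 - 180  # flip 180 deg, wrap to [-180, 180)
        dist = -dist                           # negative distance = drive backward
    return heading, dist
```

Running this for each consecutive pair of waypoints yields the sequence of (rotate, drive) commands the robot executes.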
This method worked really well. I was able to get the robot to successfully navigate through the waypoints on the 3rd attempt. Having a robust PID controller (as shown in the previous videos) for both position and orientation made things a lot easier. The following is a video of a successful navigation attempt:
An honest note: I didn't expect it to fully work on the 3rd attempt, so I didn't record it from the start... In the next attempt, it worked up to the 4th waypoint but got stuck on one of the walls afterwards, so I merged the two videos together. But I swear it did navigate successfully through all of the waypoints on the 3rd attempt!
Now that I had things working using just PID control, I next wanted to get it working using localization. First, I determined the desired poses at each of the waypoints:
The robot is at a 0 degree angle when it points to the right. From there, the angle becomes more positive in the counterclockwise direction and more negative in the clockwise direction, so +90 degrees corresponds to the robot facing upwards and -90 degrees to the robot facing downwards. With that in mind, I created a list containing the desired poses at each of the waypoints:
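As a side note, this angle convention amounts to wrapping every angle into the [-180, 180) range; a small illustrative Python helper (not part of my actual lab code) makes it concrete and also gives the shortest signed turn between two headings:

```python
def wrap(angle_deg):
    """Wrap any angle into [-180, 180) under this convention."""
    return (angle_deg + 180) % 360 - 180


def shortest_turn(from_deg, to_deg):
    """Signed shortest rotation taking from_deg to to_deg (CCW positive)."""
    return wrap(to_deg - from_deg)
```

Using the shortest signed turn keeps every commanded rotation under 180 degrees, which minimizes the time spent turning between waypoints.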
The algorithm for navigation is as follows:
Translated into Python code:
Note that I had to set up commands LOCALIZE, MOVE_DISTANCE, ROTATE on the Arduino side so that the functions localize(), pid_ori(setpoint_ori), move_distance(dist) could be called from the laptop via BLE. The code for these commands is below:
Unfortunately, I could not get a successful run using localization. The problem was that the localization beliefs I got were very inaccurate. Nonetheless, I still learned a lot from trying to get navigation working using localization.
Unlike navigation using PID alone, navigation using localization requires a lot of offboard computation, which means data must be sent back and forth between the robot and the laptop. This really made me realize the value of everything we did in lab 2. BLE is a powerful tool that allows all of the heavy data processing and computation to be done offboard by a device with more computational power than the Artemis microcontroller. The Artemis did not do any heavy data processing during this lab - most of the code on it was just actuating commands (moving by a certain distance, rotating by a certain angle).
Another thing that I came to appreciate was the purpose of PID control. I realized that the entire point of using PID control is that it removes much of the trouble caused by changing external factors like battery levels, different friction coefficients for different floors, etc. A PID controller accommodates these changes (which can be considered disturbance inputs) by varying the control input.
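To make that concrete, here is a toy simulation (illustrative Python, not lab code): the same proportional controller settles at the same setpoint even when the plant gain changes by a factor of four, because the controller keeps adjusting the control input until the error vanishes. The plant gain here stands in for things like battery level or floor friction.

```python
def settle(plant_gain, setpoint=100.0, kp=0.5, dt=0.01, steps=5000):
    """Integrator plant x' = plant_gain * u under proportional control.

    plant_gain is a stand-in for battery level / floor friction: an
    open-loop command would miss the target when it changes, but the
    feedback loop compensates automatically.
    """
    x = 0.0
    for _ in range(steps):
        u = kp * (setpoint - x)   # controller adapts the input to the error
        x += plant_gain * u * dt  # plant response depends on the unknown gain
    return x
```

A weak battery (low gain) just means the controller commands a larger input for longer; the final position is essentially unchanged.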
Overall, this lab was extremely rewarding despite the long hours in the lab. I learned so much from trying to put all the pieces together from the previous labs to get the robot to perform a complex task like navigation. Everything truly came together and made me appreciate all that we did in the previous labs. Many thanks to the course staff (Prof. Petersen and all of the TAs) and my peers for the wonderful learning experience this semester!