Course on "Fast Robots", offered Spring 2022 in the ECE dept at Cornell University
This project is maintained by CEI-lab
The purpose of this lab is to equip the robot with sensors - the faster the robot can sample and the more it can trust a sensor reading, the faster it can drive. Note that this lab has two components, one for the Time-of-Flight (ToF) sensors and one for the Inertial Measurement Unit (IMU), and each component has a prelab. Be sure to start early; this lab will take considerably longer than Labs 1 and 2!
In the first half of the lab, we will set up the Time-of-Flight (ToF) sensor, which is based on the VL53L1X. Please skim the manual and check out the datasheet before beginning this lab. Note the sensor's I2C address.
The address of the sensor is hardwired on the board, which means you cannot (as is) address two sensors individually. You can either change a sensor's address programmatically (while powered) or enable the two sensors separately through their shutdown pins. Decide on and argue for the approach you want to use. Given the sensors' range and angular sensitivity, think about where you will place them on your robot to best detect obstacles in future labs. Discuss scenarios in which your robot will miss obstacles.
The purpose of the second half of the lab is to set up your IMU, which will enable your robot to estimate its orientation and angular rate of change. Note that there are several ways to compute the orientation; through the lectures and this lab you should come to understand the trade-offs of each approach.
To help you get through the lab, consider installing SerialPlot to help visualize your data.
Also, read up on the IMU. While SparkFun sells a different breakout board, their documentation gives a nice, quick overview of the functionality, and their software library works well. The ICM-20948 datasheet can be found here.
Think ahead!
While you can choose to ignore the robot in this lab, it is worth hooking up all connectors and routing the wires with their position in the robot in mind, such that you won’t have to redo too much later on. Discuss your thinking in the write-up. Sketch out a diagram of all the wires you will need to connect:
Scan the I2C channel to find the sensor: go to File->Examples->Wire and open Example1_wire. Browse through the code to see how to use I2C commands, then run it. Does the address match what you expected? If not, explain why.
The ToF sensor has three modes that optimize ranging performance for a given maximum expected range. Discuss the pros and cons of each mode, and think about which one could work on the final robot.
<pre>
.setDistanceModeShort(); //1.3m
.setDistanceModeMedium(); //3m
.setDistanceModeLong(); //4m, Default
</pre>
Test your chosen mode using the “..\Arduino\libraries\SparkFun_VL53L1X_4m_Laser_Distance_Sensor\examples\Example1_ReadDistance” example and a ruler. Document your ToF sensor's range, accuracy, repeatability, and ranging time. Check and discuss whether the sensor is sensitive to different colors and textures.
Many distance sensors are based on infrared trasmission. Discuss a couple, highlight the differences in their fuctionality and the pros/cons of each.
The ToF sensor has a timing budget that affects sensor accuracy, range, measurement frequency, and power consumption. For more detail, see Section 2.5.2 in the manual. Below are the relevant functions (both take a time in milliseconds). Try out some settings, and discuss your choice of settings in relation to the anticipated speed of your robot.
<pre>
.setIntermeasurementPeriod(periodMs);
.setTimingBudgetInMs(budgetMs);
</pre>
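For example, a faster robot might trade some maximum range for a shorter budget and a higher sample rate. A hypothetical configuration; the object name and the specific values are placeholders for you to tune, and the intermeasurement period should be at least as long as the timing budget:

```cpp
// Hypothetical fast-robot settings: a short 33 ms ranging budget,
// sampled every 40 ms (i.e., up to 25 readings per second).
// Shorter budgets reduce maximum range and accuracy; tune against
// how far ahead your robot must see at its top speed.
distanceSensor.setTimingBudgetInMs(33);
distanceSensor.setIntermeasurementPeriod(40);
```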
Not all ToF measurements will be accurate, especially when the robot is moving fast. To deal with this, the ToF sensor reports two quality checks, “sigma” and “signal”, that tell you whether the measurement you read is valid. These features are detailed in Sec. 2.5.4 in the manual. To see how often failure modes occur, run the “..\Arduino\libraries\SparkFun_VL53L1X_4m_Laser_Distance_Sensor\examples\Example3_StatusAndRate” example. Try rapidly changing the distance from the sensor to an object and describe what happens. Consider how this will impact your motion implementation in future labs.
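In later labs you will likely want to gate control decisions on measurement validity. The SparkFun example reads this via getRangeStatus(), where 0 indicates a good measurement; a fragment sketching the idea (variable names are mine):

```cpp
// Fragment: only trust the range if the status is "good" (0).
// Status 1 (sigma fail) and 2 (signal fail) correspond to the
// failure modes described in Sec. 2.5.4 of the manual.
int distance = distanceSensor.getDistance();
byte rangeStatus = distanceSensor.getRangeStatus();
if (rangeStatus == 0) {
    // safe to use `distance` in your control loop
}
```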
<pre>
// Tilt-compensated yaw from the magnetometer; pitch and roll are in radians
xm = myICM.magX()*cos(pitch) - myICM.magY()*sin(roll)*sin(pitch) + myICM.magZ()*cos(roll)*sin(pitch);
ym = myICM.magY()*cos(roll) + myICM.magZ()*sin(roll);
yaw = atan2(ym, xm);
</pre>
To demonstrate that you’ve successfully completed the lab, please upload a brief lab report (<1,000 words), with code snippets (not included in the word count), photos, and/or videos documenting that everything worked and what you did to make it happen.