1. Spring 2021 Team 4
2. Team Members
3. Instructors
4. Objective
5. Hardware
6. Software
7. Results
8. Suggestions for Further Development
Spring 2021 Team 4
Team 4 achieved multiple autonomous laps on the outdoor track using both Donkey Car and ROS.
For the final project we developed a car-following system.
Team 4's Car
- Karen Hernandez - Electrical Engineering
- Jan Schüürmann - Aerospace Engineering
- Bryant Liu - Computer Engineering
- Joe Wayne - Mechanical Engineering
The members in the picture are standing in the order listed above (from left to right).
- Mauricio de Oliveira
- Jack Silberman
- Dominic Nightingale - Mechanical Engineering
- Haoru Xue - Electrical Engineering
Develop a tracking system for our car to follow a targeted moving car. An AprilTag is placed on the back of the leading car for detection, serving as a reference point for the distance and position of the leading car with respect to our car. With the measurements obtained from the AprilTag and by implementing ROS, we control the throttle and steering of our car to follow the identified car.
Note: The tag size could be smaller, but given the quality of the camera we used a larger tag.
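The control idea described above can be sketched as a simple proportional law on the tag's measured pose. The gains, target distance, and clamping limits below are illustrative assumptions, not the team's tuned values.

```python
# Proportional follower sketch: map the AprilTag's measured pose (lateral
# offset x, forward distance z, both in meters) to steering and throttle
# commands. Gains and the 0.5 m target distance are illustrative assumptions.

def follow_commands(x, z, target_dist=0.5, k_steer=2.0, k_throttle=1.0):
    """Return (steering, throttle), each clamped to [-1, 1]."""
    steering = max(-1.0, min(1.0, k_steer * x))                      # steer toward the tag
    throttle = max(-1.0, min(1.0, k_throttle * (z - target_dist)))   # close the gap
    return steering, throttle

# Tag dead ahead at the target distance -> no correction needed.
print(follow_commands(0.0, 0.5))  # (0.0, 0.0)
```

With this convention a tag that drifts left or right produces a steering correction proportional to the offset, and a growing gap produces more throttle, saturating at full command.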
The chassis given by the instructor at the beginning of the quarter is the following:
In order to make the components work as desired, we connected them according to the wire schematic shown below. This includes all the hardware required:
For us to have flexibility when it comes to mounting hardware onto the base plate, we decided to design a plate that had mounting holes all throughout. This plate is mounted on four 2-inch stand-offs from the bottom of the chassis, such that the battery, speed controller and steering servo motor are located underneath the base plate. All other components are located on the top part of the base plate.
CAD of the base plate design
Final product of the base plate
We used a Lasercamm for cutting the 1/4" acrylic
Note: This also shows the bottom part of the 3D-printed Jetson Nano shell mounted onto the base plate. The Jetson Nano shell was not designed by the team; the link to the free online CAD is []. The only change was that we added the side mounts.
The camera shell was designed to protect the camera, with an extended cover on top so that lighting would not affect the image. The shell is attached by velcro to an elevated plate that is connected to the base plate by stand-offs. Four screw holes provide the option of a stiffer connection if necessary. It is important to mention that the camera-shell angle was different for each project of the class. The images below show the camera mount and shell used for the final project.
Camera mount CAD
Used a MakerBot Method for PLA printing & rapid prototyping
Camera stand CAD
The ideal position of the camera mount changed depending on the task we were working on. Thus, we designed a stand with adjustable height to which the camera mount is attached. The elongated holes are for the optional screw connection of the camera casing and help with centering the mount.
AprilTags provide a means of identification and 3D positioning; they are similar to QR codes, but AprilTags are designed to encode a significantly smaller amount of data.
The apriltag_ros package provides access to the AprilTag core library and makes the detected images and tag poses available over ROS topics.
Robot Operating System (ROS) is a collection of software frameworks that facilitate robot software development. For our final project we created a simple ROS package using OpenCV and AprilTags in order to be able to detect a specified tag on another 1/10 RC car and control the throttle and steering such that our car can follow the other car based on the AprilTag's position and distance. We used the Noetic distribution.
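As a rough sketch of how such a package fits together (not the team's actual code), a rospy node can subscribe to apriltag_ros detections and publish commands. The topic names, gains, and message types below are assumptions; `main()` requires a ROS Noetic environment and is intentionally not called here.

```python
# Sketch of a follower node using apriltag_ros detections (assumed topic
# names and gains, not the team's actual node).

def pose_to_commands(x, z, target_dist=0.5, k_steer=2.0, k_throttle=1.0):
    """Pure mapping from tag pose (lateral x, forward z, in meters) to
    clamped (steering, throttle) commands in [-1, 1]."""
    clamp = lambda v: max(-1.0, min(1.0, v))
    return clamp(k_steer * x), clamp(k_throttle * (z - target_dist))

def main():
    # ROS-dependent part; only runs on the car with Noetic installed.
    import rospy
    from apriltag_ros.msg import AprilTagDetectionArray
    from std_msgs.msg import Float32

    rospy.init_node("apriltag_follower")
    steer_pub = rospy.Publisher("steering", Float32, queue_size=1)
    throttle_pub = rospy.Publisher("throttle", Float32, queue_size=1)

    def on_detections(msg):
        if not msg.detections:
            return  # tag lost; a real node should slow down or stop
        # apriltag_ros nests a PoseWithCovarianceStamped inside each detection
        p = msg.detections[0].pose.pose.pose.position
        steering, throttle = pose_to_commands(p.x, p.z)
        steer_pub.publish(steering)
        throttle_pub.publish(throttle)

    rospy.Subscriber("/tag_detections", AprilTagDetectionArray, on_detections)
    rospy.spin()

print(pose_to_commands(0.0, 0.5))  # (0.0, 0.0)
```

Keeping the pose-to-command mapping as a pure function separates the control logic from the ROS plumbing, which makes it easy to test off the car.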
All files we used can be found in our Github repository.
To run:
1. Install all dependencies in the next section.
2. Make sure your current environment has ROS.
3. In the first terminal, start the ROS master:
roscore
4. In the second terminal:
roslaunch apriltag_ros continuous_detection.launch
5. In the third terminal:
roslaunch apriltag_ros throttle_and_steering_launch.launch
6. In the fourth terminal:
rosrun apriltag_ros steering.py
7. In the fifth terminal:
rosrun apriltag_ros controller2.py
OpenCV is a library that provides real-time optimized computer vision and image-processing tools. Most of its functions integrate well with other libraries, such as Matplotlib, NumPy and SciPy, for visualizing and processing data. We will be using the cv2 Python interface. OpenCV is used to detect the AprilTags.
Adafruit provides a library that facilitates writing Python code to control servos with the Adafruit 16-channel servo driver. This library provides a high-level interface over low-level PWM controls. For this project, the library is used to control the PWM signals that drive the steering servo and the electronic speed controller (ESC).
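A minimal sketch of that interface, assuming the adafruit-circuitpython-servokit package and channel assignments that may differ from the team's wiring; the pulse-width mapping is a pure helper that can be exercised without hardware.

```python
# Map a normalized steering command in [-1, 1] to a servo pulse width in
# microseconds. The 1000-2000 us range and the channel numbers below are
# assumptions, not the team's calibrated values.

def steering_to_pulse_us(cmd, min_us=1000, max_us=2000):
    cmd = max(-1.0, min(1.0, cmd))
    return min_us + (cmd + 1.0) / 2.0 * (max_us - min_us)

def drive(cmd_steering, cmd_throttle):
    # Hardware-dependent part: requires the adafruit-circuitpython-servokit
    # package and the 16-channel PWM board; only run this on the car.
    from adafruit_servokit import ServoKit
    kit = ServoKit(channels=16)
    kit.servo[1].set_pulse_width_range(1000, 2000)        # steering servo (assumed ch 1)
    kit.servo[1].angle = (cmd_steering + 1.0) / 2.0 * 180
    kit.continuous_servo[2].throttle = cmd_throttle       # ESC (assumed ch 2)

print(steering_to_pulse_us(0.0))  # 1500.0 -> servo centered
```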
To transfer image data between ROS and OpenCV, we use cv_bridge, which converts between ROS Image messages and OpenCV images.
In terms of the software, our calibration for AprilTags required us to take 50+ images of a printed-out checkerboard pattern in order to obtain an accurate camera calibration to reference. We did further calibration using the size of the AprilTags, which feeds into the program to generate accurate distance measurements. We followed the same steps listed in the video below.
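The checkerboard procedure described above roughly follows OpenCV's standard calibration pipeline. The sketch below assumes a 9x6 inner-corner board and a hypothetical image directory; the board dimensions and glob pattern are not the team's actual values.

```python
import numpy as np

# Reference 3-D points for one view of a 9x6 inner-corner checkerboard.
# The square-size units cancel out of the intrinsic parameters.
def board_points(cols=9, rows=6):
    objp = np.zeros((rows * cols, 3), np.float32)
    objp[:, :2] = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2)
    return objp

def calibrate(pattern="calib/*.jpg", cols=9, rows=6):
    # OpenCV-dependent part; the image glob pattern is hypothetical.
    import cv2, glob
    objpoints, imgpoints, size = [], [], None
    for path in glob.glob(pattern):
        gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, (cols, rows), None)
        if found:
            objpoints.append(board_points(cols, rows))
            imgpoints.append(corners)
            size = gray.shape[::-1]
    # Returns RMS error, camera matrix, distortion coeffs, rvecs, tvecs.
    return cv2.calibrateCamera(objpoints, imgpoints, size, None, None)
```

The resulting camera matrix and distortion coefficients are what the AprilTag pipeline needs to turn pixel measurements into metric distances.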
While we believe we performed the software calibration of our camera correctly, it appears our camera lens is internally shifted to the left. This shift is not a simple linear offset, so we could not easily correct it in software. Thus, in order to center the camera down the middle of the car, we were forced to rotate the camera mount to the right. This was made significantly easier by our switch to a velcro mounting system.
Ultimately the team was extremely happy with the results from our final testing. We successfully completed our original goal of an autonomous lap around the Warren Tents track, following another human-controlled car with an Apriltag on the back. The trailing car successfully finished a lap without ever crashing into a cone or going out of the designated lanes. The distance was set to be around half a meter with our throttle and steering controllers.
Observe our car detecting the Apriltag on the leading car, then adjusting the throttle and steering angle in order to minimize the distance and alignment offset between the cars.
Suggestions for Further Development
With another 1-2 weeks, the team would have liked to test this system with 2+ following cars, potentially up to the maximum of 7 following cars that we have access to in the class. While other teams' systems require extra camera/sensor hardware, our AprilTag system works with the existing camera that every team has from the previous weeks, so it should not be too difficult to implement our software on their cars, albeit with every camera needing to be calibrated. We would also have liked to fix the camera offset issue.
In addition, we would have liked to spend more time tuning the controllers for maximum performance. This would have been significantly easier with more testing time on our final overall design. Furthermore, if the team could figure out how to properly send negative throttles (for emergency braking), we would have been able to try significantly more aggressive proportional gains. On top of that, we would have liked to implement the ability for our code to recognize different tag numbers and thus assign predetermined (yet individual) parameters to our controllers, such as following distance and PID values.
We would have also liked to have added a system that would be able to combine our Apriltag following system with a more general use case of lane detection.
Of course, it would have been ideal if we could have carried out our original plan of using the Intel L515 LiDAR for more accurate depth detection, but future teams can try to get around this by first doing a fresh install of the Jetson OS (hopefully).
Finally, we would suggest that future teams check whether the GPU on the Jetson is being properly utilized, as the biggest bottleneck for us was publishing on the camera image topic. While the camera itself is capable of significantly higher FPS, we were only able to get around 15 FPS, even after converting the image to grayscale and cropping it to reduce the overall file size.
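The grayscale-and-crop reduction mentioned above can be sketched with NumPy; the 640x480 resolution and lower-half crop window are assumptions for illustration.

```python
import numpy as np

# Reduce per-frame payload before publishing: convert an RGB frame to
# grayscale, then crop to the lower half where the track and tag appear.
# The 640x480 resolution and the crop window are assumptions.

def shrink(frame):
    gray = frame.mean(axis=2).astype(np.uint8)   # 3 channels -> 1
    return gray[gray.shape[0] // 2:, :]          # keep the lower half only

frame = np.zeros((480, 640, 3), dtype=np.uint8)
small = shrink(frame)
print(frame.nbytes // small.nbytes)  # 6 -> a 6x smaller payload per frame
```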