
From MAE/ECE 148 - Introduction to Autonomous Vehicles

Revision as of 14:34, 12 June 2021

Spring 2021 Team 4

Team 4 achieved driving multiple autonomous laps on the outdoor track using both Donkey Car and ROS.

For the final project we developed a car-following system.

Team 4's Car

Library car.jpg

Team Members

  • Karen Hernandez - Electrical Engineering
  • Jan Schüürmann - Aerospace Engineering
  • Bryant Liu - Computer Engineering
  • Joe Wayne - Mechanical Engineering

The members in the picture below are standing in the order listed above (left to right).

Team4 members spring2021.jpg


Instructors

  • Mauricio de Oliveira
  • Jack Silberman

Teaching Assistants

  • Dominic Nightingale - Mechanical Engineering
  • Haoru Xue - Electrical Engineering


Develop a tracking system for our car to follow a targeted moving car. An AprilTag is placed on the back of the leading car for detection purposes; it serves as a reference point for the distance and position of the leading car with respect to our car. With the measurements obtained from the AprilTag and by implementing ROS, we can control the throttle and steering of our car to follow the identified car.
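The follow behavior described above can be sketched as a simple proportional controller on the tag measurements. The gains, target distance, and function name below are illustrative assumptions, not the team's tuned values.

```python
# Hypothetical sketch of the follow controller: proportional control on the
# tag's measured distance and lateral offset. All constants are assumptions.

TARGET_DISTANCE_M = 0.5   # desired gap to the leading car
KP_THROTTLE = 0.8         # gain on distance error (illustrative)
KP_STEERING = 2.0         # gain on lateral offset (illustrative)

def follow_control(distance_m, lateral_offset_m):
    """Map an AprilTag measurement to normalized throttle/steering in [-1, 1]."""
    throttle = KP_THROTTLE * (distance_m - TARGET_DISTANCE_M)
    steering = KP_STEERING * lateral_offset_m
    # Without a working reverse (negative) throttle, floor the command at zero.
    throttle = max(0.0, min(1.0, throttle))
    steering = max(-1.0, min(1.0, steering))
    return throttle, steering
```

With this shape of controller, a large gap raises the throttle, and once the car is at or inside the target distance the throttle drops to zero and the car coasts.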

Note: The tag could be smaller, but given the quality of our camera we used a larger tag.

Team4s21 aprilTag setup car.jpg



The chassis provided by the instructors at the beginning of the quarter is shown below:

Team4s1 empty chasis.jpg

Wiring Schematic

To make the components work as desired, we connected them according to the wiring schematic shown below, which includes all the required hardware:

Wire schematic HernandezKaren.JPG

Base Plate

For us to have flexibility when it comes to mounting hardware onto the base plate, we decided to design a plate that had mounting holes all throughout. This plate is mounted on four 2-inch stand-offs from the bottom of the chassis, such that the battery, speed controller and steering servo motor are located underneath the base plate. All other components are located on the top part of the base plate.

CAD of the base plate design

Team4s21 plate cad.png

Final product of the base plate

We used a Lasercamm to cut the 1/4" acrylic.

Note: This also shows the bottom part of the 3D-printed Jetson Nano shell mounted onto the base plate. The shell was not designed by the team; the free CAD file is available online [[1]]. Our only change was adding the side mounts.

Team4s21 plate finished.jpg


The camera shell was designed to protect the camera, with an extended cover on top so that lighting would not affect the image. The shell attaches with velcro to an elevated plate that connects to the base plate by stand-offs. Four screw holes provide the option of a stiffer connection if necessary. Note that the camera shell angle differed for each project in the class; the images below show the camera mount and shell used for the final project.

Camera mount CAD

Camera mount

We used a MakerBot Method for PLA printing and rapid prototyping.

Camera stand CAD

The ideal position of the camera mount changed depending on the task we were working on, so we designed a stand with adjustable height to which the camera mount attaches. The elongated holes allow an optional screw connection to the camera casing and help with centering the mount.

Team4s21 stand camera mount.png

Camera stand



AprilTags provide a means of identification and 3D positioning. They are similar to QR codes, but AprilTags are designed to encode a significantly smaller amount of data, which allows them to be detected more robustly and at longer range.
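One reason the tag's known physical size matters: combined with a calibrated focal length, the tag's apparent width in pixels yields its distance via the pinhole camera model. The tag size and focal length below are placeholders, not our actual values.

```python
# Pinhole-camera distance estimate from a detected tag's apparent size.
# tag_size_m and focal_px are placeholders; use your own calibration values.

def tag_distance_m(tag_size_m, focal_px, apparent_width_px):
    """distance = focal_length_px * physical_size / apparent_size_px."""
    return focal_px * tag_size_m / apparent_width_px

# e.g. a 0.16 m tag that appears 80 px wide to a camera with f = 500 px
# sits roughly 1 m away.
```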

ROS Wrapper

The apriltag_ros package provides access to the AprilTag core library and makes the detected tags and their poses available over ROS topics.


Team4s21 april tag diagram.png

ROS Package

Robot Operating System (ROS) is a collection of software frameworks that facilitate robot software development. For our final project we created a simple ROS package using OpenCV and AprilTags to detect a specified tag on another 1/10-scale RC car and control the throttle and steering so that our car follows the other car based on the AprilTag's position and distance. We used the Noetic distribution.



OpenCV is a library that provides real-time optimized computer vision and image-processing tools. Most of its functions integrate well with other libraries, such as Matplotlib, NumPy, and SciPy, for visualizing and processing data. We used the cv2 Python interface, and OpenCV handles the AprilTag detection.


Adafruit provides a library that simplifies controlling servos from Python with the Adafruit 16-channel servo driver. It offers a high-level interface over low-level PWM controls. For this project, the library drives the PWM signals that manage the steering servo and the electronic speed controller (ESC).
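As a sketch of the underlying idea (not the team's actual calibration), a normalized command maps linearly onto an RC-style servo pulse width. The 1000-2000 us endpoints with a 1500 us center are the common RC convention; actual ESC and servo endpoints may differ.

```python
# Illustrative mapping from a normalized command in [-1, 1] to a servo
# pulse width in microseconds, using conventional RC endpoints (assumed).

PULSE_MIN_US = 1000.0
PULSE_MAX_US = 2000.0

def command_to_pulse_us(cmd):
    """Map cmd in [-1, 1] to a pulse width; out-of-range commands are clamped."""
    cmd = max(-1.0, min(1.0, cmd))
    center = (PULSE_MIN_US + PULSE_MAX_US) / 2.0
    half_range = (PULSE_MAX_US - PULSE_MIN_US) / 2.0
    return center + cmd * half_range
```

The servo driver then emits this pulse on its PWM output, typically at around 50 Hz for RC servos and ESCs.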


To transfer image data between ROS and OpenCV, we used cv_bridge, which converts between ROS Image messages and OpenCV images.


All files we used can be found in our GitHub repository.



In terms of software, calibrating the camera for AprilTags required taking 50+ images of a printed checkerboard pattern to obtain an accurate camera model. We followed the steps in the video below.

{{#evu:https://www.youtube.com/watch?v=UGArg1kQwFc?rel=0 |alignment=center }}
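Once calibration yields the camera intrinsics (focal length and principal point), a detected tag's pixel column can be converted into a bearing angle for the steering controller. The fx and cx values below are placeholders, not our calibration output.

```python
import math

# Convert a feature's image column into a horizontal bearing angle using
# calibrated intrinsics. FX and CX are placeholder values (assumed), for a
# nominally 640-px-wide image; substitute your own calibration results.

FX = 500.0   # focal length in pixels
CX = 320.0   # principal point x coordinate

def bearing_rad(u_px):
    """Horizontal angle (radians) to a feature at image column u_px."""
    return math.atan2(u_px - CX, FX)
```

A positive bearing means the tag sits to the right of the optical axis, so the steering command should turn the car right to re-center it.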

Team 4 Camera Offset Spring 2021.png


Results

Ultimately, the team was extremely happy with the results of our final testing. We successfully completed our original goal of an autonomous lap around the Warren Tents track, following another human-controlled car with an AprilTag on its back. The trailing car finished the lap without ever crashing into a cone or leaving the designated lanes. The following distance was set to roughly half a meter by our throttle and steering controllers.

Final Testing

Observe our car detecting the AprilTag on the leading car, then adjusting the throttle and steering angle to minimize the distance and alignment offset between the cars.

{{#evu:https://www.youtube.com/watch?v=ejD80HDI3j8?rel=0 |alignment=center }}

Suggestions for Further Development

With another 1-2 weeks, the team would have liked to test this system with two or more following cars, potentially up to the maximum of seven cars available in the class. While other teams' systems require extra camera or sensor hardware, our AprilTag system works with the camera every team already has from the previous weeks, so porting our software to their vehicles should not be difficult, although every camera would need to be calibrated.

In addition, we would have liked to spend more time tuning the controllers for maximum performance, which would have been significantly easier with more testing time on our final overall design. Furthermore, if the team could have figured out how to properly send negative throttle (for emergency braking), we could have tried significantly more aggressive proportional gains. We would also have liked to implement recognition of different tag IDs, assigning each tag its own predetermined controller parameters, such as following distance and PID values.

We would also have liked to combine our AprilTag following system with a more general lane-detection use case.

Of course, it would have been ideal if our original plan of using the Intel L515 LiDAR for more accurate depth detection had worked out; future teams may be able to get around the issues we encountered by first doing a fresh install of the Jetson OS.

Finally, we suggest that future teams check whether the GPU on the Jetson is being properly utilized, as our biggest bottleneck was publishing on the camera image topic. While the camera itself is capable of significantly higher frame rates, we achieved around 15 FPS after converting the image to grayscale and cropping it to reduce the overall message size.
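A quick back-of-the-envelope check of why grayscale conversion and cropping help the camera topic: a raw image message scales with width x height x channels. The 640x480 resolution and half-height crop below are assumed figures for illustration.

```python
# Raw image message payload scales with width * height * channels.
# Resolution and crop fraction below are illustrative assumptions.

def image_bytes(width, height, channels):
    return width * height * channels

full_color = image_bytes(640, 480, 3)     # full-resolution BGR frame
cropped_gray = image_bytes(640, 240, 1)   # lower half only, grayscale
reduction = full_color / cropped_gray     # payload shrink factor per frame
```

Under these assumptions the per-frame payload drops by a factor of six, which directly reduces the publishing load on the image topic.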