From MAE/ECE 148 - Introduction to Autonomous Vehicles

Team Members

  • Trung Le - Electrical Engineering
  • Stephen Zhu - EE Extension
  • Josiah Engstrom - Mechanical Engineering
  • Ahmed Cheruvattam - Mechanical Engineering

Final Project Overview

For our final project, we decided to have our robocar follow and mimic the trajectory of a human-operated drone using ROS2, OpenCV, HSV color calibration on the OAK-D camera, the lane detection package, and a PID controller. In essence, we attached a yellow marker to our drone and calibrated our robocar's depth camera to detect the yellow marker while ignoring other colors and noise. Once the camera was calibrated, we used a PID controller to ensure that the commanded throttle and steering were adequate.
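The steering correction can be sketched as a PID loop on the marker's horizontal offset in the camera frame. This is a minimal illustration only; the class name and gains below are placeholders, not our actual package code or tuned values.

```python
# Minimal PID controller sketch for steering toward a tracked marker.
# Gains passed to PID() are illustrative placeholders, not tuned values.

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        """Return a control output for the given error and time step."""
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: error is the marker's pixel offset from image center, normalized.
pid = PID(kp=0.8, ki=0.05, kd=0.1)
steering = pid.update(error=0.25, dt=0.05)
```

In practice the error term comes from the camera pipeline (the marker centroid's offset from the image center), and the output is clamped to the steering servo's range.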

Hardware Design

All of our casings and hardware parts were designed in Fusion 360 and SolidWorks, then 3D printed and laser cut, with slicing done in Ultimaker Cura. All of our 3D-printed parts are made of ABS. The parts we designed and 3D printed are: the base plate, Jetson enclosure, LiDAR case, base plate hinges, front support clip, and base plate mount.


Electrical Configuration

[Photo: electrical wiring of the robocar]

Camera Calibration

In this project, we had our OAK-D camera track a specific color (yellow) while ignoring everything else. To do this, we applied several image-processing techniques and tuned the HSV thresholds applied to the camera feed.


Drone Following Demo

Here is a link to a video demo of our robocar's steering actuator tracking and turning toward the yellow-marked drone: https://share.icloud.com/photos/04ahPf3Jm9uAeb8aea4yv5qZA

And here is a link to a video demo of our robocar driving and following the drone: https://share.icloud.com/photos/0d5v2bj7xPGO_Bngn2fP9eXRg

AI Behavioral Cloning

In addition to our final project, our robocar can also clone the driving behavior of any individual controlling it. First, a member of our team manually drove the robocar for several laps while the single-board Jetson computer collected data from multiple sources: throttle input, steering input, and visual feedback from the camera feed. With the collected data, we trained our TensorFlow model on the GPU cluster provided by the San Diego Supercomputer Center at UC San Diego. Here is a video demo of our robocar mimicking the driving behavior of one of our team members, who happens to be a very aggressive driver: https://drive.google.com/file/d/1NhaNB3OEp7JVk7FJAcT8OtJBJW8wO5t4/view?usp=drivesdk
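The data-collection step pairs each camera frame with the driver's throttle and steering at that instant. A minimal sketch of such a logger is below; the record format and field names are our illustration and may differ from the format our actual training pipeline used.

```python
import json
import time
from pathlib import Path

def log_record(out_dir, frame_id, steering, throttle, image_filename):
    """Append one training record pairing control inputs with a camera frame.

    Field names here are illustrative, not the actual pipeline's format.
    """
    record = {
        "frame_id": frame_id,
        "timestamp": time.time(),
        "steering": steering,   # normalized, e.g. [-1, 1]
        "throttle": throttle,   # normalized, e.g. [0, 1]
        "image": image_filename,
    }
    path = Path(out_dir) / f"record_{frame_id:06d}.json"
    path.write_text(json.dumps(record))
    return path
```

At training time, each record's image is the model input and the recorded steering/throttle pair is the regression target, so the network learns to reproduce the driver's control decisions from vision alone.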


Acknowledgments

We would like to show our appreciation for the amazing and very supportive teaching staff who made all of this happen!

  • Professor Jack Silberman
  • TA Ivan Ferrier
  • TA Dominic Nightingale