2020FallTeam2

From MAE/ECE 148 - Introduction to Autonomous Vehicles

Project Team Members

  • Zhaobin Huang – Electrical Engineering (MS)
  • Charles Roman – Mechanical Engineering (BS)
  • Owen Cruise – Mechanical Engineering (BS)

Team 2 Photo.JPG

  Hands-on engineering...6 feet apart

Project Overview

In this project we were tasked with navigating an outdoor closed-loop track with a 1/10-scale remote-control car. Using a custom-developed Robot Operating System (ROS) package, we programmed our RC car to complete 5 autonomous laps around the track. Our first step was to calibrate a working model with a sample computer vision (CV) client built on OpenCV and to tune its behavior with proportional-integral-derivative (PID) control. We then developed and documented a ROS package for the car that used this model to reproduce the successful autonomous behavior. This project taught us the basics of computer vision, among other skills, while we applied our newly developed ROS skills to complete the autonomous laps.

"Must-Have" Project Requirements

  • Target/line following using OpenCV code
  • Integrated ROS package to implement autonomous laps
  • Image processing using the connected USB webcam for sensing and steering/throttle actuation
  • Fully autonomous track laps with no manual user input

Hardware

Mechanical Design

20FA-group2-camera mount.png

  Figure 1: Camera mount designed in SolidWorks

Our camera mount design was inspired by some of the relatively simple designs devised in prior MAE 148 projects. The camera fastens directly to the mount with 4 small bolts, and the lens sits safely inside a designed pocket that protects it from damage in the unfortunate event of a collision. The mount is rigid and only allows the camera to look straight ahead, but it proved to work well after some testing (no angular adjustment was needed). The mount was 3D printed in ABS.


Acrylic Plate.png

  Figure 2: Electronics mounting plate designed in SolidWorks

The mounting plate is constrained at 4 separate posts present on the RC car chassis and provides a secure platform to mount all of the electronic devices, the camera mount, as well as our on-board computer. The plate contains several different hole settings for the camera mount to allow its position to be adjusted to anywhere along the platform. Additionally, there is a long cut-out down the middle to allow for efficient wire routing. The plate was laser cut out of acrylic.


Cover image.JPG

  Figure 3: Jetson Nano protective case, 3D model taken from GrabCAD

The Jetson Nano protective case design was borrowed from GrabCAD and was printed by the Design Studio staff. This case is necessary to protect the on-board computer from any potential damage that could occur from a collision or troublesome debris. It offers this protection while still providing access to the onboard ports (USB, ethernet, 12 VDC power, etc).

Electronics

20FA-Group2-Schematic.png

  Figure 4: Circuit Schematic of electronic components on-board our robot

Included in the schematic above are all of the electronic components required for the successful and safe functioning of the robot. These include:

  • Jetson Nano with micro-SD card and fan (car's main computer)
  • Pulse-width modulation (PWM) board, equipped with a servo for electronic speed control (ESC)
  • 1080p USB webcam (forward-facing image detection)
  • Webcam w/ audio sensor (rear-facing view)
  • Mini light-emitting diode (LED) board with red and blue LEDs (indicates whether emergency relay switch is active or inactive)
  • Voltage checker/current indicator (alarm for low battery life)
  • Emergency relay switch (wireless link to PWM board for emergency stops)
  • Battery (11.1-volt, 3000-mAh from Turnigy)
  • "Buck" DC voltage regulator

Assembly

Team 2 Vehicle Lab.JPG

  Figure 5: Complete Assembled Autonomous RC Car

Here is a complete look at the final version of our autonomous robot.

Software

OpenCV Implementation

Our simple OpenCV racer is based on the simple_cv_racer code listed in the References section. Here are some of the modifications we made to adapt it to our robot:

  • Fix a NoneType error
  • Change the upper and lower HSV thresholds for detecting the yellow lines
  • Calibrate steering and throttle
  • Keep a uniform throttle to make the car more stable
  • Use OpenCV to read images and adafruit-circuitpython-servokit to control the PWM board instead of the Donkey Car framework, so the code could later fit into ROS
  • Adjust the detection window to adequately detect the yellow lines at the track
  • Keep the default PID parameters, which worked well at low speed

Some early HSV experimentation:

20fa-Group2-opencv test.gif

ROS Implementation

20FA-Group2-ROS racer package2.png

  Figure 6: ROS_racer schematic

Our ROS racer is also based on the simple_cv_racer, as seen in the References section. Instead of using the DonkeyCar framework, we now use our own developed ROS package. We started by separating the task into 3 sub-tasks:
1. Read the image from the camera: ROS_racer

  • Read the image from camera with OpenCV


2. Process the image with OpenCV: ROS_racer

  • Crop the image to avoid noise
  • Use HSV thresholding to get a mask of yellow pixels
  • Keep track of the column with the most yellow pixels
  • Generate the steering signal with PD control
  • Publish the throttle and steering signals to the /throttle and /steering topics


3. Send the motion control signals to the PWM board: Throttle_subscriber and Steering_subscriber

  • Subscribe to the /throttle and /steering topics and send PWM signals to control the RC car using the adafruit-servokit package
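The subscriber side of sub-task 3 can be sketched as below. The channel numbers, the 0–180 degree servo range, and the helper names are assumptions for illustration; the rospy and ServoKit calls are shown as comments so the command-mapping logic stays the testable core:

```python
# Sketch of the Steering_subscriber / Throttle_subscriber logic.
# Assumed convention: commands arrive on /steering and /throttle as
# floats in [-1, 1] and are mapped to the PWM board via adafruit-servokit.

def steering_to_angle(steering, center=90.0, span=90.0):
    """Map a steering command in [-1, 1] to a servo angle in degrees."""
    steering = max(-1.0, min(1.0, steering))
    return center + steering * span / 2    # -1 -> 45 deg, +1 -> 135 deg

def throttle_to_duty(throttle):
    """Clamp a throttle command into the ESC's [-1, 1] throttle range."""
    return max(-1.0, min(1.0, throttle))

# In the real nodes (hardware required):
#   import rospy
#   from std_msgs.msg import Float64
#   from adafruit_servokit import ServoKit
#
#   kit = ServoKit(channels=16)            # PCA9685 PWM board
#
#   def on_steering(msg):
#       kit.servo[1].angle = steering_to_angle(msg.data)       # channel 1 assumed
#
#   def on_throttle(msg):
#       kit.continuous_servo[0].throttle = throttle_to_duty(msg.data)  # channel 0 assumed
#
#   rospy.init_node("steering_subscriber")
#   rospy.Subscriber("/steering", Float64, on_steering)
#   rospy.Subscriber("/throttle", Float64, on_throttle)
#   rospy.spin()
```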

Milestones

DonkeyAI Initial Concept Generation and Early Testing

DonkeyAI autonomous laps

These laps were performed at the EBUII track. Here is a view from our webcam showing its color-matching algorithm. Note that this is an early revision; the parameters were later adjusted.

Autonomous Driving (OpenCV)

OpenCV line following

Here is a video showing our OpenCV line following in the Warren Mall tent.

Autonomous Driving (ROS)

ROS autonomous laps

Here is a video showing our car performing five autonomous laps using ROS architecture to process images fed by the webcam and convert those images to steering and throttle inputs.

Future Goals and Suggestions

What we would have done if we had had more time

20FA-Group2-ROS racer package.png

  Figure 7: Original ROS_racer schematic

We ran into a problem: cv_bridge 1.13 for ROS Melodic does not support Python 3 with OpenCV 4, so we could not convert a ROS Image message to a cv2 image. We did not have enough time to solve this problem.
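One common workaround we did not get to try is bypassing cv_bridge entirely: for simple encodings, the raw bytes of a sensor_msgs/Image can be reinterpreted as a numpy array directly. The sketch below handles only the bgr8 encoding and assumes the row stride equals width × 3 (the real message's step field can differ):

```python
import numpy as np

def imgmsg_to_cv2(msg):
    """Convert a sensor_msgs/Image (bgr8) to a numpy array without cv_bridge.

    A common Python 3 workaround when cv_bridge is unavailable: reinterpret
    the raw message buffer.  Only 'bgr8' is handled here, and we assume
    msg.step == msg.width * 3 (no row padding).
    """
    if msg.encoding != "bgr8":
        raise ValueError("unsupported encoding: %s" % msg.encoding)
    img = np.frombuffer(msg.data, dtype=np.uint8)
    return img.reshape(msg.height, msg.width, 3)
```

The resulting array can be fed straight into the OpenCV processing step, so the original single-node ROS_racer design in Figure 7 would work without the cv_bridge dependency.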

If We Had a "Redo" Button

  • With more time and access to the lab we would have attempted to develop software to allow for object tracking.
  • We would have liked to essentially set up an intelligent cruise control that allowed for our car to accurately "trail" a leading car at a preset distance.

Our Advice

  • Our best advice for future teams is to start troubleshooting early. You cannot expect that everything will run fine the first time you test your programs, even if they are structured correctly and make logical sense. There will almost always be things you cannot account for, whether that is outdated or missing software (e.g., missing libraries, packages, or older versions of software like Python) or changes in the hardware configuration (e.g., needing to re-calibrate the PWM board when using a battery with a different level of charge).

ROS is a large subject area, and in this class we learned some of its basics. Looking ahead, we can do a deep dive into an area of interest within ROS. The ROS wiki was very useful while we were learning and debugging.

The class was run quite well despite the impact of COVID-19. We only hope that future years of this class get to enjoy a full in-classroom experience.

Acknowledgments

Special thanks to Professor Jack Silberman, Professor Maurício de Oliveira, and the UC San Diego MAE department for allowing us to gain this valuable project experience as safely and as efficiently as possible given the current circumstances. We thank you immensely for the effort you put in this Fall quarter to allow us to learn, develop, and grow before heading out into industry.

References

Jetson Nano protective case: [[1]]

Simple OpenCV racer: [[2]]

Finding correct upper and lower HSV thresholds: [[3]]

cv_bridge ROS package: [[4]]