From MAE/ECE 148 - Introduction to Autonomous Vehicles
Revision as of 14:12, 10 June 2021 by 2021SpringTeam6 (OpenCV)

Project Introduction

While working with the Donkey car and the ROS car, we noticed a shortcoming that would be a major problem for an actual passenger vehicle: neither platform has object avoidance or the ability to stop during an emergency. More expensive sensors, such as LIDAR, exist for implementing object avoidance, but we believed a less expensive solution was possible. To implement object avoidance on our car, we chose the inexpensive HC-SR04 ultrasonic sensor.
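As a sketch of the idea (not the team's exact code): the HC-SR04 measures distance by timing an echo pulse. Sound travels at roughly 343 m/s, and the pulse covers the round trip to the obstacle and back, so distance in cm is duration × 34300 / 2. The stop threshold below is an illustrative assumption, not the team's tuned value.

```python
# Hedged sketch: convert an HC-SR04 echo pulse duration to a distance
# and decide whether to cut the throttle.
SPEED_OF_SOUND_CM_S = 34300  # ~343 m/s at room temperature

def pulse_to_cm(echo_duration_s):
    """Obstacle distance in cm; halve the round-trip travel distance."""
    return echo_duration_s * SPEED_OF_SOUND_CM_S / 2

def should_emergency_stop(echo_duration_s, threshold_cm=30.0):
    """True if the obstacle is closer than the (assumed) stop threshold."""
    return pulse_to_cm(echo_duration_s) < threshold_cm
```

On the Jetson Nano, the trigger pin would be pulsed high for about 10 µs and the echo pulse width timed via GPIO; the sensor's 5 V echo line must first be conditioned down to the Nano's 3.3 V logic level, which is the role of the voltage divider in our circuit.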

Project Team Members

  • Dylan Perlson (ECE)
  • Harrison Lew (MAE)
  • Joshua Guy (MAE)
  • Arman Mansourian (CSE)



  • NVIDIA Jetson Nano
  • Camera (get specific)
  • Adafruit PWM Controller
  • Electronic Speed Controller
  • 12V-5V DC-DC voltage converter
  • Throttle DC motor
  • Steering servo motor
  • 11.1V 3S LiPo battery
  • Voltage sensor
  • Remote relay
  • HC-SR04 ultrasonic sensor

Physical Design

Wiring Schematic

We used the wiring schematic from Fall 2020 Team 2 to get a head start on our wiring, which gave us time to implement our additional components. Our final circuit was similar to theirs, except that it included the ultrasonic sensor, along with a voltage divider to condition the sensor's 5 V echo signal to a level readable by the Jetson Nano's 3.3 V GPIO. (insert diagram here)
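To illustrate how the divider conditions the echo signal (the resistor values below are assumptions for the example, not necessarily the ones we used), two resistors scale the 5 V pulse down below the Jetson Nano's 3.3 V logic level:

```python
# Hedged sketch of sizing the voltage divider on the HC-SR04 echo line.
# V_out = V_in * R2 / (R1 + R2) for an unloaded resistive divider.
def divider_out(v_in, r1_ohms, r2_ohms):
    """Voltage across R2 when v_in is applied across R1 + R2 in series."""
    return v_in * r2_ohms / (r1_ohms + r2_ohms)
```

For example, with the assumed values R1 = 1 kΩ and R2 = 1.5 kΩ, a 5 V echo pulse appears as 3.0 V at the GPIO pin, comfortably under the 3.3 V limit.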



OpenCV

For OpenCV, we had to modify several parameters for our implementation. We calibrated the PWM values for the throttle motor and the steering servo, and found control values that achieved the turning performance we wanted. We also calibrated the color filters and the minimum/maximum width of line detection. One persistent issue was the changing light conditions in the tent where testing was done, which meant the color calibration had to be adjusted constantly.

ROS (Robot Operating System)




Further Improvements