From MAE/ECE 148 - Introduction to Autonomous Vehicles

Truck Detector

A 1/10-scale autonomous vehicle that detects trucks using OpenCV. Upon detection, it initiates an emergency brake, plays an alarm, and pauses the autopilot.

Truck detecting3.jpg

Autonomous Driving Demo


Team Members


Tesla vehicles have a history of fatal autopilot accidents caused by a failure to recognize semi-trucks: the trucks go undetected by the autopilot's image recognition system, resulting in head-on collisions. Motivated by these accidents, our proposed Vehicle Identification & Alarm System identifies vehicles along the track and, through an added speaker, announces the vehicle type: "Semi Truck Detected," "Bus Detected," "SUV Detected," "Sedan Detected," etc. The purpose of this project is to use ROS2 (Robot Operating System 2) and Docker containers to develop the software and integrate it with donkey car hardware, such as the NVIDIA Jetson Nano and OAK-D camera, to enable autonomous driving. We also used YOLO (You Only Look Once), a real-time object recognition algorithm, to classify and localize objects in a single frame.
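The announcement step amounts to a lookup from the detector's class label to the phrase sent to the speaker. A minimal Python sketch; the label set (COCO names) and the exact phrases here are illustrative stand-ins, not the car's actual code:

```python
# Illustrative mapping from detector class labels to spoken phrases.
# The labels follow the COCO names; the phrases match the announcements
# described above, but this is a sketch rather than the exact code.
ANNOUNCEMENTS = {
    "truck": "Semi Truck Detected",
    "bus": "Bus Detected",
    "car": "Sedan Detected",
    "motorbike": "Motorbike Detected",
}

def announcement_for(label):
    """Return the phrase to announce for a detected class, or None to stay silent."""
    return ANNOUNCEMENTS.get(label)
```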

Our car uses computer vision with ROS2 to detect and follow the yellow lane line on its own, while simultaneously classifying the vehicles it detects along the way. Once a vehicle is detected, the speaker announces the vehicle type. To multitask, we used two cameras: one at the front (OAK-D) for lane detection and autonomous driving, and one on the side (USB webcam) for OpenCV image processing to classify vehicles.
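The two-camera multitasking can be illustrated with a small concurrency sketch. On the car this is handled by separate ROS2 nodes; the plain-Python threads and the generic drive/classify handlers below are illustrative assumptions, not the actual node code:

```python
import threading

def run_dual_pipelines(front_frames, side_frames, drive, classify):
    """Process the front (lane-following) and side (classification)
    camera streams concurrently.

    front_frames / side_frames stand in for the OAK-D and USB webcam
    streams; drive and classify are the per-frame handlers.
    """
    def worker(frames, handler):
        for frame in frames:
            handler(frame)

    threads = [
        threading.Thread(target=worker, args=(front_frames, drive)),
        threading.Thread(target=worker, args=(side_frames, classify)),
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```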

Gantt Chart

Gantt chart.png


CAD Designs

Camera Mount

This design is meant to hold a camera between the arms and LiDAR on top.
pic1CameraMount Back.jpg

Base Plate

BasePlate v2.jpg

Jetson Nano Case


Completed Build



List of Components

  1. Traxxas Chassis
  2. NVIDIA Jetson Nano
  3. OAK-D Stereo Camera
    • A camera is installed at the front of our robot to detect lanes and drive autonomously.
  4. USB Webcam
    • A camera installed on the side of our robot captures images of vehicles for classification.
  5. Speaker w/ AUX input
  6. USB Audio Adapter
  7. USB Hub
  8. LD06 LiDAR
    • LiDAR - "Light Detection and Ranging" - performs laser scanning by emitting light pulses and measuring the time for the reflected light to return to the receiver, allowing it to detect the surfaces of objects within range.
  9. LiDAR Controller
    • Receives data from the LiDAR and sends it to the Jetson Nano.
  10. VESC (Vedder's Electronic Speed Controller)
  11. UBEC (DC/DC converter)
  12. 3-Cell LIPO Battery
    • Provides electrical power
  13. Anti-Spark Switch
  14. Servo Motor
    • Electromechanical actuator that sets the steering angle in response to the input control signal.
  15. XeRun 3660 G2 Brushless DC Motor
    • Converts DC electrical power into the mechanical energy that drives the car.
  16. Power Switch
  17. Battery Voltage Checker
    • Sounds an alarm when the battery voltage is low.
  18. DC barrel to XT30 Connector
  19. XT60 Y-splitter
  20. Logitech F710 Wireless Gamepad


Our autonomous vehicle utilizes the YOLO (You Only Look Once) algorithm, which performs real-time object detection by classifying and localizing objects in a single frame. YOLO divides each image into an NxN grid; each cell is processed by the model and assigned a probability of belonging to each class. Our model is trained on the COCO dataset, from which we use four road-vehicle classes: {car, motorbike, bus, truck}. A Deep Neural Network (DNN) is loaded with a model trained on 320x320 images from the COCO dataset, along with its pre-trained weights. Once an image is read from a video frame, the postProcess() method detects objects from the network output. Our project followed this tutorial: https://techvidvan.com/tutorials/opencv-vehicle-detection-classification-counting/
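The postProcess() step from the tutorial can be sketched as follows: filter the raw YOLO output rows by class confidence, then apply non-maximum suppression to merge overlapping boxes. This is a simplified pure-Python version (the real pipeline would use cv2.dnn.NMSBoxes); the row layout (center-x, center-y, width, height, objectness, per-class scores, all normalized to [0, 1]) and the threshold values are assumptions based on the standard OpenCV-DNN YOLO pipeline, not our exact code:

```python
CLASSES = ["car", "motorbike", "bus", "truck"]  # road-vehicle subset of COCO

CONF_THRESHOLD = 0.5   # minimum class confidence to keep a detection
NMS_THRESHOLD = 0.4    # IoU above which overlapping boxes are suppressed

def iou(a, b):
    """Intersection-over-union of two (x, y, w, h) boxes."""
    ix = max(0, min(a[0] + a[2], b[0] + b[2]) - max(a[0], b[0]))
    iy = max(0, min(a[1] + a[3], b[1] + b[3]) - max(a[1], b[1]))
    inter = ix * iy
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union else 0.0

def post_process(rows, frame_w, frame_h):
    """Filter raw YOLO rows by confidence, then apply greedy NMS.

    Each row is (cx, cy, w, h, objectness, *class_scores) with normalized
    coordinates. Returns a list of (label, confidence, box) tuples.
    """
    detections = []
    for cx, cy, w, h, _obj, *scores in rows:
        cls = max(range(len(scores)), key=lambda i: scores[i])
        conf = scores[cls]
        if conf < CONF_THRESHOLD:
            continue
        box = (int((cx - w / 2) * frame_w), int((cy - h / 2) * frame_h),
               int(w * frame_w), int(h * frame_h))
        detections.append((CLASSES[cls], conf, box))
    # Greedy NMS: keep the highest-confidence box, drop boxes overlapping it.
    detections.sort(key=lambda d: d[1], reverse=True)
    kept = []
    for det in detections:
        if all(iou(det[2], k[2]) <= NMS_THRESHOLD for k in kept):
            kept.append(det)
    return kept
```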

Yolo trucks.png

Code and all file resources can be found within our project's GitHub repository:


Flow Chart

Flow chart v1.png

Final Presentation


Plans for Version 2.0

  • Switch to a codebase tailored to the Jetson Nano to resolve webcam issues.
  • Adjust the sampling rate for incoming video.
  • Modify the realTime() method to call the pauseMotor() and playAudio() methods when a bus is detected instead of counting all the occurrences of road-vehicles.
  • Add hinge design and increase acrylic thickness for the base plate.
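The planned realTime() change can be sketched as a small dispatcher: instead of counting all road-vehicle occurrences, trigger the motor pause and alarm as soon as a target class appears in a frame. The function and callback names (handle_detections, pause_motor, play_audio) are hypothetical stand-ins for the actual VESC and speaker calls:

```python
TARGET_CLASSES = {"bus", "truck"}  # classes that should trigger the alarm

def handle_detections(labels, pause_motor, play_audio):
    """Planned realTime() behavior for a single frame.

    labels is the list of class names detected in the current frame;
    pause_motor and play_audio are stand-ins for the real VESC pause
    and speaker playback calls. Returns True if the alarm was triggered.
    """
    hit = next((l for l in labels if l in TARGET_CLASSES), None)
    if hit is not None:
        pause_motor()
        play_audio(f"{hit.capitalize()} Detected")
        return True
    return False
```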
Acknowledgements

Special thanks to all those who made this project possible:

  • Professor Jack Silberman
  • TA Ivan Ferrier
  • TA Dominic Nightingale
  • ECE Makerspace
  • TechVidvan