From MAE/ECE 148 - Introduction to Autonomous Vehicles

Team 4: Adaptive Cruise Control

Hussein Batteikh (ECE)

Jon Carlo Bergado Bibat (CE)

Haojin Chen (MAE)

David Qiao (MAE)

Physical Robot

Construction Design


The base plate uses M3 holes spaced 1.5 centimeters apart. Four curved edges around the sides of the plate connect to four 3D-printed legs, which give more room for the wiring and raise the camera, Jetson, and LiDAR for the best view and data collection. Slotted holes along the edges of the base plate allow easier wiring adjustments and let the Jetson Nano mount at the rear of the plate, leaving the front free for the LiDAR and camera mounts. The camera mount was designed as one curved leg connected to two adjustable arms that tilt the camera up and down to calibrate for the best viewing angle. The LiDAR mount attaches to the front of the plate with two M3 screws placed diagonally to support the weight of the LiDAR.

Electrical Schematic


Completed Car


Donkey 3 Autonomous Laps


ROS2 3 Autonomous Laps


Final Project

Gantt Chart



The goal is to program the robocar in ROS2 to detect, via LiDAR, the speed of a moving or stationary object in front of it and adjust its own speed accordingly to avoid collision while driving autonomously. The front of the camera view is also filtered out so that the object ahead does not confuse the training.


Vehicle will drive autonomously through lane tracing.

The vehicle’s central processing unit will detect the center of a lane from an image provided by an RGB camera.

The image will be processed through OpenCV where the center of the lane will be calculated.

Once the center of the lane is calculated, steering will be adjusted based on the error away from the vehicle’s center.
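The steering step above can be sketched as a small proportional controller. This is a minimal illustration, not the team's actual code: the function name, the gain `kp`, and the image width are all assumed for the example.

```python
def steering_from_centroid(centroid_x, image_width, kp=1.0):
    """Map the detected lane centroid to a steering command in [-1, 1].

    The error is normalized so that a centroid at the image edge
    produces a full-scale steering value of +/-1 (assumed convention).
    """
    half = image_width / 2.0
    error = (centroid_x - half) / half  # negative: lane center is left of vehicle center
    steering = kp * error
    return max(-1.0, min(1.0, steering))  # clamp to the actuator's input range
```

A centroid at the image center yields zero steering, and the command grows linearly with the offset until it saturates at the actuator limits.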

Vehicle will adjust throttle speed based on the distance to an object in front of it.

LiDAR will provide distance data for any object within the vehicle’s frontal view range.

As a frontal object gets closer, the throttle value will be reduced, down to a complete stop.

A first order filter was applied to the LiDAR readings to reduce noisy data for better stability.
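A first-order (exponential) low-pass filter like the one described can be sketched in a few lines; the class name and the choice of smoothing constant here are illustrative, not the team's calibrated values.

```python
class FirstOrderFilter:
    """First-order low-pass filter for noisy LiDAR range readings.

    alpha near 0 smooths heavily (slow response); alpha near 1
    tracks the raw measurement closely (little smoothing).
    """

    def __init__(self, alpha, initial=0.0):
        self.alpha = alpha
        self.state = initial

    def update(self, measurement):
        # Blend the new measurement with the previous filtered value.
        self.state = self.alpha * measurement + (1.0 - self.alpha) * self.state
        return self.state
```

Feeding each new LiDAR reading through `update()` yields a smoothed distance, which keeps the throttle from fluctuating with every noisy scan.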

ROS2 Implementation

Overview: We modified the lane guidance node from the lane detection package in order to receive data points from the LiDAR and use them to adjust the throttle and keep a certain distance between our car and any object it detects in front of it. We used an algorithm that adjusts the throttle accordingly and added a first order filter to reduce the noise and keep the throttle more stable, otherwise it would keep fluctuating and not keep the throttle at a certain level.

Constants and imports.JPG

We added the math module to make some calculations for our scheduling function and the LaserScan message type to read in values from the LiDAR. We also added some new constants (which could be turned into ROS parameters); these are used to calibrate the throttle scheduling as well as our filter.

New instance variables.JPG

We also added some new instance variables that we use to set a limit on the throttle based on the calibrated zero throttle as well as set the slope for our scheduling function.

Subscribe code.JPG

For the cruise control node, we published to the actuator topic and subscribed to both the centroid and scan topics, the latter of which is used to take in the LiDAR values. The LiDAR subscriber also calls the cruise_controller callback function every time new values are received.

Cruise control code.JPG

This contains the code for the cruise_controller callback function, which takes the LiDAR data within a certain angular range in front of the device, implements the scheduling function as if-statement logic, and publishes the resulting throttle to the actuator topic.
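The if-statement scheduling logic can be sketched as a piecewise-linear throttle schedule. All the constants below (distances, throttle limits) are assumed placeholder values, not the team's calibrated parameters.

```python
STOP_DISTANCE = 0.3    # meters: stop completely inside this range (assumed value)
FOLLOW_DISTANCE = 1.5  # meters: begin slowing inside this range (assumed value)
MAX_THROTTLE = 0.35    # full cruising throttle (assumed calibration)
ZERO_THROTTLE = 0.0    # calibrated zero / stopped throttle

def throttle_schedule(distance):
    """Piecewise-linear throttle schedule from the filtered LiDAR distance."""
    if distance <= STOP_DISTANCE:
        # Object too close: full stop.
        return ZERO_THROTTLE
    if distance < FOLLOW_DISTANCE:
        # Ramp linearly from zero throttle at STOP_DISTANCE
        # up to MAX_THROTTLE at FOLLOW_DISTANCE.
        slope = (MAX_THROTTLE - ZERO_THROTTLE) / (FOLLOW_DISTANCE - STOP_DISTANCE)
        return ZERO_THROTTLE + slope * (distance - STOP_DISTANCE)
    # Nothing close ahead: cruise at full throttle.
    return MAX_THROTTLE
```

In the actual node, the callback would run the filtered minimum range through a schedule like this and publish the result to the actuator topic.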

Throttle Schedule Algorithm



Demonstration Video



Acknowledgements

Special thanks to Professor Jack Silberman, TAs Dominic and Ivan, and the ECE Makerspace.