From MAE/ECE 148 - Introduction to Autonomous Vehicles


Team Members

For our final project, we joined with Team 5 to create a team of four, which we called Team 5.5. We built everything separately except for the vehicle-to-vehicle communication final project, so you can find much of the same final project info on their page as well.

Team 5:

Zoey Huang - Aerospace Engineering

Jason Almeda - Computer Engineering

Team 6:

Liam Fowler - Mechanical Engineering

Johannes Diehm - UCSD Extension

Liam (left) and Johannes (right)


Our robot design needed to incorporate mounts for multiple systems: the IMU, the lidar, the camera, and the Jetson case. Our guiding design principle was easy access to internal cables and wiring, as well as protection from the frequent crashes that we foresaw and that prior teams had warned us about.

Robot Assembly.jpg

We designed our electronics plate to allow for adjustability of our many components, to be well protected by the foam front bumper of the car, and to leave room for error and uncertainty in the mounting holes of our various components. The electronics plate was the only component laser cut in the ECE Makerspace. All other components were 3D printed.

Electronics Mount.jpg

We designed our camera mount to be highly adjustable, yet strong enough to withstand a direct crash into a chair or table (a common issue experienced by past teams). For the same reason, we included a front plate to protect the camera breakout board.

Camera Mount Assembly.jpg

We sourced our CAD for the Jetson case from Thingiverse and modified the bottom piece to create holes we could use to attach it to the electronics plate.


Our IMU and Lidar Mounts were relatively simple and relied on force-threading the soft PLA 3D prints with M2 machine screws in order to create tapped holes.

IMU Mount (left) and Lidar Mount (right)

Lastly, after witnessing hard collisions with classroom obstacles (chairs, desks, etc.), we decided to create additional protection for our lidar and electronics plate. We did this by 3D printing a second bumper mounted directly to the electronics plate. When a load is applied, the bumper bends backward, absorbing the impact energy. The filaments of the 3D print are also aligned to maximize the bending strength of the part.

Secondary Bumper.jpg

Circuit Design

Donkey Car Laps

Donkey Car Laps, Twisty Track

Donkey Car Laps, Classroom Track

ROS Laps


EMO Demonstration


Our Project

Our goal was to develop a system of ROS nodes to enable vehicle-to-vehicle communication between two cars. We took inspiration for our project from the "black ice" phenomenon, whereby cars in icy conditions slide out and crash due to difficult-to-see ice on the road. By allowing cars to communicate with one another, a car involved in a crash can warn the cars behind it that there is an issue on the road ahead, allowing them to slow down or stop to avoid the hazard.

Our project consists of two cars: the lead car and the follower car. If the lead car crashes, it publishes a message to the follower car, causing the follower to stop and preventing it from crashing into the same obstacle. To sense the crash, we use the accelerometer on an OpenLog Artemis IMU in the lead car, which we initialized by adding the device when creating the Docker container with the robot image. If a specific deceleration threshold is exceeded (e.g. 10 milli-g), the lead car publishes a message to the connected following car. When the follower car receives this specific message, it stops immediately.
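Stripped of the ROS plumbing, the crash detection reduces to a threshold comparison with a latch: once the deceleration crosses the threshold, the state stays "stopped". A plain-Python sketch of this logic (the threshold value here is illustrative):

```python
ACCEL_THRESHOLD = -10.0  # deceleration threshold (illustrative value)

def crash_state(accel: float, history: str) -> str:
    """Return 'stopped' once a crash-level deceleration is seen; latch thereafter."""
    if accel < ACCEL_THRESHOLD or history == 'stopped':
        return 'stopped'
    return history  # still 'running'
```

Latching matters because the spike only lasts a few sensor samples, but the follower must keep seeing the stop command on every publish cycle.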

In order to prevent hackers from slowing down or stopping the follower car without a road hazard being present, we also include a unique security string published by the lead car and subscribed to by the follower car. If the security string being sent to the follower car before a stop command does not match the expected security string stored on the follower car, then the follower car will not accept a message to stop, and will continue to move around the track.
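The gating described above is a simple string comparison applied before any stop command is honored; a minimal sketch (function name is illustrative, the code string matches our nodes):

```python
SECURITY_CODE = 'I am Autonomous Prime, Autobots, roll out!'

def should_stop(received_code: str, command: str) -> bool:
    """Honor a stop command only when the attached security code matches."""
    return received_code == SECURITY_CODE and command == 'stopped'
```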

In our case, we set the lead car as the "master" roscore and connected the follower car to it, so that both cars use the same roscore. Both cars run many of the same nodes, clients, and servers described in Dominic Nightingale's Simple ROS framework. However, to ensure that published and subscribed topics do not cross cars (e.g. the camera on one car being used by the lane guidance on the other), we gave each topic with the potential to cross over a new unique identifier.
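One lightweight way to build those unique identifiers on a shared roscore is to prefix every per-car topic with a car ID; a hypothetical helper sketching the idea (not the exact naming scheme we used):

```python
def car_topic(car_id: str, topic: str) -> str:
    """Prefix a topic name with a per-car identifier so lead and follower topics don't collide."""
    return f"/{car_id}/{topic.lstrip('/')}"
```

With this, the lead car would publish on e.g. `/lead/camera` while the follower's lane guidance subscribes only to `/follower/camera`, so the topics never cross.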

Gantt Chart

Pasted Graphic.jpg

Our Code

You can find the GitHub repository to our code here: https://github.com/LiamEFowler/ECE-MAE-148-Group5.5-Black-Ice

In summary, we added a talker node on the lead car, a listener node on the follower car, and the corresponding launch files. In the talker node, we implemented a module that reads the output of the IMU and filters it for the axis of acceleration corresponding to frontal deceleration, so that crashes are detected while acceleration from road vibrations or cornering (centripetal acceleration) is ignored. After comparing this acceleration value to one expected in a crash (similar to how an airbag triggers), the talker node publishes the security code along with a message to continue running or, if the threshold was met, to stop. In the listener node we implemented a subscriber that reads the security code and the stop/continue message. If the security code is correct and a stop command was sent from the talker node, the listener node publishes a stop command to the lane guidance node. When the lane guidance node receives the stop command, it publishes zero throttle to the ESC, stopping the follower car.

talker_node.py

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import rospy
import serial
import math
import time
from std_msgs.msg import String
security = 'I am Autonomous Prime, Autobots, roll out!' # Security code so the follower can verify the sender
CAMERA_FREQUENCY = 10 # in Hz, set sensing rate
accelerometerThreshold = -10.0 # set deceleration threshold in g where the stop command is sent
history = 'running' # Initialize history string
ser = serial.Serial('/dev/ttyUSB0', 115200,
  timeout=1) # Open Serial Port to the OLA (example port and baud rate; adjust for your setup)
time.sleep(5) # Wait 5 seconds for serial port to initialize
# Read Data from IMU
def callback(ser):
  line = ser.readline()
  line = str(line, 'utf-8', errors='ignore')
  splitLine = line.split(',')
  a = 0.0
  if len(splitLine) >= 2: # OLA logs comma-separated values; acceleration is the second field
    try:
      a = float(splitLine[1])
    except ValueError:
      pass # skip malformed lines
  rospy.loginfo(f"/Acceleration: {a}")
  return a
# Publish security and stop commands
def talker(history, ser):
  rospy.init_node('talker_node', anonymous=True)
  pubSecurity = rospy.Publisher('SecurityCode', String, queue_size=10)
  pubCommand = rospy.Publisher('CrashAlert', String, queue_size=10)
  rate = rospy.Rate(CAMERA_FREQUENCY)
  while not rospy.is_shutdown():
    acceleration = callback(ser)
    # Detect if there is a crash or slowdown; once stopped, stay stopped
    if acceleration < accelerometerThreshold or history == 'stopped':
      history = 'stopped'
    # Publish the security code so the follower can confirm the sender, then the command
    pubSecurity.publish(security)
    pubCommand.publish(history)
    rospy.loginfo(f"/Command sent: {history}")
    rospy.loginfo(f"/Acceleration: {acceleration}")
    rate.sleep()

if __name__ == '__main__':
  try:
    talker(history, ser)
  except rospy.ROSInterruptException:
    pass


listener_node.py

#!/usr/bin/env python3
import rospy
from std_msgs.msg import String
security = 'I am Autonomous Prime, Autobots, roll out!' # Expected security code from the lead car
sec = None
alert = None
stopSignal = 'continue'
def callbackSec(data):
    global sec
    sec = data.data
    rospy.loginfo(rospy.get_caller_id() + "%s", data.data)
def callbackAlert(data):
    global alert
    alert = data.data
    rospy.loginfo(rospy.get_caller_id() + "%s", data.data)
def stop():
    global stopSignal
    stopSignal = 'stop' # read by lane_guidance_node, which publishes zero throttle
def listener():
    rospy.init_node('listener', anonymous=True)
    rospy.Subscriber("SecurityCode", String, callbackSec)
    rospy.Subscriber("CrashAlert", String, callbackAlert)
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        # Only act on a stop alert if the security code matches
        if sec == security and alert == 'stopped':
            stop()
        rate.sleep()
if __name__ == '__main__':
    try:
        listener()
    except rospy.ROSInterruptException:
        pass

Files Modified:

lane_guidance_node.py (follower car only)

ros_racer_launch.launch (both cars)

Weekly Presentations

Final Proposal Slides

11/23 Status Update

12/2 Status Update

Final Presentation

Final Presentation Slides


Results

We were able to read our IMU, filter the output for the axis we want (frontal deceleration), and publish the values. To do this, we set up talker_node.py and its corresponding launch file in ROS1 from scratch. We were also able to connect our follower car to the network and read the values published by the lead car. We programmed a listener node and its corresponding launch file to read those values and call a newly developed function in the lane_guidance_node that stops the car. We also implemented a security mechanism for the communication: in the listener node, the stop method is only called if a specific string is attached to the published message, and we were able to publish and receive this message. Unfortunately, we were not able to fully test our system and get rid of the last bugs. We are confident that our mechanism for stopping the follower car after a crash of the lead car works in principle, but in the end, even after working all night, we did not have enough time to finish. Our final issue was conflicting file names that stopped the follower car from working: we found out too late that nodes are not allowed to share names between the lead and follower cars. With a little more time, we could have checked all of our nodes and topics and renamed them; because of this, we were unable to test the complete system.

Further Development

Currently, the two cars are connected via ssh on the same Wi-Fi network (UCSDRoboCar5GHz). Since this only works while both cars are on that network, a next step would be to implement an off-network communication system using ESP32s connected to each car's Jetson Nano SBC.
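For such an ESP32 serial link, the crash alert and security code could be framed as newline-delimited JSON; a minimal sketch, assuming each Jetson talks to its ESP32 over a serial port (field names here are hypothetical):

```python
import json

def frame_alert(security: str, command: str) -> bytes:
    """Pack the security code and crash command into one newline-terminated JSON frame."""
    return (json.dumps({'sec': security, 'cmd': command}) + '\n').encode('utf-8')

def parse_alert(frame: bytes):
    """Recover (security, command) from a received frame."""
    msg = json.loads(frame.decode('utf-8'))
    return msg['sec'], msg['cmd']
```

Newline framing keeps the receiver simple: the ESP32 (or the Jetson on the other end) can accumulate bytes until `\n` and decode one complete message at a time.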

The system is also not particularly robust due to inconsistencies reading the serial data from the IMU and running the ROS packages. In the future, it would be helpful to make this system more reliable and easier to use.


Instructor - Prof. Jack Silberman

TA - Dominic Nightingale

TA - Haoru Xue

ECE Makerspace

Resources Used for Final Project

IMU Hookup Guide

Connecting Multiple ROS Machines

Rospy Tutorial

Pyserial Library Documentation

Jetson Nano Case (case used slightly modified from this one)

And countless Google queries for debugging...