From MAE/ECE 148 - Introduction to Autonomous Vehicles

Meet the Misfits - Team 4

IMG 3775.jpg

From Left to Right: Felipe Ramos - Mechanical Engineering, Esther Grossman - Electrical & Computer Engineering, Paul Bae - Aerospace Engineering

Misfit Car

IMG 3707.jpg


Mechanical Design

Mount Plate


Camera Mount

Jetson Nano Case


Chassis Hinge

Hood Bumper




Lidar Housing



Circuit Design

S21 team4 circuit.png

DonkeyCar Autonomous Racing

{{#evu:https://www.youtube.com/watch?v=aVhgmDBfa-E?rel=0 |alignment=left }}

The footage here shows three autonomous laps using DonkeyCar, driven by a model trained on datasets collected for different purposes. The foundation of the model was data focused solely on following the yellow line, which meant the racing line was not always smooth. To address this, we mixed in data in which the racing line was prioritized, so the model learned to take a path that favored stability while still following the yellow line.

OpenCV Autonomous Racing

The objective of our next autonomous racing model was to perform laps around the same track using Python OpenCV segmentation. The process ran under ROS Noetic in a Docker container created by Dominic.

We first isolated the white lanes of the track, intending to do lane guidance by following the center of all the white lines isolated in the images. We also included the dashed yellow center line; since it falls in the center of the track, it provided additional guidance data. The algorithm identified the white and yellow contours within a pre-sized field of vision, drew bounding boxes around the contours, and removed boxes below a designated width to reduce noise. The centroid of each box was found, and its horizontal distance from the center of the car's field of vision was calculated. If the average of the boxes' distances exceeded a preset error threshold, the car assumed it was turning and reduced its throttle; when within the threshold, it increased its throttle. The steering value was taken as a linear function of the average distance from center.

Isolating both white and yellow lines produced too much noise and steered the car off route, so we decided to isolate only the yellow center line of the track, which produced a more reliable model. Of course, if the car steered off the track and could no longer see the yellow line, it had little hope of returning to the track, but this was not a frequent problem.

The final issue encountered was the recognition of cones. The car often could not distinguish between the yellow center line and the orange cones, leading it to steer toward the cones in a head-on collision. We added a layer of pre-processing in OpenCV to remove the orange cones and replace them with black pixels. This led the car to avoid all but one cone, whose pixels were not entirely erased in pre-processing due to its color shading. Nonetheless, a little tweaking of the yellow center line segmentation (balancing the stringency of the criteria for a pixel to count as yellow against the need to detect an adequate number of yellow lines) resulted in good success.

{{#evu:https://www.youtube.com/watch?v=1kJeqHcrquc?rel=0 |alignment=left |center| }}

Final Project: Obstacle Avoidance Using LIDAR

For our final project, we experimented with a 360-degree RPLidar.

{{#evu:https://www.youtube.com/watch?v=pJxKa6_FetI?rel=0 |alignment=left |center| }}

Python Code

    #!/usr/bin/env python
    # This program implements a path planner and PID controller loosely based on
    # artificial potential field methods
    import rospy
    from std_msgs.msg import Float32, Int32, Int32MultiArray
    from sensor_msgs.msg import LaserScan
    import numpy as np
    from numpy import pi
    from adafruit_servokit import ServoKit

    OBSTACLE_DETECTION_NODE_NAME = 'obstacle_detection_node'
    STEERING_TOPIC_NAME = '/steering'
    THROTTLE_TOPIC_NAME = '/throttle'
    CENTROID_TOPIC_NAME = '/centroid'
    SCAN_TOPIC_NAME = '/scan'
    kit = ServoKit(channels=16)

    class PathPlanner:
        def __init__(self):
            # Initialize node and create publishers/subscribers
            self.init_node = rospy.init_node(OBSTACLE_DETECTION_NODE_NAME, anonymous=False)
            self.steering_publisher = rospy.Publisher(STEERING_TOPIC_NAME, Float32, queue_size=1)
            self.throttle_publisher = rospy.Publisher(THROTTLE_TOPIC_NAME, Float32, queue_size=1)
            self.steering_float = Float32()
            self.throttle_float = Float32()
            self.scan_subscriber = rospy.Subscriber(SCAN_TOPIC_NAME, LaserScan, self.controller)

            # Calibrate these parameters
            self.error_threshold = 5           # degrees
            self.maxSteerL = 40                # degrees
            self.maxSteerR = 145               # degrees
            self.SteerC = 90                   # degrees
            self.zero_throttle = 0.0
            self.fast_throttle = 0.15
            self.slow_throttle = 0.145
            self.max_obstacle_influence = 1.5  # meters

            # Display parameters
            print(
                f'\nno_error_throttle: {self.fast_throttle}'
                f'\nerror_throttle: {self.slow_throttle}')

        def controller(self, scandata):
            # Facing forward is 0 radians for the RPLidar. Remap the range so
            # that 0 is left and 180 is right.
            left_angle = -pi/3
            right_angle = pi/3
            first_half_ranges = scandata.ranges[int(left_angle/scandata.angle_increment):]
            second_half_ranges = scandata.ranges[:int(right_angle/scandata.angle_increment)]
            forward_ranges = np.concatenate((first_half_ranges, second_half_ranges))
            range_min = np.amin(forward_ranges)
            range_min_index = np.argmin(forward_ranges)
            range_min_angle = (range_min_index * scandata.angle_increment * 180/pi) + 30
            wheel_angle = kit.servo[1].angle
            rospy.logerr(wheel_angle)  # Display wheel_angle each iteration

            # Determine if the obstacle is worth avoiding, then determine whether
            # it is on the left or right of the robot's field of vision. Adjust
            # steering to run parallel to any obstacles.
            if range_min <= self.max_obstacle_influence:
                throttle_float = self.slow_throttle
                kp = 1.5*np.exp(-range_min)  # steering sensitivity
                if range_min_angle < 90:
                    steer_angle = 90 + range_min_angle  # angle perpendicular to obstacle surface
                    obstacle_wheel_diff = wheel_angle - steer_angle
                    if obstacle_wheel_diff < 0:
                        wheel_angle = wheel_angle + kp*abs(obstacle_wheel_diff)
                    else:
                        wheel_angle = wheel_angle - kp*abs(obstacle_wheel_diff)
                else:
                    steer_angle = -90 + range_min_angle  # angle perpendicular to obstacle surface
                    obstacle_wheel_diff = wheel_angle - steer_angle
                    if obstacle_wheel_diff > 0:
                        wheel_angle = wheel_angle - kp*abs(obstacle_wheel_diff)
                    else:
                        wheel_angle = wheel_angle + kp*abs(obstacle_wheel_diff)
            else:
                throttle_float = self.fast_throttle

            # Clamp the wheel angle, then send out the normalized steering and
            # throttle instructions
            if wheel_angle > self.maxSteerR:
                wheel_angle = self.maxSteerR
            elif wheel_angle < self.maxSteerL:
                wheel_angle = self.maxSteerL
            steering_float = -(wheel_angle/90 - 1)
            self.steering_float.data = steering_float
            self.throttle_float.data = throttle_float
            self.steering_publisher.publish(self.steering_float)
            self.throttle_publisher.publish(self.throttle_float)

        def clean_shutdown(self):
            print("Shutting Down...")
            self.throttle_float.data = self.zero_throttle
            self.throttle_publisher.publish(self.throttle_float)
            print("Shut down complete")

    def main():
        path_planner = PathPlanner()
        rate = rospy.Rate(15)
        rospy.on_shutdown(path_planner.clean_shutdown)
        while not rospy.is_shutdown():
            rate.sleep()

    if __name__ == '__main__':
        main()