2021SpringTeam6

From MAE/ECE 148 - Introduction to Autonomous Vehicles

Revision as of 19:43, 10 June 2021

Project Introduction

One limitation we noticed with both the Donkey car and the ROS car, and one that would be a major problem for an actual passenger vehicle, is the lack of obstacle avoidance or the ability to stop during emergencies. While more expensive sensors such as LIDAR exist for implementing obstacle avoidance, we thought a less expensive solution could be built. For our car, we chose the inexpensive HC-SR04 ultrasonic sensor.

Project Team Members

  • Dylan Perlson (ECE)
  • Harrison Lew (MAE)
  • Joshua Guy (MAE)
  • Arman Mansourian (CSE)

Hardware Components

  • NVIDIA Jetson Nano
  • ELP H.264 Low Illumination Sony IMX 322 1080P USB Camera
  • Adafruit PWM Controller
  • Electronic Speed Controller
  • 12V-5V DC-DC voltage converter
  • Throttle DC motor
  • Steering Servo Motor
  • 11.1V 3S LiPo Battery
  • Voltage Sensor
  • Remote relay
  • HC-SR04 Ultrasonic Sensor

Physical Design

  • Baseplate: We laser cut the baseplate from acrylic. We placed multiple holes in the plate so that we could change the positioning of the original camera mount as we experimented with the car design.
  • Camera Mount v1: The first iteration of the camera mount was designed for flexibility; it allowed for the height, angle, and horizontal placement of the camera to be changed. This design was for testing camera placement only; the second iteration focused more on durability.

Old mount.png

  • Camera Mount v2: For our second iteration, we opted for a new design that better protected the camera, using the angle and height we found with the previous mount.

New mount.png


Wiring Schematic

We used the wiring schematic from Fall 2020 Team 2 to get a head start on our wiring, which allowed us time to implement our additional components. Our final circuit was similar to theirs, except that it included the ultrasonic sensor, with a voltage divider conditioning its signal to a level readable by the Jetson Nano.

Arman.png
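The voltage divider is needed because the HC-SR04's echo pin outputs 5 V logic while the Jetson Nano's GPIO pins expect 3.3 V. As a minimal sketch of the sizing calculation (the 1 kΩ / 2 kΩ pair below is an assumed example, not necessarily the exact resistors we used):

```python
def divider_vout(vin, r1, r2):
    """Output voltage of a two-resistor divider, with r2 on the low side."""
    return vin * r2 / (r1 + r2)

# Assumed example values: a 1 kOhm / 2 kOhm pair drops the sensor's
# 5 V echo line to about 3.33 V, safe for the Jetson Nano's GPIO.
vout = divider_vout(5.0, 1000.0, 2000.0)
```

Any resistor pair with roughly a 1:2 ratio works; larger absolute values draw less current from the sensor's output.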

OpenCV

For OpenCV, we had to modify several parameters for our implementation. We calibrated the PWM values for the throttle motor and the steering servo and found control values that achieved the turning performance we wanted. We also calibrated the color filters and the max/min width of line detection. One consistent issue was the changing light conditions in the tent where testing was done, which meant the color calibration had to be redone constantly.
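The color calibration step amounts to thresholding every pixel of an HSV image against lower and upper channel bounds. As a sketch of the idea using NumPy alone (mimicking OpenCV's cv2.inRange; the yellow-ish bounds below are placeholder values, not our actual calibration):

```python
import numpy as np

def in_range_mask(hsv_img, lower, upper):
    """Return a uint8 mask: 255 where every channel of a pixel lies
    within [lower, upper], 0 elsewhere (same behavior as cv2.inRange)."""
    lower = np.asarray(lower)
    upper = np.asarray(upper)
    inside = np.all((hsv_img >= lower) & (hsv_img <= upper), axis=-1)
    return (inside * 255).astype(np.uint8)

# Placeholder HSV bounds for a yellow lane line
LOWER = (20, 100, 100)
UPPER = (30, 255, 255)
```

When lighting changes, it is the value (brightness) bounds that typically need to move, which is why the calibration drifted over the course of a test session.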

ROS (Robot Operating System)

For ROS, we based our implementation on the code that Dominic provided. The provided code had nodes for throttle, steering, lane detection, lane guidance, and the camera server. Combined, these nodes allowed the car to drive within the track and follow the lanes after proper color calibration. Our project expanded on this autonomous behavior by adding an additional node that used the ultrasonic sensor and relayed distance information for objects in front of the vehicle. The ultrasonic sensor node published distance readings, and the lane guidance node subscribed to it.


The goal of our algorithm was to adjust the throttle so that the car would slow down as an object approached and perform an emergency stop for anything that abruptly appeared in front of it. To do this, we used the ultrasonic sensor to continuously read the distance to any object in front of the car. Any object 150 cm away or farther did not affect the car. For any object detected between 150 cm and 75 cm, the throttle was linearly scaled down toward zero using a multiplier: alpha = (distance_detected - 75 cm) / (150 cm - 75 cm). This way the car would come to a gradual stop if an object was detected far enough away. However, if an object appeared in front of the car at 75 cm or less, the car would immediately stop.
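The scaling rule above can be sketched as a small function. The 0.040 cruising throttle matches the value used in our lane guidance node, and the cutoffs are the 75 cm and 150 cm thresholds described above:

```python
def throttle_scale(distance_cm, cruise=0.040, stop_cm=75.0, slow_cm=150.0):
    """Map an obstacle distance to a throttle command:
    hard stop at or below stop_cm, linear ramp between stop_cm and
    slow_cm, full cruising throttle beyond slow_cm."""
    if distance_cm <= stop_cm:
        return 0.0  # emergency stop
    if distance_cm < slow_cm:
        # alpha = (distance - 75) / (150 - 75), applied to the cruise value
        return cruise * (distance_cm - stop_cm) / (slow_cm - stop_cm)
    return cruise  # no obstacle close enough to matter
```

At 112.5 cm (halfway through the ramp) this yields half of the cruising throttle, so the car decelerates smoothly rather than braking all at once.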

One problem we ran into was that the ultrasonic sensor was very noisy and its values would jump around a lot. To solve this, we applied a moving average to the distance input to smooth out the signal. After adding this, the car no longer came to an accidental abrupt stop when random noise caused a reading to be lower than it should have been.
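The smoothing step can be sketched as a fixed-size moving average. The five-sample window matches the buffer in our lane guidance node; the class name here is just for illustration:

```python
from collections import deque

class MovingAverage:
    """Smooth noisy ultrasonic readings with a fixed-size window.

    A single spuriously low reading is diluted by the other samples
    in the window, so it cannot trigger a false emergency stop."""

    def __init__(self, size=5):
        self.buf = deque(maxlen=size)  # old samples fall off automatically

    def update(self, reading):
        """Add a new reading and return the current windowed average."""
        self.buf.append(reading)
        return sum(self.buf) / len(self.buf)
```

The trade-off is latency: a larger window rejects more noise but delays the response to a genuinely close obstacle, so the window size should stay small relative to the sensor's update rate.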

Challenges

  • Early on, we ran into issues with our wiring. Many of the jumper cables we used were faulty and made inconsistent connections in our circuitry. To solve the issue, we went through and replaced all the connections in our circuit and tested their continuity with a multimeter.
  • One of our Jetson Nanos broke in a crash and was rendered unusable. From then on, we proceeded with a backup that was a slightly different model. Occasionally, we ran into strange compatibility issues stemming from this model difference.
  • Our first camera broke in a crash, which made us realize that we needed to redesign our camera mount to better protect the camera.
  • When installing the ultrasonic sensor on our car, we ran into problems running the sensor directly into the pins of the Jetson Nano. This would result in the car giving readings for a brief period of time before losing distance measurements altogether.
  • When driving with ROS, our group ran into inconsistencies with our car recognizing the lines on the track, most likely due to changes in lighting. We would configure the car, and it would track the lines well for a short while, but after an hour or so the outdoor lighting would change enough that the car no longer followed the lines.

Further Improvements

  • We eventually figured out a work-around to the loss of distance measurements mentioned above, but if we were to improve upon the design we would implement a microcontroller such as Arduino to collect the measurements instead of connecting the ultrasonic sensor to the GPIO pins on the Jetson Nano.
  • Better filtering in ROS might allow us to avoid the inconsistencies in our car's navigation due to changes in lighting. Solving this problem with a more robust line-tracking system would greatly improve the consistency of the vehicle.

Source Code

#!/usr/bin/env python
import rospy
import numpy as np
from std_msgs.msg import Float32, Int32, Int32MultiArray

LANE_GUIDANCE_NODE_NAME = 'lane_guidance_node'
SONAR_TOPIC_NAME = '/sonar_dist'

global steering_float, throttle_float
steering_float = Float32()
throttle_float = Float32()

data_smooth = [0.0, 0.0, 0.0, 0.0, 0.0]  # 5-sample moving-average buffer

def callback(data):
    # Push the newest sonar reading into the moving-average buffer
    global data_smooth
    data_smooth = data_smooth[1:] + [data.data]

def LineFollower(msg):
    kp = 0.25
    global steering_float, throttle_float
    centroid = msg.data[0]
    width = msg.data[1]  # width of camera frame

    driving = 0.040  # cruising throttle
    dist_data = np.mean(data_smooth)  # smoothed obstacle distance (cm)

    if dist_data < 75:
        # Obstacle closer than 75 cm: emergency stop
        error_x = 0.0
        throttle_float = 0.0
    elif dist_data < 150:
        # Obstacle between 75 cm and 150 cm: scale throttle down linearly
        error_x = 0.0
        throttle_float = driving * (dist_data - 75) / (150 - 75)
    else:
        # No obstacle in range: follow the lane at cruising throttle
        error_x = float(centroid - (width / 2))
        throttle_float = driving

    steering_float = float(kp * (error_x / (width / 2)) + 0.070)
    # Clamp steering to the valid [-1, 1] range
    if steering_float < -1:
        steering_float = -1
    elif steering_float > 1:
        steering_float = 1

if __name__ == '__main__':
    rospy.init_node(LANE_GUIDANCE_NODE_NAME, anonymous=False)
    centroid_subscriber = rospy.Subscriber(CENTROID_TOPIC_NAME, Int32MultiArray, LineFollower)
    distance_subscriber = rospy.Subscriber(SONAR_TOPIC_NAME, Float32, callback)
    steering_pub = rospy.Publisher(STEERING_TOPIC_NAME, Float32, queue_size=1)
    throttle_pub = rospy.Publisher(THROTTLE_TOPIC_NAME, Float32, queue_size=1)
    rate = rospy.Rate(15)
    while not rospy.is_shutdown():
        # Publish the latest steering and throttle commands at 15 Hz
        steering_pub.publish(steering_float)
        throttle_pub.publish(throttle_float)
        rate.sleep()