2021SpringTeam6

Project Introduction

One thing we noticed with both the Donkey car and the ROS car, and that would be a major problem for an actual passenger vehicle, is the lack of object avoidance or the ability to stop during emergencies. While more expensive sensors for object avoidance exist, like LIDAR, we thought a less expensive solution could work. To implement object avoidance on our car, we chose the inexpensive HC-SR04 Ultrasonic Sensor.

Project Team Members

  • Dylan Perlson (ECE)
  • Harrison Lew (MAE)
  • Joshua Guy (MAE)
  • Arman Mansourian (CSE)

Hardware

Components

  • NVIDIA Jetson Nano
  • ELP H.264 Low Illumination Sony IMX 322 1080P USB Camera
  • Adafruit PWM Controller
  • Electronic Speed Controller
  • 12V-5V DC-DC voltage converter
  • Throttle DC motor
  • Steering Servo Motor
  • 11.1V 3S LiPo Battery
  • Voltage Sensor
  • Remote relay
  • HC-SR04 Ultrasonic Sensor

Physical Design

  • Baseplate: We laser cut the baseplate from acrylic. We placed multiple holes on the plate so that we could change the positioning of our original camera mount as we experimented with the car design.
  • Camera Mount v1: The first iteration of the camera mount was designed for flexibility; it allowed for the height, angle, and horizontal placement of the camera to be changed. This design was for testing camera placement only; the second iteration focused more on durability.

Old mount.png

  • Camera Mount v2: For our second iteration we opted for a new design that protected the camera better, using the angle and height we found worked well with the previous mount.

New mount.png

Case.png

Wiring Schematic

We used the wiring schematic from Fall 2020 Team 2 to get a head start on our wiring and leave time to implement our additional components. Our final circuit was similar to theirs, except that it added the ultrasonic sensor, with a voltage divider conditioning its echo signal down to a level readable by the Jetson Nano.

Arman.png
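As a quick sanity check on the divider (a minimal sketch; the 1 kOhm / 2 kOhm values are illustrative assumptions, not necessarily the exact resistors we used), the HC-SR04's 5 V echo pulse needs to land near the Jetson Nano's 3.3 V logic level:

# Output of a resistive divider: v_in across r1 + r2 in series, tapped across r2.
# The 1 kOhm / 2 kOhm values are illustrative assumptions, not our exact parts.
def divider_out(v_in, r1, r2):
    return v_in * r2 / (r1 + r2)

print(divider_out(5.0, 1000.0, 2000.0))  # ~3.33 V, safe for 3.3 V GPIO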

Software

OpenCV

For OpenCV, we had to modify several parameters for our implementation. We calibrated the PWM values for the throttle motor and the steering servo, and found appropriate control values for the turning performance we wanted. We also calibrated the color filters and the max/min width of line detection. A recurring issue was the changing light conditions in the tent where testing was done, which meant the color calibration had to be redone constantly.
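As a rough sketch of the kind of color filter we kept recalibrating (the HSV bounds below are placeholders, not our final calibration), the lane color is isolated with an HSV threshold:

import cv2
import numpy as np

# Placeholder HSV bounds for the lane color; these are the numbers that
# had to be re-tuned whenever the lighting in the tent changed.
lower = np.array([20, 100, 100])
upper = np.array([30, 255, 255])

frame = cv2.imread('frame.jpg')                # a single camera frame
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)   # threshold in HSV, not BGR
mask = cv2.inRange(hsv, lower, upper)          # white wherever the lane color appears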

ROS (Robot Operating System)

For ROS, we based our implementation on the code that Dominic provided, which had nodes for throttle, steering, lane detection, lane guidance, and the camera server. Combined, these nodes allowed the car to drive within the track and follow the lanes after proper color calibration. Our project extended this autonomous behavior with an additional node that used the ultrasonic sensor to relay the distance to objects in front of the vehicle: the ultrasonic sensor node published distance readings, and the lane guidance node subscribed to them.
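As a minimal sketch of the link we added (the full nodes appear under Source Code below), the lane guidance side only needs a subscriber to the distance topic the sonar node publishes:

#!/usr/bin/env python
import rospy
from std_msgs.msg import Float32

# Minimal listener for the /sonar_dist topic published by the sonar node.
def on_distance(msg):
    rospy.loginfo('obstacle at %.1f cm', msg.data)

rospy.init_node('sonar_listener_sketch')
rospy.Subscriber('/sonar_dist', Float32, on_distance)
rospy.spin()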

Algorithm

The goal of our algorithm was to adjust the throttle so that the car would slow down as an object approached and stop immediately for anything that abruptly appeared in front of it. To do this, we used the ultrasonic sensor to continuously read the distance to any object in front of the car. Any object 150 cm away or farther did not affect the car. For any object detected between 150 cm and 75 cm, the throttle was scaled linearly down toward zero using a multiplier: alpha = (distance_detected - 75 cm) / (150 cm - 75 cm). This way the car would come to a gradual stop if an object was detected far enough away. If an object fell in front of the car at 75 cm or less, however, the car would stop immediately.
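Written out as a small standalone function, the scaling looks like the sketch below (the thresholds and the 0.040 cruise throttle mirror the node under Source Code):

def scaled_throttle(distance_cm, cruise=0.040, stop_at=75.0, slow_at=150.0):
    if distance_cm <= stop_at:      # emergency-stop zone
        return 0.0
    if distance_cm < slow_at:       # linear ramp: alpha runs from 0 up to 1
        alpha = (distance_cm - stop_at) / (slow_at - stop_at)
        return cruise * alpha
    return cruise                   # far enough away to ignore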

One problem we ran into was that the ultrasonic sensor was very noisy and its values would jump around a lot. To solve this, we applied a moving average to the distance input to smooth out the signal. After adding this, the car no longer came to an accidental abrupt stop when random noise pushed a reading lower than it should have been.
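The smoothing is a plain fixed-width moving average; here is a minimal sketch with a window of 5 samples, matching the list used in our lane guidance node:

from collections import deque

window = deque(maxlen=5)   # the oldest reading falls off automatically

def smoothed(reading):
    window.append(reading)
    return sum(window) / len(window)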

Demos

File:Vid1 Trim.mp4

Obstacles

  • Early on, we ran into issues with our wiring. Many of the jumper cables we used were faulty and made inconsistent connections in our circuitry. To solve the issue, we replaced all the connections in our circuit and tested their continuity with a multimeter.
  • One of our Jetson Nanos broke in a crash and was rendered unusable. From then on, we proceeded with the backup, which was a slightly different model; occasionally we ran into strange compatibility issues that stemmed from this model difference.
  • Our first camera broke in a crash, which made us realize that we needed to redesign our camera mount to better protect the camera.
  • When installing the ultrasonic sensor on our car, we ran into problems running the sensor directly into the pins of the Jetson Nano. The sensor would give readings for a brief period before losing distance measurements altogether.
  • When driving with ROS, our group ran into inconsistencies in the car's recognition of the lines on the track, most likely due to changes in lighting. We would configure the car and it would track the lines well for a short while, but after an hour or so the outside lighting would cause the car to no longer follow them.

Further Improvements

  • We eventually figured out a workaround for the loss of distance measurements mentioned above, but if we were to improve on the design we would use a microcontroller such as an Arduino to collect the measurements instead of connecting the ultrasonic sensor to the GPIO pins on the Jetson Nano (see the sketch after this list).
  • Better filtering in ROS might let us avoid the inconsistencies in the car's navigation caused by changes in lighting. A more robust line-tracking system than the one we achieved would greatly improve the consistency of the vehicle.
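On the Jetson side, reading from such a microcontroller could be as simple as the sketch below (hypothetical: it assumes the Arduino prints one distance in centimeters per line at 9600 baud on /dev/ttyUSB0):

import serial  # pyserial

# Hypothetical serial protocol: the Arduino prints one distance (cm) per line.
ser = serial.Serial('/dev/ttyUSB0', 9600, timeout=1)
while True:
    line = ser.readline().strip()
    if line:
        distance_cm = float(line)   # this value would feed /sonar_dist instead of GPIO timing
        print(distance_cm)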

Source Code

Here we have a modified version of the lane_guidance node provided by Dominic. We added a subscriber to our new sonar node: each incoming sonar value is appended to a running list, which is used to average the readings for better accuracy. The averaged distance then sets the throttle, so the speed scales down as the nearest object gets closer.

lane_guidance.py

#!/usr/bin/env python
import rospy
import numpy as np
from std_msgs.msg import Float32, Int32MultiArray

LANE_GUIDANCE_NODE_NAME = 'lane_guidance_node'
STEERING_TOPIC_NAME = '/steering'
THROTTLE_TOPIC_NAME = '/throttle'
CENTROID_TOPIC_NAME = '/centroid'
SONAR_TOPIC_NAME = '/sonar_dist'

dist_data = None            # latest raw sonar reading (cm), set by callback()
data_smooth = [300.0] * 5   # moving-average window, seeded far so startup does not trigger a stop


def callback(data):
    # Keep only the newest raw distance from the sonar node.
    global dist_data
    dist_data = data.data


def LineFollower(msg):
    kp = 0.25            # proportional steering gain
    driving = 0.040      # cruise throttle
    centroid = msg.data[0]
    width = msg.data[1]  # width of camera frame

    # Smooth the noisy sonar signal with a 5-sample moving average.
    if dist_data is not None:
        data_smooth.append(dist_data)
        data_smooth.pop(0)
    dist_avg = np.mean(data_smooth)

    if dist_avg < 75:
        # Object too close: emergency stop.
        error_x = 0
        throttle_float = 0.0
    elif dist_avg < 150:
        # Object approaching: scale the throttle linearly down toward zero.
        error_x = 0
        throttle_float = float(driving * (dist_avg - 75) / (150 - 75))
    elif centroid == 0:
        # No centroid detected: drive straight at cruise throttle.
        error_x = 0
        throttle_float = driving
    else:
        # Normal lane following: steer toward the lane centroid.
        error_x = float(centroid - (width / 2))
        throttle_float = driving

    # Proportional steering with a small trim offset, clamped to [-1, 1].
    steering_float = float(kp * (error_x / (width / 2)) + 0.070)
    steering_float = max(-1.0, min(1.0, steering_float))

    steering_pub.publish(steering_float)
    throttle_pub.publish(throttle_float)


if __name__ == '__main__':
    rospy.init_node(LANE_GUIDANCE_NODE_NAME, anonymous=False)
    # Create the publishers first so the callbacks can always use them.
    steering_pub = rospy.Publisher(STEERING_TOPIC_NAME, Float32, queue_size=1)
    throttle_pub = rospy.Publisher(THROTTLE_TOPIC_NAME, Float32, queue_size=1)
    centroid_subscriber = rospy.Subscriber(CENTROID_TOPIC_NAME, Int32MultiArray, LineFollower)
    distance_subscriber = rospy.Subscriber(SONAR_TOPIC_NAME, Float32, callback)
    rospy.spin()  # the callbacks do all the work; no manual loop is needed

Here we have the code for our ROS sonar node. It publishes a topic with the raw sonar distance, which the lane_guidance node subscribes to as described above. This code was pulled from the following GitHub repo and modified slightly for our specific use case: https://github.com/engcang/HC-SR04-UltraSonicSensor-ROS-RaspberryPi

ROS_sonar_sensor.py

#!/usr/bin/env python

import Jetson.GPIO as gpio
import time
import sys
import signal
import rospy
from std_msgs.msg import Float32


def signal_handler(signal, frame):  # Ctrl+C -> exit program
    print('You pressed Ctrl+C!')
    sys.exit(0)


signal.signal(signal.SIGINT, signal_handler)


class sonar():
    def __init__(self):
        rospy.init_node('sonar', anonymous=True)
        self.distance_publisher = rospy.Publisher('/sonar_dist', Float32, queue_size=1)
        self.r = rospy.Rate(15)

    def dist_sender(self, dist):
        # Publish one distance reading (cm) on /sonar_dist.
        data = Float32()
        data.data = dist
        self.distance_publisher.publish(data)


gpio.setmode(gpio.BOARD)  # physical pin numbering
trig = 12  # trigger pin
echo = 21  # echo pin (through the voltage divider)

gpio.setup(trig, gpio.OUT)
gpio.setup(echo, gpio.IN)

sensor = sonar()
time.sleep(0.5)
print('----------------------------------------------------------------- sonar start')
try:
    while True:
        # Fire a 10 us trigger pulse.
        gpio.output(trig, False)
        time.sleep(0.1)
        gpio.output(trig, True)
        time.sleep(0.00001)
        gpio.output(trig, False)

        # Time the echo pulse; initialize both stamps so a missed echo
        # cannot crash the loop with an unset variable.
        pulse_start = pulse_end = time.time()
        n = 0
        while gpio.input(echo) == 0:
            n += 1
            if n > 10:
                break
            pulse_start = time.time()
        while gpio.input(echo) == 1:
            pulse_end = time.time()

        # Convert round-trip time to one-way distance in cm
        # (speed of sound ~34000 cm/s, halved for the round trip).
        pulse_duration = pulse_end - pulse_start
        distance = pulse_duration * 17000
        if pulse_duration >= 0.01746:
            print('time out')
            continue
        elif distance > 300 or distance <= 0:
            print('out of range')
            continue
        distance = round(distance, 3)
        print('Distance : %f cm' % distance)
        sensor.dist_sender(distance)

        sensor.r.sleep()

except (KeyboardInterrupt, SystemExit):
    gpio.cleanup()
    sys.exit(0)
except:
    gpio.cleanup()  # release the pins on any other error