2021SpringTeam6
Revision as of 02:41, 11 June 2021
== Project Team Members ==
- Dylan Perlson (ECE)
- Harrison Lew (MAE)
- Joshua Guy (MAE)
- Arman Mansourian (CSE)
== Hardware ==
- NVIDIA Jetson Nano
- ELP H.264 Low Illumination Sony IMX 322 1080P USB Camera
- Adafruit PWM Controller
- Electronic Speed Controller
- 12V-5V DC-DC voltage converter
- Throttle DC motor
- Steering Servo Motor
- 11.1V 3S LiPo Battery
- Voltage Sensor
- Remote relay
- HC-SR04 Ultrasonic Sensor
== Mechanical Design ==
- Baseplate: We laser cut the baseplate from acrylic. We placed multiple holes in the plate so that we could change the position of our original camera mount as we experimented with the car design.
- Camera Mount v1: The first iteration of the camera mount was designed for flexibility; it allowed for the height, angle, and horizontal placement of the camera to be changed. This design was for testing camera placement only; the second iteration focused more on durability.
- Camera Mount v2: For our second iteration we opted for a new design that better protected the camera, using the angle and height that we found with the previous camera mount.
- Jetson Nano case: To save time and not reinvent the wheel, we decided to use the nano case that was most popular on thingiverse. https://www.thingiverse.com/thing:3603594
== Wiring ==
We used the wiring schematic from Fall 2020 Team 2 to get a head start on our wiring and allow us time to implement our additional components. Our final circuit was similar to theirs, except that it included the ultrasonic sensor, with a voltage divider conditioning its echo signal down to a level readable by the Jetson Nano.
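The divider ratio is easy to sanity check. A minimal sketch, assuming example resistor values of 1 kΩ and 2 kΩ (the writeup does not list the actual resistors used):

```python
def divider_out(v_in, r1, r2):
    """Output of a resistive divider, with v_out taken across r2."""
    return v_in * r2 / (r1 + r2)

# Assumed values: a 5 V HC-SR04 echo pulse through R1 = 1 kOhm and
# R2 = 2 kOhm lands at ~3.33 V, safe for the Jetson Nano's 3.3 V GPIO.
print(round(divider_out(5.0, 1000, 2000), 2))  # 3.33
```

Any resistor pair with the same 1:2 ratio gives the same output voltage; larger values simply draw less current from the sensor.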
== OpenCV ==
For OpenCV, we had to modify several parameters for our implementation. We calibrated the PWM values for the throttle motor and the steering servo, and found control values that achieved the turning performance we wanted. We also calibrated the color filters and the min/max width of line detection. A recurring issue was the changing light conditions in the tent where testing was done, which meant the color calibration had to be redone constantly.
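The color filtering amounts to keeping only pixels whose HSV values fall inside a calibrated band. A minimal NumPy sketch of that thresholding logic (what OpenCV's `cv2.inRange` performs), with placeholder bounds that would need recalibration under any real lighting:

```python
import numpy as np

def in_range(hsv, lower, upper):
    """Boolean mask of pixels whose H, S, V all fall inside [lower, upper]."""
    lower = np.asarray(lower)
    upper = np.asarray(upper)
    return np.all((hsv >= lower) & (hsv <= upper), axis=-1)

# Toy 1x3 "image": one yellow-ish pixel, then two out-of-band pixels.
hsv = np.array([[[25, 200, 200], [90, 50, 50], [25, 10, 240]]])
mask = in_range(hsv, lower=[20, 100, 100], upper=[35, 255, 255])
print(mask.astype(int))  # [[1 0 0]]
```

Shifting light conditions move the V (and to a lesser extent S) values of the track lines, which is why the bounds had to be retuned so often.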
== ROS (Robot Operating System) ==
For ROS, we based our implementation on the code that Dominic provided, which had nodes for throttle, steering, lane detection, lane guidance, and the camera server. Combined, these nodes allowed the car to drive within the track and follow the lanes after proper color calibration. Our project expanded on this autonomous behavior by adding a node that used the ultrasonic sensor and relayed distance information for objects in front of the vehicle. The ultrasonic sensor node published distance readings, and the lane guidance node subscribed to it.
The goal of our algorithm was to adjust the throttle so that the car would slow down as it approached an object and make an emergency stop for anything that abruptly appeared in front of it. To do this, we used the ultrasonic sensor to continuously read the distance to any object in front of the car. Any object 150 cm away or further did not affect the car. For any object detected between 150 cm and 75 cm, the throttle was linearly scaled down toward zero using a multiplier: alpha = (distance_detected - 75 cm) / (150 cm - 75 cm). This way the car would come to a gradual stop if an object was detected far enough away. However, if an object fell in front of the car at 75 cm or less, the car would stop immediately.
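The scaling rule above can be sketched as a small standalone function, using the 0.040 driving throttle value from our source code:

```python
def throttle_from_distance(dist_cm, driving=0.040, stop=75.0, slow=150.0):
    """Linearly scale the driving throttle down between `slow` and `stop` cm."""
    if dist_cm <= stop:
        return 0.0              # emergency stop zone
    if dist_cm < slow:
        alpha = (dist_cm - stop) / (slow - stop)
        return driving * alpha  # gradual slowdown
    return driving              # object too far away to matter

print(throttle_from_distance(200.0))    # 0.04
print(throttle_from_distance(112.5))    # 0.02 (halfway -> alpha = 0.5)
print(throttle_from_distance(50.0))     # 0.0
```

At exactly 150 cm the throttle is the full driving value, and at 75 cm it reaches zero, so the car decelerates smoothly rather than braking the moment an object is seen.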
One problem we ran into was that the ultrasonic sensor was very noisy and its values would jump around a lot. To solve this, we applied a moving average to the distance input to smooth out the signal. After adding this, the car no longer came to an accidental abrupt stop when random noise pushed the reading lower than it should have been.
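The smoothing mirrors the 5-sample buffer in our node; a minimal sketch (note that, as in our code, the buffer starts at zero, so the average is biased low for the first few readings):

```python
class MovingAverage:
    """Fixed-size moving average, like the 5-sample buffer in our node."""
    def __init__(self, size=5):
        self.buf = [0.0] * size

    def update(self, value):
        # drop the oldest sample, append the newest, return the mean
        self.buf.append(value)
        self.buf.pop(0)
        return sum(self.buf) / len(self.buf)

avg = MovingAverage(size=5)
readings = [100, 100, 20, 100, 100]   # the 20 is a noise spike
smoothed = [avg.update(r) for r in readings]
print(smoothed[-1])  # 84.0 -- the spike is damped instead of triggering a stop
```

A single spurious 20 cm reading now only pulls the average down to 84 cm, well above the 75 cm emergency-stop threshold, instead of halting the car outright.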
== Problems Encountered ==
- Early on, we ran into issues with our wiring. Many of the jumper cables we used were faulty and made inconsistent connections in our circuitry. To solve the issue, we had to replace all the connections in our circuit and test their continuity with a multimeter.
- One of our Jetson Nanos broke in a crash and was rendered unusable. From then on, we used the backup, which was a slightly different model. Occasionally, we ran into strange compatibility issues stemming from this model difference.
- Our first camera broke in a crash, which made us realize that we needed to redesign our camera mount to better protect the camera.
- When installing the ultrasonic sensor on our car, we ran into problems wiring the sensor directly to the pins of the Jetson Nano. The car would give readings for a brief period before losing distance measurements altogether.
- When driving with ROS, our group ran into inconsistencies in the car recognizing the lines on the track, most likely due to changes in lighting. We would configure the car and it would track the lines well for a short while, but after an hour or so, changes in outside lighting would cause the car to stop following the lines.
== Potential Improvements ==
- We eventually figured out a workaround for the loss of distance measurements mentioned above, but if we were to improve the design we would use a microcontroller such as an Arduino to collect the measurements instead of connecting the ultrasonic sensor to the GPIO pins on the Jetson Nano.
- Better filtering in ROS might let us avoid inconsistencies in the car's navigation due to changes in lighting. Solving this problem with a more robust line-tracking system than the one we achieved would greatly improve the consistency of the vehicle.
== Source Code ==
<syntaxhighlight lang="python">
#!/usr/bin/env python
import rospy
import numpy as np
from std_msgs.msg import Float32, Int32, Int32MultiArray

LANE_GUIDANCE_NODE_NAME = 'lane_guidance_node'
STEERING_TOPIC_NAME = '/steering'
THROTTLE_TOPIC_NAME = '/throttle'
CENTROID_TOPIC_NAME = '/centroid'
SONAR_TOPIC_NAME = '/sonar_dist'

global steering_float, throttle_float
steering_float = Float32()
throttle_float = Float32()

dist_data = None
data_smooth = [0, 0, 0, 0, 0]


def callback(data):
    # store the latest sonar distance reading
    global dist_data
    dist_data = data.data


def LineFollower(msg):
    kp = 0.25
    global steering_float, throttle_float, dist_data
    steering_float = Float32()
    throttle_float = Float32()
    centroid = msg.data[0]
    width = msg.data[1]  # width of camera frame
    driving = 0.040

    # moving average over the last 5 sonar readings to smooth out noise
    if dist_data is not None:
        data_smooth.append(dist_data)
        data_smooth.pop(0)
        dist_data = np.mean(data_smooth)

    if dist_data and dist_data < 75:
        # emergency stop: object directly in front of the car
        error_x = 0
        throttle_float = 0
    elif dist_data and dist_data < 150:
        # linearly scale the throttle down between 150 cm and 75 cm
        error_x = 0
        throttle_float = driving * (dist_data - 75) / (150 - 75)
    elif centroid == 0:
        # no line detected: hold course at the driving throttle
        error_x = 0
        throttle_float = driving
    else:
        error_x = float(centroid - (width / 2))
        throttle_float = 0.040

    steering_float = float(kp * (error_x / (width / 2)) + 0.070)
    if steering_float < -1:
        steering_float = -1
    elif steering_float > 1:
        steering_float = 1
    steering_pub.publish(steering_float)
    throttle_pub.publish(throttle_float)


if __name__ == '__main__':
    rospy.init_node(LANE_GUIDANCE_NODE_NAME, anonymous=False)
    centroid_subscriber = rospy.Subscriber(CENTROID_TOPIC_NAME, Int32MultiArray, LineFollower)
    distance_subscriber = rospy.Subscriber(SONAR_TOPIC_NAME, Float32, callback)
    steering_pub = rospy.Publisher(STEERING_TOPIC_NAME, Float32, queue_size=1)
    throttle_pub = rospy.Publisher(THROTTLE_TOPIC_NAME, Float32, queue_size=1)
    rate = rospy.Rate(15)
    while not rospy.is_shutdown():
        rospy.spin()
        rate.sleep()
</syntaxhighlight>