2021WinterTeam1

From MAE/ECE 148 - Introduction to Autonomous Vehicles

Team Members

Brandon Bao (MAE)
Evon Yip (ECE)
Sarah Imai (ECE)

Objective

Implement SLAM on the RC car using the RP Lidar and the gmapping package

Using lane guidance from the ucsd_robo_car_simple_ros package (provided by Dominic Nightingale), integrate the particle filter SLAM output to adjust the steering around whatever obstacles are present

Use RVIZ to debug ROS algorithms and functionality

Sensors/Hardware

RP Lidar: The laser scans measure distances to the surroundings, which SLAM (the gmapping package) eventually assembles into a map.

IMU (Inertial Measurement Unit): We are using the SparkFun OpenLog Artemis IMU, which outputs 6DOF acceleration and gyro data. https://www.adafruit.com/product/4754

Camera: Images from the ELP Mini USB camera feed into a lane tracking program, which tracks the center of a lane through centroid computations.
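As an illustration, a minimal sketch of that centroid step (an assumption of how it looks, not the exact ucsd_robo_car code) using OpenCV:

    # Hypothetical sketch of the centroid computation used for lane tracking:
    # threshold the camera frame for the white line, then take the centroid
    # of the resulting mask with image moments.
    import cv2

    def line_centroid(frame):
        """Return the (x, y) centroid of the white line in a BGR frame, or None."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Threshold value is an assumption; tune it for the lighting conditions.
        _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
        m = cv2.moments(mask)
        if m["m00"] == 0:
            return None  # no line pixels found in this frame
        return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))

The x-coordinate of this centroid, compared against the image center, gives the steering error that lane guidance acts on.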

Jetson Nano: The Jetson Nano is a single-board computer (SBC) that acts as the car's central computer.

PWM/Servo Driver (PCA9685): I2C-controlled with a built-in clock; drives the throttle and steering of the car using two channels.
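For illustration, a minimal sketch of commanding those two channels with the legacy Adafruit_PCA9685 Python library; the channel numbers and pulse counts here are assumptions and must be calibrated for the actual ESC and servo:

    # Sketch of driving steering and throttle through the PCA9685 over I2C.
    import Adafruit_PCA9685

    pwm = Adafruit_PCA9685.PCA9685()  # default I2C address 0x40
    pwm.set_pwm_freq(60)              # standard RC servo/ESC frequency

    STEERING_CH, THROTTLE_CH = 1, 2   # hypothetical channel assignments

    pwm.set_pwm(STEERING_CH, 0, 380)  # roughly centered steering (assumed count)
    pwm.set_pwm(THROTTLE_CH, 0, 370)  # near-neutral throttle (assumed count)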

Chassis:

Team1.1.png

Brushless motor: provides throttle
Servo: receives input from the PWM driver for steering control
Motor driver (ESC): connected to the PWM driver; stops the motor (emergency stop)
LiPo battery and battery charger

Schematic

Team1.2.png


Final Setup

Team1.3.png

Team1.4.png

3D models and laser cutting:

We laser cut a simple base from acrylic, and 3D printed a camera mount, a Jetson Nano mount, an RP Lidar mount, and an IMU mount. We found that a relatively tall but forward-looking camera was sufficient for line following and the earlier Donkey Car deep learning programs.

Team1.7.png

Software

Packages and Drivers:

Slam_Gmapping: http://wiki.ros.org/gmapping
Dependencies: https://github.com/ros-perception/openslam_gmapping

RPLidar: http://wiki.ros.org/rplidar

IMU driver: https://learn.sparkfun.com/tutorials/how-to-install-ch340-drivers?_ga=2.150182644.393793206.1614466573-1572058972.1607974083 (see the section for other Linux distributions). Courtesy of Layne Clemen. More info: https://learn.sparkfun.com/tutorials/openlog-artemis-hookup-guide/configuration

Odom publisher: https://gist.github.com/atotto/f2754f75bedb6ea56e3e0264ec405dcf

ROS lane guidance: https://gitlab.com/djnighti/ucsd_robo_car_simple_ros

Algorithms:

We used the gmapping ROS package combined with the RP Lidar and IMU for mapping and motion detection. Particle filter SLAM (simultaneous localization and mapping) keeps track of a motion model of the car and its surroundings to simultaneously build a map and locate the car within that map. Each 'particle' in a particle filter represents one hypothesis of the car's location: many particles are generated to account for noise in the sensor readings, and particles whose trajectories best agree with the map built so far are weighted as more reliable. Resampling happens when particle weights fall below a given threshold, so more particles are regenerated near the locations with higher weights. Luckily, there was no need to implement this algorithm ourselves, thanks to the gmapping package created by Brian Gerkey. gmapping does require a node that publishes the transform between the lidar and car reference frames, as well as a populated /tf, which we created using data from the IMU.

Lane Tracking: we used Dominic's repository (https://gitlab.com/djnighti/ucsd_robo_car_simple_ros/-/tree/master). The lane tracking identifies where the dotted white line is on the ground and steers to follow it. It computes the centroid of the line, which acts as the center line and is fed to the steering and throttle.
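As a minimal sketch of that lidar-to-car transform gmapping needs (the frame names and mounting offsets here are assumptions for this car), a static broadcaster in rospy looks like:

    #!/usr/bin/env python
    # Sketch of the fixed base_link -> laser transform required by gmapping.
    import rospy
    import tf2_ros
    from geometry_msgs.msg import TransformStamped

    rospy.init_node('laser_tf_broadcaster')
    br = tf2_ros.StaticTransformBroadcaster()

    t = TransformStamped()
    t.header.stamp = rospy.Time.now()
    t.header.frame_id = 'base_link'
    t.child_frame_id = 'laser'
    t.transform.translation.x = 0.10   # lidar mounted 10 cm ahead of base_link (assumed)
    t.transform.translation.z = 0.15   # and 15 cm up (assumed)
    t.transform.rotation.w = 1.0       # no rotation relative to the car

    br.sendTransform(t)
    rospy.spin()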

IMU: publish odometry and convert it to tf, following https://gist.github.com/atotto/f2754f75bedb6ea56e3e0264ec405dcf
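A condensed sketch of that gist's pattern (the constant velocities below are placeholders; in our setup they came from the IMU):

    #!/usr/bin/env python
    # Integrate body velocities into a planar pose, then publish both
    # nav_msgs/Odometry and the matching odom -> base_link transform.
    import math
    import rospy
    import tf
    from nav_msgs.msg import Odometry
    from geometry_msgs.msg import Quaternion

    rospy.init_node('odometry_publisher')
    odom_pub = rospy.Publisher('odom', Odometry, queue_size=10)
    br = tf.TransformBroadcaster()

    x = y = th = 0.0
    vx, vth = 0.1, 0.05            # placeholder linear/angular velocities
    rate = rospy.Rate(10)
    last = rospy.Time.now()

    while not rospy.is_shutdown():
        now = rospy.Time.now()
        dt = (now - last).to_sec()
        x += vx * math.cos(th) * dt
        y += vx * math.sin(th) * dt
        th += vth * dt

        q = tf.transformations.quaternion_from_euler(0, 0, th)
        br.sendTransform((x, y, 0.0), q, now, 'base_link', 'odom')

        odom = Odometry()
        odom.header.stamp = now
        odom.header.frame_id = 'odom'
        odom.child_frame_id = 'base_link'
        odom.pose.pose.position.x = x
        odom.pose.pose.position.y = y
        odom.pose.pose.orientation = Quaternion(*q)
        odom_pub.publish(odom)

        last = now
        rate.sleep()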

Obstacle Avoidance: the obstacle avoidance node subscribes to the /scan topic published by the rplidar package and adjusts steering according to a few rules. The default trajectory comes from the lane guidance script, which follows the dotted line. When the lidar scan indicates an obstacle ahead, the published steering angle changes to a trajectory that avoids it; once the obstacle is cleared, the node undoes the path change and returns control to the steering provided by the lane guidance. A minimal sketch of this rule appears below.
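The sketch watches a narrow forward sector of the scan and publishes a swerve value when something is too close; the topic names, sector width, and distance threshold are all assumptions:

    #!/usr/bin/env python
    # Sketch of the obstacle-avoidance rule: if any beam within the forward
    # sector is closer than a threshold, publish a steering override;
    # otherwise publish nothing and lane guidance keeps control.
    import math
    import rospy
    from sensor_msgs.msg import LaserScan
    from std_msgs.msg import Float32

    STOP_DIST = 0.5  # metres; assumed obstacle threshold
    steer_pub = rospy.Publisher('steering_override', Float32, queue_size=1)

    def scan_cb(scan):
        # indices of beams within +/- 15 degrees of straight ahead
        # (assumes the forward direction sits mid-array for this mount)
        half = int(math.radians(15) / scan.angle_increment)
        mid = len(scan.ranges) // 2
        ahead = [r for r in scan.ranges[mid - half: mid + half]
                 if scan.range_min < r < scan.range_max]
        if ahead and min(ahead) < STOP_DIST:
            steer_pub.publish(Float32(0.8))  # swerve away from the obstacle

    rospy.init_node('obstacle_avoidance')
    rospy.Subscriber('scan', LaserScan, scan_cb)
    rospy.spin()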

Schematic:

Initially we wanted to use AMCL, which uses particle filtering for localization, for better pose estimation, but due to time constraints the package was not used, resulting in the ROS schematic shown below.

Team1.6.png


RVIZ:

RVIZ is a program for visualizing the data streams flowing through ROS nodes. For example, once set up, the map produced while SLAM is running can be visualized in the program.

When running the car, the Jetson Nano is not connected to a display, so to visualize the data streams we needed a Virtual Network Computing (VNC) viewer, which acts like a remote desktop system. We settled on NoMachine because it was simple to set up and get working. Once installed, the following steps get RVIZ running on the Jetson Nano and displaying the LaserScan message:

Install NoMachine on the VM.

Launch NoMachine and configure it to connect to the Jetson using its IP address. SSH into the Jetson Nano with display forwarding from the VM, e.g.: ssh -Y jetson@ip…

   To re-enable the GUI, issue the command:
   sudo systemctl set-default graphical.target
   Or, to start a GUI session on a system without a current GUI, execute:
   sudo systemctl start gdm3.service
   To launch the lidar visualization:
   roslaunch rplidar_ros view_rplidar.launch
   To stop the GUI:
   sudo systemctl stop gdm3.service

More info on the NoMachine Virtual Desktop for the Jetson: Linux Download, Jetson ARM Download


Results:

Steering

RVIZ

Future plans/improvements:

Include AMCL (http://wiki.ros.org/amcl) to improve localization accuracy through sensor fusion in the estimation

Better mounting for a more efficient weight distribution

Other documentation

Slides

Simple Git Guide

ROS Dependencies (Github)