From MAE/ECE 148 - Introduction to Autonomous Vehicles

The goal of this project is to expand upon the previous quarter's group project, Project GPS. We wish to increase the accuracy and reduce the variance of the GPS-based driving so that the car will drive a straight path to a given waypoint. Obstacle avoidance will also be implemented via ultrasonic sensors.

Team Members

Spring 2018

  • Simon Fong, Electrical Engineering BS
  • Saurabh Kulkarni, Computer Science BS
  • Kevin Tan, Aerospace Engineering BS
  • Yingqiao Jiang, Mechanical Engineering BS

Project Links

The Idea

Following the build and training of the autonomous vehicle to drive laps around the track, Team 3 decided to pursue a project involving autonomous navigation. The goal was for the car to be able to drive to a given input trajectory. Utilizing a GPS system and an IMU, a control system was implemented to minimize the error between the car's angle and the angle to the coordinate it should be heading towards.
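As a hedged illustration of this idea (the function names, the proportional gain, and the 0-360 degree convention are our assumptions here, not the team's actual planner.py code), the control law can be sketched as:

```python
def heading_error(current_heading, target_heading):
    """Smallest signed angle in degrees from the car's heading to the
    bearing of the next waypoint; result is in [-180, 180)."""
    return (target_heading - current_heading + 180.0) % 360.0 - 180.0

def steering_command(current_heading, target_heading, gain=0.01):
    """Proportional steering in [-1, 1]; positive turns toward the target."""
    error = heading_error(current_heading, target_heading)
    return max(-1.0, min(1.0, gain * error))
```

Each control cycle, the car compares its measured heading against the bearing to the waypoint and steers proportionally to the difference.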



Raspberry Pi 3

  • PWM Controller (PCA9685) via I2C
  • GPS (GP-20U7) via Serial
  • Arduino Micro via Serial

Arduino Micro

  • 2 Ultrasonics (HC-SR04) via Digital I/O


[Image: SP18 T03 full wiring diagram]
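On the Pi side, the distances measured by the Arduino can be read back over the serial link shown in the wiring above. This is only a sketch under assumptions we are making for illustration: that the Arduino prints one "low,high" pair of centimeter readings per line, and that it enumerates as /dev/ttyACM0 at 115200 baud.

```python
def parse_distances(line):
    """Parse one b'low,high' line of centimeter readings from the Arduino."""
    low_cm, high_cm = (float(v) for v in line.decode("ascii").strip().split(","))
    return low_cm, high_cm

def read_distances(port="/dev/ttyACM0", baud=115200):
    """Yield (low, high) distance pairs as the Arduino streams them."""
    import serial  # pyserial: pip install pyserial
    with serial.Serial(port, baud, timeout=1) as ser:
        while True:
            line = ser.readline()
            if line:
                yield parse_distances(line)
```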


Building off of ProjectGPS, the team added the gps.py and planner.py parts to the donkeycar system, with its own modifications.

GPS Test Script


This script, when run, outputs a data.json file holding GPS points in the JSON format that the visualizer can display on a map.
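A hedged reconstruction of what the core of such a script might look like (the GP-20U7 emits standard NMEA sentences over serial; the helper names and the marker format below are our assumptions, with the format matching the data.json example later on this page):

```python
import json

def nmea_to_degrees(value, hemisphere):
    """Convert NMEA (d)ddmm.mmmm plus an N/S/E/W hemisphere to signed degrees."""
    raw = float(value)
    degrees = int(raw // 100)
    minutes = raw - degrees * 100
    decimal = degrees + minutes / 60.0
    return -decimal if hemisphere in ("S", "W") else decimal

def parse_gpgga(sentence):
    """Return (lat, lng) from a $GPGGA sentence, or None if there is no fix."""
    fields = sentence.split(",")
    if not fields[0].endswith("GGA") or fields[2] == "":
        return None
    return (nmea_to_degrees(fields[2], fields[3]),
            nmea_to_degrees(fields[4], fields[5]))

def save_points(points, path="data.json"):
    """Write (lat, lng) pairs in the marker format the visualizer reads."""
    markers = [{"position": {"lat": lat, "lng": lng},
                "label": "Start: 0" if i == 0 else str(i)}
               for i, (lat, lng) in enumerate(points)]
    with open(path, "w") as f:
        json.dump({"markers": markers}, f, indent=4)
```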

GPS Visualizer




Install python web development framework.

pip install flask
Google Maps API
  1. Sign up for an account on Google Cloud Platform.
  2. Get API Key
  3. Change the key= in index.html
<script src="https://maps.googleapis.com/maps/api/js?key=AIzaSyB6JltnkeQh3hVYnLu1X4HB9yt5mZ8vTrg&callback=initMap"
   async defer></script>

Helpful tutorial here.


  1. Update data.json so that it contains all the GPS points you want to show.
  2. Run the command below.
  3. Go to the device's IP address at port 3148 using a web browser.

Command for step 2

python server.py

After step 2, you should see this:

 * Serving Flask app "server" (lazy loading)
 * Environment: production
  WARNING: Do not use the development server in a production environment.
  Use a production WSGI server instead.
 * Debug mode: off
 * Running on (Press CTRL+C to quit)

Open the page in a web browser while on the same network as the device, using the device's IP address and port 3148.
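For reference, a minimal server.py along these lines (a sketch, not the team's actual file; it assumes index.html and data.json sit in the working directory) could be:

```python
import os
from flask import Flask, send_from_directory

app = Flask(__name__)

@app.route("/")
def index():
    # The Google Maps page that plots the markers.
    return send_from_directory(os.getcwd(), "index.html")

@app.route("/data.json")
def data():
    # The map page fetches the recorded GPS markers from here.
    return send_from_directory(os.getcwd(), "data.json")

# To serve on the LAN at port 3148, run:
#     app.run(host="0.0.0.0", port=3148)
```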


   "markers": [
           "position": {
               "lat": 32.88126233333333, 
               "lng": -117.23551116666667
           "label": "Start: 0"
           "position": {
               "lat": 32.881258833333334, 
               "lng": -117.23551266666666
           "label": "1"


The IMU provided used the InvenSense MPU 92/65 9-axis sensor. The IMU gave readings for acceleration, magnetic field, and rotation rate in the x, y, and z directions, and came equipped with a Digital Motion Processor (DMP) that runs motion-fusion algorithms to filter out noise and magnetic drift.

We found a C library that uses the DMP in the 92/65 sensor to read the data from the IMU to the Raspberry Pi.

A Python library was provided by Professor de Oliveira to help us integrate that functionality into the donkeycar framework.

Unfortunately, the problem with this setup was that the heading would drift far too quickly for the absolute angle reading to be useful over intervals longer than a few seconds. This could be because the library had no calibration files for the magnetometer or the gyroscope. We were not able to figure out how to create these calibration files, but we believe it has something to do with the calibration scripts in the rc_mpu C library: running the 'make' command generates executables and stores them in the 'bin/' directory. That was all we could figure out.

Professor de Oliveira told us we could potentially use the IMU for short periods of time and use the GPS to determine the correct heading between GPS readings, but we think it's far better to simply draw straight lines between GPS readings.
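Computing the heading implied by two consecutive GPS fixes is straightforward; a sketch using the standard initial great-circle bearing formula (the function name is ours):

```python
import math

def bearing_between(lat1, lng1, lat2, lng2):
    """Initial bearing in degrees (0 = north, 90 = east) from fix 1 to fix 2;
    over the few meters between GPS readings this is effectively the
    straight-line heading."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlng = math.radians(lng2 - lng1)
    y = math.sin(dlng) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlng))
    return math.degrees(math.atan2(y, x)) % 360.0
```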

Ultrasonics Object Detection

Another improvement over the ProjectGPS frame is the use of ultrasonic sensors for object avoidance. The team successfully incorporated the ultrasonic sensors as a donkeycar part.

We used two HC-SR04 sensors. Initially, both faced parallel to the ground but a few degrees off the car's line of symmetry. However, this setup could not detect obstacles that were very narrow at the car's level, such as human legs. We therefore placed both sensors along the line of symmetry, one angled low for obstacles such as curbs, and the other angled up for obstacles such as people.

We also implemented a basic low-pass filter to smooth the potentially noisy data that the ultrasonics would sometimes return due to moving obstacles.

new_distance = WEIGHT*old_distance + (1-WEIGHT)*new_reading

As WEIGHT increases, the filtering becomes stronger and the output becomes less sensitive to sudden, unsustained changes in the reading. We set it to 0.3 because the car was quite slow and had no need to react quickly. You may want to lower it if you decide to have the car drive faster.
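Wrapped as a small class (a sketch; the class name and the choice to seed the filter with the first raw reading are ours):

```python
class LowPassFilter:
    """Exponential smoothing of ultrasonic readings:
    new_distance = WEIGHT*old_distance + (1-WEIGHT)*new_reading."""

    def __init__(self, weight=0.3):
        self.weight = weight
        self.distance = None

    def update(self, reading):
        if self.distance is None:
            self.distance = reading  # seed with the first raw reading
        else:
            self.distance = (self.weight * self.distance
                             + (1.0 - self.weight) * reading)
        return self.distance
```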

Here is a video demo of the ultrasonic sensors on our car:


Despite multiple attempts at correction and filtering, the car was still unable to perform as desired when incorporating data from the IMU. The overall consensus reached by Team03 was that using the MPU9250 only worsened the performance of the GPS navigation.

Possible Improvements

  • Buy a better IMU such as the LSM9DS1
  • Implement better filtering algorithms
  • Incorporate computer vision