2021WinterTeam4

From MAE/ECE 148 - Introduction to Autonomous Vehicles
Team 4 Final Robot

The project encompassed multiple aspects during the course. The first 6 weeks of the class focused on deploying the DonkeyCar AI framework on an NVIDIA Jetson Nano development board. The second half of the class focused on running the vehicle using a simple ROS package while also integrating a SparkFun OpenLog Artemis IMU (https://www.sparkfun.com/products/16832) into the system to perform position estimation using a simple integration scheme.

Team Members

- Layne Clemen (UCSD Extension)
- Ayush Gagger (MAE)
- Daniel Lopez Villa (MAE)
- Zack Liu (ECE)

Our team consisted of two mechanical engineering students (Ayush and Daniel), one electrical engineering student (Zack), and one UCSD Extension student (Layne).




BasePlateTeam4.PNG

Mechanical Design

Team 6 Camera Mount

The camera mounting system was designed with flexibility and calibration in mind. Because the effectiveness of autonomous driving depends heavily on the quality of the image data, the orientation of the camera is extremely important. To manufacture the assembly efficiently, the design was broken down into smaller, more manageable components to reduce printing time and complexity. The three main components of the mount are the angle brackets, the struts, and the mounting block. The modular design of the assembly allows for 8 angle orientations as well as 10 vertical heights, which proved effective when troubleshooting for the optimal configuration.

The camera mount attaches to the front mounting block of the car via two 4 mm bolts, providing a universal design independent of the baseplate and electronics configuration. This universal mounting proved useful to Team 6 in particular, who used these files to streamline their own camera mounting assembly. Other teams can reuse this design without any interference with the main build of the car.







Circuit Schematics

Schematic 148-04.png

The system consists of the following components:

  • Jetson Nano Developer Kit
  • Adafruit PWM controller
  • USB webcam
  • LED indicator board
  • Battery pack
  • Battery voltage sensor
  • Remote relay (EMO)
  • 5V DC-DC voltage regulator
  • Switch




Algorithm Development

The algorithm follows simple Euler integration of a planar rigid-body system.

INSERT BICYCLE CAR DIAGRAM

From the IMU the following signals are available:

  1. Longitudinal Acceleration
  2. Lateral Acceleration
  3. Vertical Acceleration
  4. Roll Rate
  5. Pitch Rate
  6. Yaw Rate
  7. Longitudinal Magnetic Field
  8. Lateral Magnetic Field
  9. Vertical Magnetic Field

These signals are fused in the firmware developed by Francis Le Bars to output the orientation of the vehicle in quaternion form. The original idea was to use the heading output from the Artemis to calculate x- and y-velocities relative to the starting position of the vehicle; these speeds could then be integrated to estimate the position of the vehicle. It is important to take into account that the accelerations measured by the IMU are expressed in the rotating body frame, so they include rotational components in addition to the longitudinal and lateral accelerations.

After struggling with the heading output from the firmware, it was determined that the heading was incorrect; the team did not have time to determine whether this was due to improper calibration or a faulty sensor. To work around this, the yaw rate was integrated directly to obtain the yaw angle of the vehicle. Testing also showed that low-pass filtering the bias-corrected accelerations was required. There are many more improvements to be made, but the final algorithm for the project is below:
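For context (this derivation is not in the original write-up), the rotational terms in the velocity updates below come from the standard relation for an accelerometer fixed in the rotating body frame of a planar rigid body, where U and V are the body-frame longitudinal and lateral speeds and r is the yaw rate:

  a_{long} = \dot{U} - V\,r, \qquad a_{lat} = \dot{V} + U\,r
  \quad\Longrightarrow\quad \dot{U} = a_{long} + V\,r, \qquad \dot{V} = a_{lat} - U\,r

The algorithm integrates these relations with a forward-Euler step.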

  1. Measure the first 1 second of data and calculate the average bias
  2. At each time step, do the following
    1. a_long_filtered(k+1) = k1*(a_long - bias_long) + k2*a_long_filtered(k)
    2. a_lat_filtered(k+1) = k1*(a_lat - bias_lat) + k2*a_lat_filtered(k)
    3. U_dot = a_long_filtered + V*omega_yaw
    4. V_dot = a_lat_filtered - U*omega_yaw
    5. U(k+1) = U(k) + U_dot*Delta_t
    6. V(k+1) = V(k) + V_dot*Delta_t
    7. Deadband U and V (to actually stop the car)
    8. yaw(k+1) = yaw(k) + yaw_rate*Delta_t
    9. Rotate U and V into x_dot and y_dot
      1. x_dot = U*cos(yaw) - V*sin(yaw)
      2. y_dot = U*sin(yaw) + V*cos(yaw)
    10. x(k+1) = x(k) + x_dot*Delta_t
    11. y(k+1) = y(k) + y_dot*Delta_t

Here k1 and k2 are the coefficients of a basic discrete first-order low-pass filter with a break frequency of 5 rad/s. A Python sketch of the full scheme is shown below.
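As an illustration only (not the team's actual implementation, which lives in the GitHub repository linked below), a minimal Python sketch of this dead-reckoning scheme could look like the following. The sample period, the deadband threshold, and the backward-Euler form of the filter coefficients are assumptions made for this example.

  import numpy as np

  # Assumed parameters for this sketch (not taken from the team's code)
  DT = 0.01          # IMU sample period [s] -- assumption
  OMEGA_B = 5.0      # low-pass break frequency [rad/s], per the write-up
  DEADBAND = 0.05    # speed deadband [m/s] -- assumption

  # Discrete first-order low-pass filter coefficients (backward-Euler form)
  k1 = OMEGA_B * DT / (1.0 + OMEGA_B * DT)
  k2 = 1.0 - k1

  def estimate_position(a_long, a_lat, yaw_rate, dt=DT):
      """Dead-reckon x/y position from IMU signals by simple Euler integration.

      a_long, a_lat, yaw_rate: equal-length arrays sampled every dt seconds.
      Returns arrays of the estimated x and y positions.
      """
      n = len(a_long)
      n_bias = max(1, int(round(1.0 / dt)))   # first 1 s of data for the bias
      bias_long = np.mean(a_long[:n_bias])
      bias_lat = np.mean(a_lat[:n_bias])

      a_long_f = a_lat_f = 0.0                # filtered accelerations
      U = V = yaw = 0.0                       # body-frame speeds and heading
      x = np.zeros(n)
      y = np.zeros(n)

      for k in range(1, n):
          # Low-pass filter the bias-corrected accelerations
          a_long_f = k1 * (a_long[k] - bias_long) + k2 * a_long_f
          a_lat_f = k1 * (a_lat[k] - bias_lat) + k2 * a_lat_f

          # Planar rigid-body kinematics in the rotating body frame
          U_dot = a_long_f + V * yaw_rate[k]
          V_dot = a_lat_f - U * yaw_rate[k]
          U += U_dot * dt
          V += V_dot * dt

          # Deadband the speeds so the estimate actually stops with the car
          if abs(U) < DEADBAND:
              U = 0.0
          if abs(V) < DEADBAND:
              V = 0.0

          # Integrate yaw rate, then rotate body speeds into the world frame
          yaw += yaw_rate[k] * dt
          x_dot = U * np.cos(yaw) - V * np.sin(yaw)
          y_dot = U * np.sin(yaw) + V * np.cos(yaw)
          x[k] = x[k - 1] + x_dot * dt
          y[k] = y[k - 1] + y_dot * dt

      return x, y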

Python Code

Please see the Python code, as well as all of our final project code, in our GitHub repository here: https://github.com/lclemen/MAE-148-team-4


Demo

Future Recommendations

Position estimation is a difficult problem that requires considerations beyond the scope of this quarter. This project is a first step that can be built upon by future teams. A few ideas to investigate include:

  • AHRS issues with internal firmware
  • Additional filters on the signals
  • Fusing the algorithm with an encoder to perform dead reckoning
  • Implementing a Kalman filter instead of the simple integration scheme

Resources