From MAE/ECE 148 - Introduction to Autonomous Vehicles


The goal of our project is to follow a person using IR sensors. In the real world, this concept could be applied to an indoor self-following shopping cart system for supermarkets: the car follows a person who wears an IR LED patch, keeping within a certain range.

Fully installed robocar

Team Members

Jose Manuel Rodriguez - ME

Yuan Chen - ME

Jason Nan - BENG: BSYS

Po Hsiang Huang - EE


Plate and Camera Mount
The camera mount is designed to be adjustable and locks in place with a screw and a nut. During training we needed to find the optimal camera angle, so making the camera stand adjustable saved us a lot of time. The stand for the Structure Sensor is set to 30 degrees of inclination; the camera has a wide 160-degree field of view and needs to capture the IR patch attached to the back of a person. The acrylic platform is cut on a laser cutter, with multiple mounting holes for a variety of mounting-hardware choices.

3D printed camera mount for the pi-camera
3D printed camera mount for Structure Sensor
3D printed housing for Raspberry Pi
Circuit Diagram
Below is the circuit diagram for the robot car setup. An emergency-stop circuit built around a remotely controlled relay adds extra safety during testing and training. Two LEDs indicate the connection state: blue means the circuit is connected and the car can be controlled by the controller; red means the circuit is cut off.
Circuit diagram for autonomous driving. Drawn using Fritzing
The LED patch consists of 25 IR LEDs
Structure Sensor

Autonomous Driving

Indoor Circuit
Here is a video of 7 laps of indoor driving
Outdoor Circuit
Here is a video of 3 laps of outdoor driving


Target Following using Structure Sensor
We use the IR sensor from the Structure Sensor to detect the IR LED patch attached to the back of the target being followed. Using a modified Donkey training program, we can train the car to follow the LED patch within a distance of 1 m.
IR patch setup
Left: Depth view; Right: Pure IR Image
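As a rough illustration of how a bright patch can be located in an IR frame, here is a minimal sketch in pure NumPy. It assumes an 8-bit grayscale IR image; the threshold value and frame size are illustrative assumptions, not measurements from our setup:

```python
import numpy as np

def find_patch_center(ir_image, threshold=200):
    """Return the (row, col) centroid of pixels brighter than `threshold`,
    or None if no pixel exceeds it (patch not visible)."""
    bright = ir_image >= threshold          # boolean mask of bright IR pixels
    if not bright.any():
        return None
    rows, cols = np.nonzero(bright)
    return float(rows.mean()), float(cols.mean())

# Synthetic 120x160 IR frame with a bright 10x10 "patch" near the right edge
frame = np.zeros((120, 160), dtype=np.uint8)
frame[50:60, 130:140] = 255
print(find_patch_center(frame))  # → (54.5, 134.5)
```

The centroid's horizontal offset from the image center could then be fed to steering, which is essentially what the trained network learns implicitly.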
Neural Network
Despite what the name suggests, the neural network in our autonomous driving training functions very differently from our brains. The core idea is visual pattern recognition: by learning from a large number of examples that show which actions were taken under which circumstances, the program builds up enough learned parameters to somewhat 'understand' what to do in a given situation. In this class, using the camera to recognize the track lines, the Donkey framework generates throttle and steering outputs during autonomous driving based on our inputs during training.
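The actual pilot in the Donkey framework is a Keras convolutional network trained on recorded camera frames and driver inputs. As a toy illustration of the idea (learned weights mapping pixels to a steering/throttle pair), here is a single dense layer in NumPy; the frame size, random weights, and squashing functions are assumptions for illustration only, not the real model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a trained network: one dense layer mapping a flattened
# 120x160 grayscale frame to two outputs (steering and throttle).
W = rng.normal(scale=1e-4, size=(120 * 160, 2))
b = np.zeros(2)

def pilot(frame):
    """Forward pass: pixels -> (steering, throttle)."""
    x = frame.astype(np.float64).ravel() / 255.0   # normalize pixels to [0, 1]
    s, t = x @ W + b
    steering = np.tanh(s)            # squash to [-1, 1]
    throttle = 1 / (1 + np.exp(-t))  # squash to [0, 1]
    return float(steering), float(throttle)

steering, throttle = pilot(rng.integers(0, 256, size=(120, 160)))
```

Training amounts to adjusting `W` and `b` (and, in the real CNN, many convolutional layers before them) so the outputs match what the human driver did for similar frames.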
IR Patch
The IR LED patch is wired according to the following circuit diagram, driving the LEDs at their maximum allowed current for maximum brightness.
Circuit diagram for LED patch
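The exact component values are in the diagram above; as a sanity check on the series-resistor choice, the usual calculation is R = (V_supply - n * V_f) / I_max. A quick sketch (the 5 V supply, 1.2 V forward drop, and 100 mA drive current below are illustrative assumptions, not the values from our build):

```python
def series_resistor(v_supply, v_forward, n_series, i_max):
    """Minimum series resistance (ohms) that limits LED current to i_max."""
    v_drop = v_supply - n_series * v_forward   # voltage left across the resistor
    if v_drop <= 0:
        raise ValueError("supply too low for this many LEDs in series")
    return v_drop / i_max

# e.g. 5 V supply, one IR LED (Vf ≈ 1.2 V) per string, driven at 100 mA:
print(round(series_resistor(5.0, 1.2, 1, 0.1), 1))  # → 38.0 ohms
```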
Issues with dark IR images
Although we drove the LEDs at maximum brightness, the Structure Sensor still struggled to pick up a clear, distinctive image of the patch, even indoors, due to the patch's limited physical range of about 1 foot. A much stronger IR patch is needed for better image capture on the Structure Sensor side.
Very faint and weak recognition for the IR patch
Issues with OpenNI2
OpenNI2 is pretty finicky. DO NOT USE THE PACKAGE FROM THE STRUCTURE SENSOR PAGE! Go to the OpenNI2 GitHub repository and compile it locally on your Pi: git clone it, go into the resulting OpenNI2 folder, and type "make" in the command line.

If you run into this error:

/usr/include/arm-linux-gnueabihf/gnu/stubs.h:7:29: fatal error: gnu/stubs-soft.h: No such file or directory
 # include &lt;gnu/stubs-soft.h&gt;


add the -mfloat-abi=hard compiler flag to the build (make itself does not accept the flag directly; pass it through the compiler flags, e.g. make CFLAGS=-mfloat-abi=hard) and it should fix it.

Here is the weird part of OpenNI2: you have to manually change the config files "PSLink.ini" and "PS1080.ini". Find these files under .../OpenNI2/Config/OpenNI2/Drivers

You want to change the line ";UsbInterface=2" to "UsbInterface=0". Note that you are removing the semicolon; this is on purpose.
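If you would rather script the edit than do it by hand, something like the following sketch works; it simply uncomments the line and switches the value, and the paths in the commented-out loop are assumptions about where your checkout lives:

```python
from pathlib import Path

def enable_usb_bulk(ini_path):
    """Rewrite ';UsbInterface=2' as 'UsbInterface=0' in an OpenNI2 driver .ini."""
    path = Path(ini_path)
    lines = []
    for line in path.read_text().splitlines():
        if line.strip() == ";UsbInterface=2":
            line = "UsbInterface=0"        # uncomment AND switch the mode
        lines.append(line)
    path.write_text("\n".join(lines) + "\n")

# Apply to both driver configs (adjust the directory to your checkout):
# for name in ("PS1080.ini", "PSLink.ini"):
#     enable_usb_bulk(f"OpenNI2/Config/OpenNI2/Drivers/{name}")
```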

Before you try working on it on the RPi, download the driver onto your personal laptop to see the camera in action. You will have to change the same two config files there as well.

OpenCV and OpenNI on Python
These are libraries you have to install to use the Structure Sensor. The OpenNI bindings can be installed with pip (pip install openni) and should not cause many problems.

Integrate to Donkey Framework
Fall 2018 Team 3 did this integration. You can get the part and the altered manage.py file from their team page.
We were able to acquire pictures and data on a PC successfully; however, we were unable to integrate the Structure Sensor with the Raspberry Pi due to OpenNI2 installation errors. Another issue is that our IR LED patch is not strong enough for the Structure Sensor to pick up clearly. We encountered many setbacks and unexpected difficulties during this course.

For future groups who intend to work on a similar project: do not drive in sunny, hot weather, as your micro SD card might melt under such conditions. If possible, try to find a different IR sensor. Although the Structure Sensor looks cool and is quite expensive, the software it uses is quite outdated: OpenNI2 was in general use around 2012-2016, which could cost teams a lot of time researching and troubleshooting without finding any good solutions.