The goal of our project is to follow a person using IR sensors. Under real-world circumstances, the concept of our project could be applied to an indoor self-following shopping cart system for supermarkets. The car can follow a person wearing an IR LED patch within a certain range.
Jose Manuel Rodriguez - ME
Yuan Chen - ME
Jason Nan - BENG: BSYS
Po Hsiang Huang - EE
- Plate and Camera Mount
- The camera mount is designed to be adjustable and can be easily locked in place with a screw and a nut. During training we needed to find the optimal camera angle, so the adjustable stand saved us a lot of time. The stand for the Structure Sensor is set to 30 degrees of inclination; the camera has a wide 160-degree field of view and needs to capture the IR patch attached to the back of a person. The acrylic platform is cut on a laser cutter, with multiple mounting holes for a variety of mounting hardware choices if applicable.
- Circuit Diagram
- Below is the circuit diagram for the robot car setup. An emergency-stop circuit, built around a relay that can be triggered remotely, adds extra safety during testing and training. Two LEDs indicate the connection state: blue means the circuit is connected and the car can be controlled by the controller; red means the circuit is cut off.
- Indoor Circuit
- Here is a video of 7 laps of indoor driving.
- Outdoor Circuit
- Here is a video of 3 laps of outdoor driving.
- Target Following using Structure Sensor
- We use the IR sensor from the Structure Sensor to detect the IR LED patch attached to the back of the target. Using the modified Donkey training program, we can train the car to follow the LED patch within a distance of 1 m.
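As a rough illustration of how the LED patch could be located in an IR frame, here is a minimal sketch. The function name, threshold value, and centroid approach are our own illustrative assumptions, not part of the Donkey framework or the Structure Sensor SDK:

```python
import numpy as np

def locate_patch(ir_frame, threshold=0.9):
    """Return the (row, col) centroid of the brightest region in an IR frame.

    ir_frame:  2-D numpy array of IR intensities.
    threshold: keep pixels at or above this fraction of the frame's peak.
    Returns None if the frame is completely dark.
    """
    frame = ir_frame.astype(float)
    peak = frame.max()
    if peak <= 0:
        return None  # nothing bright enough to be the LED patch
    mask = frame >= threshold * peak
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# Synthetic example: a bright 3x3 "patch" centered at row 10, col 20.
frame = np.zeros((60, 80))
frame[9:12, 19:22] = 255.0
print(locate_patch(frame))  # -> (10.0, 20.0)
```

In a real pipeline this centroid (or the raw frame itself) would be what the training program sees, so the patch must stand out clearly against the background, which is exactly where our setup ran into trouble.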
- Neural Network
- Despite the name, the neural network in our autonomous driving training functions very differently from our brains. The core idea is visual pattern recognition: by learning from a large number of examples that we present, the program is taught what actions are taken under what circumstances. With enough training, the program builds up a data set large enough to somewhat 'understand' what to do under certain circumstances. In this class, by using the camera to recognize the track lines, the Donkey framework generates throttle and steering outputs in autonomous driving based on our inputs during training.
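The learn-from-examples idea can be sketched in a few lines. This is not the Donkey framework's actual model (that is a convolutional network); it is a deliberately tiny stand-in that fits a linear map from the patch's horizontal position in the image to a recorded steering command, just to show how "examples in, behavior out" training works:

```python
import numpy as np

# Training "examples": patch x-position in the image (0 = left edge,
# 1 = right edge) paired with the steering we applied while driving.
patch_x  = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
steering = np.array([-0.8, -0.4, 0.0, 0.4, 0.8])  # -1 = hard left, +1 = hard right

# Fit steering ~= w * patch_x + b by least squares (a one-neuron "network").
A = np.stack([patch_x, np.ones_like(patch_x)], axis=1)
(w, b), *_ = np.linalg.lstsq(A, steering, rcond=None)

def predict_steering(x):
    """Autonomous mode: map a new patch position to a steering output."""
    return w * x + b

print(predict_steering(0.5))   # patch centered -> steer straight (~ 0.0)
print(predict_steering(0.95))  # patch far right -> steer right (~ 0.9)
```

A real network does the same thing at a much larger scale: instead of one hand-picked feature it consumes the whole camera image, and instead of two fitted numbers it adjusts millions of weights, but the training loop is still "show examples, minimize the error."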
- IR Patch
- The IR LED patch is wired according to the following circuit diagram, allowing the maximum rated current to flow through the LEDs for maximum brightness.
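The diagram itself carries the component values; as a hedged illustration of the sizing behind it, here is the standard current-limiting calculation with assumed numbers (5 V supply, 1.5 V forward drop, 100 mA rated current are typical for IR LEDs, not values read off our actual diagram):

```python
# Sizing the current-limiting resistor for one IR LED string.
# All component values below are illustrative assumptions: a typical
# 940 nm IR LED drops about 1.5 V and is rated for roughly 100 mA of
# continuous forward current.
V_SUPPLY  = 5.0   # volts
V_FORWARD = 1.5   # volts dropped across the LED
I_MAX     = 0.1   # amps, the LED's maximum rated continuous current

# Ohm's law across the resistor: R = (V_supply - V_forward) / I.
r_ohms = (V_SUPPLY - V_FORWARD) / I_MAX

# Power the resistor must dissipate: P = V * I.
p_watts = (V_SUPPLY - V_FORWARD) * I_MAX

print(r_ohms)   # about 35 ohms
print(p_watts)  # about 0.35 W, so a 1/2 W resistor would be a safe choice
```

Running the LEDs right at their rated current is what "maximum brightness" means in practice; going below the computed resistance would push the LEDs past their rating.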
- Issues with dark IR images
- Although we drove the LEDs at maximum brightness, the Structure Sensor still struggled to pick up a clear, distinctive image of the patch, even indoors, due to the limited effective range of about 1 foot. A much stronger IR patch is needed for better image capture on the Structure Sensor side.
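Short of a stronger patch, one software-side mitigation worth trying is stretching the contrast of the dark IR frames before handing them to training. This is a sketch of our own, not a feature of the Structure Sensor SDK, and it only helps when the patch is at least faintly visible:

```python
import numpy as np

def stretch_contrast(ir_frame, out_max=255.0):
    """Linearly rescale a dark IR frame so its values span 0..out_max."""
    frame = ir_frame.astype(float)
    lo, hi = frame.min(), frame.max()
    if hi == lo:
        return np.zeros_like(frame)  # flat frame: nothing to stretch
    return (frame - lo) * (out_max / (hi - lo))

# A dim frame whose values only span 10..30 becomes full-range 0..255.
dim = np.array([[10.0, 20.0],
                [30.0, 10.0]])
bright = stretch_contrast(dim)
print(bright.max(), bright.min())  # -> 255.0 0.0
```

Stretching also amplifies sensor noise, so it is a stopgap rather than a substitute for a brighter patch.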
- Issues with OpenNI2
- Integration into the Donkey Framework
- We were able to acquire pictures and data on a PC successfully; however, we were unable to integrate the Structure Sensor with the Raspberry Pi due to OpenNI2 installation errors. Another issue is that our IR LED patch is not strong enough for the Structure Sensor to pick up clearly. We encountered many setbacks and unexpected difficulties during this course. For future groups who intend to work on a similar project: do not drive in hot, sunny weather, as your micro SD card might melt under such conditions, and if possible, try to find a different IR sensor. Although the Structure Sensor looks cool and is quite expensive, the software it relies on is quite outdated. OpenNI2 was in general use around 2012-2016, which can cost teams a lot of time researching and troubleshooting without finding any good solutions.
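For reference, the integration we were attempting would have taken the shape of a Donkey "part": an object whose run() method the vehicle loop polls each cycle. Below is a hedged skeleton of what our Structure Sensor part might have looked like. The frame_source callable stands in for the OpenNI2 stream we never got working on the Pi, and the class name and constructor are our own, not part of the Donkey or OpenNI2 APIs:

```python
class StructureSensorPart:
    """Donkey-style part wrapping an IR camera.

    frame_source: a zero-argument callable returning the latest IR frame.
    In a working setup this would read from an OpenNI2 video stream; here
    it is injected so the part can be exercised without any hardware.
    """

    def __init__(self, frame_source):
        self.frame_source = frame_source
        self.frame = None

    def run(self):
        # Called once per loop iteration by the vehicle; the returned
        # value is what downstream parts (e.g. the autopilot) consume.
        self.frame = self.frame_source()
        return self.frame

    def shutdown(self):
        # Called on exit; a real part would close the sensor stream here.
        self.frame = None

# Exercising the part with a stub source in place of the real sensor:
part = StructureSensorPart(lambda: [[0, 255], [255, 0]])
print(part.run())  # -> [[0, 255], [255, 0]]
```

Writing the wrapper this way would also have let us develop and test the rest of the pipeline on a PC while the OpenNI2-on-Pi install problem was still unsolved.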