SEEKER | UCSD ROBOCAR
MAE/ECE148 Team1 | Winter 2022
Our robocar, the Seeker, is designed to search for, locate, navigate to, and collect red ping pong balls.
- Parker Knopf (MAE)
- Jacob Bingham (MAE)
- Moises Lopez (ECE)
- Guy Shabtai (ECE)
This robot was designed to seek and pick up red ping pong balls scattered around an environment.
The robocar locates the ping pong balls with an RGBD camera. It then drives to a ball and performs a recovery maneuver using the webcam. It is equipped with a suction tube to pick up the ping pong balls.
- Locate ping pong balls in an open and unobstructed environment
- Pick up ping pong balls
- Use Lidar for collision avoidance
- Make decisions on what ping pong balls to pick up first
- All elements of Phase 1
- Locate ping pong balls with obstructions where they must be sought out by navigating around the environment
- ROS1 SLAM
- Extensive use of Lidar
- This project is to be run using Linux on a Jetson Nano
- Download Docker: https://hub.docker.com/r/djnighti/ucsd_robocar
docker pull djnighti/ucsd_robocar
- Download Repository: https://github.com/gshabtai/ece148-team1.git
- Start Docker
$ # Attach to Docker
$ cd src
$ git clone https://github.com/gshabtai/ece148-team1.git
$ cd ..
- Configure the Robot using the schematic
- Boot up the Jetson Nano and connect through SSH
- Start the docker
<source lang="bash">$ # Start Docker</source>
- Source ROS with the Docker-integrated command
<source lang="bash">$ source_ros2</source>
- Run the program
<source lang="bash">$ ./src/ece148-team1/seeker.sh</source>
This project was hardware intensive, as it required a significant amount of additional components to capture and store ping pong balls. In addition to the adjustable Intel camera mount, an intake system was designed and fabricated. This intake system was fitted with two high-speed fans, creating a vacuum that sucks the ping pong balls up into the basket.
An image of the intake system can be seen here:
All CAD was done using SolidWorks. A snippet of the intake system CAD can be seen here:
Intel Camera Mount
The Intel camera mount CAD can be seen here:
This project added three components of electronic hardware to the provided MAE/ECE 148 kit: a relay and two high-speed fans. A schematic for our project can be seen here:
Perhaps the most time-intensive aspect of the project was the programming. With several added nodes and topics, this project was a display of ingenuity. The GitHub repository for this project can be found here: https://github.com/gshabtai/ece148-team1.git
- Nodes subscribe to topic ‘/state’.
- Nodes are only allowed to control navigation while the robot is in their respective state.
- This model is great for encapsulating robot behavior based on external factors.
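The state-gating pattern above can be sketched in plain Python. This is an illustrative sketch, not our actual node code: the class name and the dictionary command format are hypothetical, and a simple callback stands in for the real ROS subscription to '/state'.

```python
class StateGatedNode:
    """Sketch of a node that only drives the car while its state is active.

    In the real system each node subscribes to the ROS topic '/state';
    here on_state_msg() stands in for that subscription callback.
    """

    def __init__(self, active_state):
        self.active_state = active_state   # the state this node owns, e.g. 'seek'
        self.current_state = 'idle'        # the system boots into idle

    def on_state_msg(self, state):
        """Callback for messages arriving on '/state'."""
        self.current_state = state

    def publish_drive_command(self, cmd):
        """Forward an actuator command only while this node's state is active."""
        if self.current_state != self.active_state:
            return None                    # another node owns the actuators
        return cmd                         # would publish to the drive topic


# Usage: the seek node stays silent until the state machine hands it control.
seek = StateGatedNode('seek')
seek.publish_drive_command({'steer': -1.0})   # ignored: state is 'idle'
seek.on_state_msg('seek')
seek.publish_drive_command({'steer': -1.0})   # forwarded: 'seek' is active
```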
- Description: Stops all actuator output
- Activated: When the ball basket is full
- Importance: The system starts in idle so the ESC can be calibrated, since the topic publishes 0
- Description: Turns left in a loop until a ball is seen by either the RGBD camera or the webcam.
- Activated: The "default" state when no ball is seen and no collision ahead has been detected.
- Description: Reverses the car for a period of two seconds
- Activated: Called when a ball is lost in the align or capture state, and during collision avoidance.
- Uses the Intel camera for wider range of view
- Subscribes to depth and rgb of intel camera
- Uses a PID controller to align the ball to the right of the robot for capture. In other words, it applies a centroid offset so that the ball is directed toward the intake system rather than the front of the car.
- Uses webcam to align ball with intake system.
- Uses a PID controller to align the ball to the center of the funnel for ping pong ball collection.
Dynamic Centering Control PID Controller
- Designed and provided by Dominic (Our Lord and Savior)
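A centering PID of this kind might look like the sketch below. The gains, timestep, and class name here are illustrative placeholders, not the actual values from the controller used on the car.

```python
class CenteringPID:
    """Sketch of a PID that steers a centroid offset in [-50, 50] toward 0.

    Output is a steering command clamped to [-1, 1]. All gains are
    illustrative, not the tuned values from the real controller.
    """

    def __init__(self, kp=0.02, ki=0.0, kd=0.005, dt=0.05):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, offset):
        """One control step: offset is the centroid's horizontal error."""
        error = offset                              # setpoint is 0 (centered)
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        out = (self.kp * error
               + self.ki * self.integral
               + self.kd * derivative)
        return max(-1.0, min(1.0, out))             # clamp to servo range
```

Clamping the output keeps a large centroid error from commanding a steering value the servo cannot reach.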
CV Centroid Topic Nodes
- Used by align and capture
Computer Vision Ball Detection
How we found the centroid, an overview.
- The computer gets an image from the webcam and converts it to the HSV color space.
- It filters with an HSV range, removes noise, and picks the biggest blob in the resulting image.
- An image-moment search finds the centroid of the blob, which is then remapped to a -50 to 50 horizontal range used for steering.
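The centroid and remap steps above can be sketched with NumPy on a binary mask (in the real pipeline the mask comes from the HSV filter, and OpenCV's `cv2.moments` does the equivalent moment computation; the frame size here is illustrative).

```python
import numpy as np

def centroid_offset(mask):
    """Find the blob centroid of a binary mask and remap its x position
    to [-50, 50], where negative means left of the frame center."""
    m00 = mask.sum()                       # zeroth moment: blob area
    if m00 == 0:
        return None                        # no ball in frame
    ys, xs = np.nonzero(mask)
    cx = xs.mean()                         # centroid x = first moment / area
    width = mask.shape[1]
    return (cx / (width - 1)) * 100.0 - 50.0

# A ball-sized blob on the right side of a 100-px-wide frame:
mask = np.zeros((60, 100), dtype=np.uint8)
mask[20:30, 70:80] = 1                     # blob centered at x = 74.5
offset = centroid_offset(mask)             # positive: steer right
```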
- Intake system subscribes to centroid topic.
- If a ball is detected within the lower region of the camera frame, the fans will turn on.
- The system then will update the number of balls that have been loaded onto the system.
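The intake trigger described above might look like this sketch. The frame-region threshold and the edge-triggered counting are illustrative assumptions; the real node subscribes to the centroid topic and may debounce the count differently.

```python
class IntakeController:
    """Sketch: turn the fans on when a ball reaches the lower camera
    region, and count each ball as it enters the intake zone."""

    def __init__(self, frame_height=480, lower_frac=0.8):
        # Rows below this boundary count as the 'lower region' of the frame.
        self.trigger_y = frame_height * lower_frac
        self.fans_on = False
        self.ball_count = 0

    def on_centroid(self, cy):
        """Centroid-topic callback: cy is the centroid row, or None."""
        in_zone = cy is not None and cy >= self.trigger_y
        if in_zone and not self.fans_on:
            self.ball_count += 1           # count once, on entry to the zone
        self.fans_on = in_zone             # fans run only while ball is low
```

Counting on the rising edge (entry into the zone) keeps one ball from being counted on every camera frame it spends near the intake.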
Collision Avoidance State | Using LIDAR
- Car uses LIDAR to detect and avoid objects.
- Half-circle LIDAR shape to detect stationary objects
- A maximum distance (Ro) is set by the user. The car will ignore all objects outside of this radius, and then turn to avoid any object closer than this radius.
- Inner radius exists to avoid interference by car parts within range of the LIDAR.
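The half-circle range filter described above can be sketched as follows. The radii, the scan format (angle/distance pairs), and the steering convention are illustrative; the real node reads LaserScan messages from the LIDAR driver.

```python
import math

def steer_away(scan, r_inner=0.15, r_outer=1.0):
    """Sketch of the half-circle avoidance filter.

    scan: list of (angle_rad, distance_m) pairs, angle 0 straight ahead,
    positive to the left. Returns a steering sign: -1 steer right,
    +1 steer left, 0 keep heading. Radii are illustrative values.
    """
    nearest = None
    for angle, dist in scan:
        if abs(angle) > math.pi / 2:
            continue                       # behind the car: ignore
        if dist <= r_inner:
            continue                       # inner radius: car's own parts
        if dist >= r_outer:
            continue                       # outside Ro: ignore
        if nearest is None or dist < nearest[1]:
            nearest = (angle, dist)
    if nearest is None:
        return 0                           # path clear
    return -1 if nearest[0] > 0 else 1     # obstacle left -> steer right
```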
Collision Avoidance State | Using SLAM
- Another method for navigating an environment with obstacles
- STEP 1 Use LIDAR to create a map of the environment.
- STEP 2 Use a navigation stack to issue goal commands (x1,y1) -> (x2,y2)
- Navigation stack would be similar to the TurtleBot from the Navigation Workshop
- The Twist command's angular-z velocity (i.e., yaw rate) would be limited to the robocar's physical constraints. This is "good enough".
- Alternative: develop a navigation stack for Ackermann steering. Control theory stuff.
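The "good enough" limit above could be sketched with a bicycle-model clamp. The wheelbase and maximum steering angle below are illustrative numbers, not measurements from our car.

```python
import math

WHEELBASE = 0.3                # meters -- illustrative, not measured
MAX_STEER = math.radians(30)   # steering limit -- illustrative

def clamp_twist(linear_x, angular_z):
    """Limit a Twist's angular-z so the implied Ackermann steering angle
    stays physically reachable (bicycle model:
    angular_z = linear_x * tan(steer) / wheelbase)."""
    if linear_x == 0:
        return 0.0                                  # car cannot turn in place
    steer = math.atan2(angular_z * WHEELBASE, linear_x)
    steer = max(-MAX_STEER, min(MAX_STEER, steer))  # clamp to steering range
    return linear_x * math.tan(steer) / WHEELBASE
```

A navigation stack written for a differential-drive TurtleBot can then run unmodified, with its Twist output passed through this clamp before reaching the VESC.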