
UCSD ECEMAE148 Team2 FinalProject

This is the GitHub repo for Team 2. Our main focus was getting the u-blox GPS working inside ROS2.

The original ucsd_robocar_hub2 repository does not have the functionality to run the GPS inside the Docker container (ROS2). Therefore, we implemented a u-blox GPS node from scratch that works with the existing files so the robot can follow GPS coordinates. We hope it will be useful for future classes. All the code files are inside /ucsd_robocar_hub2.

Pedestrian Avoidance Robot Car / GPS on ROS2 Docker

ECEMAE148 Final Project

Team 2 Winter 2025

Table of Contents
  1. Team Members
  2. Final Project
  3. Robot Design
  4. Acknowledgments
  5. Authors
  6. Contact

Team Members

Andrew N, Daphne, Jose, and Rodolfo

Team Member Major and Class

  • Andrew N - Electrical Engineering, Circuits and Systems - Class of 2026
  • Daphne - Electrical Engineering, Circuits and Systems - Class of 2026
  • Jose - Biology Engineering
  • Rodolfo - Mechanical Engineering

Final Project

Original Goals

Originally, we proposed to create a robot that could follow a GPS lap while avoiding pedestrians and remapping its path in real time. We also planned to use ROS2, the OAK-D Lite camera, the u-blox GPS, and the Jetson Nano (all tools we already had since the beginning of the quarter). To do so, we needed to run the camera's pedestrian detection in ROS2 and find a way to send messages to the GPS path follower to modify the path when needed, based on the detection.

Goals We Met

The GPS was originally set up inside the Donkey environment, which is outside of ROS2 (the Docker container). We struggled to make ROS2 communicate with the GPS outside of the Docker environment, especially for re-routing the GPS lap when a pedestrian is detected. Therefore, we decided to make the GPS work inside ROS2 instead of in the Donkey environment, and this became our top priority since it would be much easier for future classes to use. To do so, we worked out the general workflow of the packages and nodes provided inside the Docker container, and we pulled the original ucsd_robocar_hub2 repository to get the necessary nodes. We found that it was missing a node for reading from the GPS, so we implemented one for the u-blox GPS from scratch, called "ublox_gps_node.py", under ucsd_robocar_hub2/ucsd_robocar_sensor2_pkg/ucsd_robocar_sensor2_pkg/. We then ran a series of tests of the GPS node with the existing files so the robot could follow GPS coordinates. The overall workflow is discussed in detail in the Final Project Documentation section.
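For reference, the sketch below shows the general shape of such a node: it reads NMEA sentences from the GPS serial port and republishes them as sensor_msgs/NavSatFix messages. This is a hedged sketch rather than the exact contents of ublox_gps_node.py; the serial device path, baud rate, topic name, and use of the pynmea2 library are all assumptions.

    # Minimal sketch of a u-blox GPS publisher node.
    # Assumptions: serial device /dev/ttyACM0, baud 38400, topic name /fix,
    # and the pynmea2 library for NMEA parsing.
    import rclpy
    from rclpy.node import Node
    from sensor_msgs.msg import NavSatFix
    import serial
    import pynmea2

    class UbloxGpsNode(Node):
        def __init__(self):
            super().__init__('ublox_gps_node')
            self.pub = self.create_publisher(NavSatFix, '/fix', 10)
            self.port = serial.Serial('/dev/ttyACM0', 38400, timeout=1.0)
            self.timer = self.create_timer(0.1, self.read_gps)

        def read_gps(self):
            # Read one NMEA sentence and publish it as a NavSatFix if it carries a position.
            line = self.port.readline().decode('ascii', errors='ignore').strip()
            if not line.startswith('$'):
                return
            try:
                sentence = pynmea2.parse(line)
            except pynmea2.ParseError:
                return
            if not hasattr(sentence, 'latitude'):
                return
            fix = NavSatFix()
            fix.header.stamp = self.get_clock().now().to_msg()
            fix.header.frame_id = 'gps'
            fix.latitude = float(sentence.latitude)
            fix.longitude = float(sentence.longitude)
            fix.altitude = float(getattr(sentence, 'altitude', 0.0) or 0.0)
            self.pub.publish(fix)

    def main(args=None):
        rclpy.init(args=args)
        node = UbloxGpsNode()
        rclpy.spin(node)
        node.destroy_node()
        rclpy.shutdown()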

On the camera side, we trained our own model on Roboflow to detect pedestrians by detecting any "foot" appearing in front of the OAK-D camera. This makes it easier for the camera to tell whether someone is right in front of it, instead of having to see a whole person from a long distance. We created a ROS2 package for the camera with a node called "oakd_node.py" under ucsd_robocar_hub2/oakd_ros2/oakd_ros2/ that sends a detection message of "left", "right", or "none" to indicate which direction the pedestrian is moving, so the robocar can change its route accordingly. The picture below shows an example of the detection message being sent (also included in the presentation slides).
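In addition to the picture, the sketch below shows a simplified version of the message-publishing side of such a camera node: it publishes a std_msgs/String of "left", "right", or "none". The topic name is an assumption, get_foot_detection() is a hypothetical stand-in for the DepthAI/Roboflow inference pipeline, and the left/right decision here is simplified to which half of the frame the detection sits in rather than the direction of motion.

    # Simplified sketch of the detection-message side of a camera node.
    # Assumptions: topic name /oakd/detection; get_foot_detection() is a
    # hypothetical stand-in for the DepthAI/Roboflow inference pipeline.
    import rclpy
    from rclpy.node import Node
    from std_msgs.msg import String

    class OakdDetectionNode(Node):
        def __init__(self):
            super().__init__('oakd_node')
            self.pub = self.create_publisher(String, '/oakd/detection', 10)
            self.timer = self.create_timer(0.1, self.publish_detection)

        def publish_detection(self):
            detection = self.get_foot_detection()
            msg = String()
            if detection is None:
                msg.data = 'none'
            else:
                # Simplified: use the bounding-box center (normalized 0..1) to
                # decide which side of the frame the pedestrian is on.
                x_center = (detection.xmin + detection.xmax) / 2.0
                msg.data = 'left' if x_center < 0.5 else 'right'
            self.pub.publish(msg)

        def get_foot_detection(self):
            # Placeholder: the real node would pull the highest-confidence
            # "foot" detection from the OAK-D pipeline here.
            return None

    def main(args=None):
        rclpy.init(args=args)
        rclpy.spin(OakdDetectionNode())
        rclpy.shutdown()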

In the end, we did not show a complete demo of the robot car avoiding a pedestrian while following the lap, but we have each part (camera and GPS) completed separately.

If We Have Another Week...

Stretch Goal 1

Unfortunately, we could not finish combining every part so that the robot car avoids pedestrians while following the GPS path, as we originally promised. However, we have both the GPS node and the camera node completed in ROS2. We would just need to make some changes to "gps_path_provider_node.py" under ucsd_robocar_hub2/ucsd_robocar_path2_pkg/ucsd_robocar_path2_pkg/ to trigger path re-routing based on the detection data published by "oakd_node.py", so the robot car can turn either left or right and then return to the original path. We would also need a separate PID controller for this maneuver. A hedged sketch of what the re-routing logic could look like is shown below.
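Under those assumptions, the glue could look roughly like this: subscribe to the detection topic and apply a temporary lateral offset to the next few waypoints so the car steers around the pedestrian and then rejoins the lap. The topic name, offset size, and waypoint format are assumptions, not part of the existing node.

    # Hedged sketch of re-routing logic that could be folded into
    # gps_path_provider_node.py. Assumptions: detection topic /oakd/detection,
    # waypoints as (lat, lon) tuples, and a fixed lateral offset in degrees.
    from rclpy.node import Node
    from std_msgs.msg import String

    LATERAL_OFFSET_DEG = 0.00002  # roughly 2 m of longitude near San Diego; tune on the car

    class ReroutingPathProvider(Node):
        def __init__(self):
            super().__init__('rerouting_path_provider')
            self.detection = 'none'
            self.create_subscription(String, '/oakd/detection', self.on_detection, 10)

        def on_detection(self, msg):
            self.detection = msg.data  # 'left', 'right', or 'none'

        def adjust_waypoints(self, waypoints, current_index, lookahead=5):
            # Pedestrian on the left -> nudge the next few waypoints right
            # (and vice versa). Later waypoints are untouched, so the car
            # rejoins the original lap; a separate PID would track the detour.
            # A real implementation would offset perpendicular to the car's
            # heading instead of shifting longitude alone.
            if self.detection == 'none':
                return waypoints
            sign = 1.0 if self.detection == 'left' else -1.0
            adjusted = list(waypoints)
            end = min(current_index + lookahead, len(adjusted))
            for i in range(current_index, end):
                lat, lon = adjusted[i]
                adjusted[i] = (lat, lon + sign * LATERAL_OFFSET_DEG)
            return adjusted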

Stretch Goal 2

We would like to implement self-parking using LiDAR and computer vision, so the robot car can correct itself based on the parking space lines seen by the camera.

Final Project Documentation

Robot Design CAD

Open Source Parts

Part                CAD Model Source
Jetson Nano Case    Thingiverse

Software

Embedded Systems

The system runs on a Jetson Nano with an OAK-D Lite camera.

ROS2

In this section we will specifically show the steps to set-up GPS in ROS2.

To get started, open the /ucsd_robocar_hub2 folder and you will see the packages listed. Below is the main flow of how the nodes interact with each other when we tested the GPS:

How to Run

Use the UCSD Robocar Docker images. Python3 is required, and you may need to install additional dependencies.

Step 1: Once you have set up your Docker container, open a terminal and go into the container:

docker start name_of_your_container

docker exec -it name_of_your_container bash

source_ros2

Clone this repository into ros2_ws.

git clone https://github.com/UCSD-ECEMAE-148/148-winter-2025-final-project-team-2.git

build_ros2

Step 2: Then you can do a quick start and run the robot car at the EBU courtyard!

Since this is an extension of the ucsd_robocar_hub2 repository, the default path file in this repository is a small circle inside the courtyard of the EBU building. The image below shows the default path being used:

To start your robot car to follow the path, simply run:

ros2 launch ucsd_robocar_nav2_pkg all_nodes.launch.py
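If the car does not start following the path, a quick sanity check (from a second terminal inside the container, after running source_ros2) is to confirm the GPS node is actually publishing. The fix topic name can vary by setup, so list the topics first, then echo whichever GPS fix topic appears (shown here as /fix, which is an assumption):

ros2 topic list

ros2 topic echo /fix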

Step 3: If you would like to change the GPS lap for your robot car to run on, put your recorded waypoints into /home/projects/ros2_ws/ebu2_courtyard/ebu2_courtyard_man_2.csv:

cd /home/projects/ros2_ws/ebu2_courtyard

Copy and paste the contents of your own file here:

nano ebu2_courtyard_man_2.csv

Note that your .csv file must use "lat,lon,alt" as the header row, for example:

lat,lon,alt

37.7749,-122.4194,30.0

34.0522,-118.2437,100.5

40.7128,-74.0060,50.2
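If you record waypoints with your own tooling, a short script like the hedged sketch below writes the file in the expected lat,lon,alt format. The output path matches the default file above, and the waypoint list is just the example data; both are placeholders for your own recording.

    # Hedged sketch: write recorded waypoints to the path CSV in the expected
    # "lat,lon,alt" format. The waypoint list below is just the example data.
    import csv

    waypoints = [
        (37.7749, -122.4194, 30.0),
        (34.0522, -118.2437, 100.5),
        (40.7128, -74.0060, 50.2),
    ]

    path = '/home/projects/ros2_ws/ebu2_courtyard/ebu2_courtyard_man_2.csv'
    with open(path, 'w', newline='') as f:
        writer = csv.writer(f)
        writer.writerow(['lat', 'lon', 'alt'])  # header must match exactly
        for lat, lon, alt in waypoints:
            writer.writerow([lat, lon, alt])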

Step 4: Finally, to get the car running on your custom path:

source_ros2

build_ros2

ros2 launch ucsd_robocar_nav2_pkg all_nodes.launch.py

YouTube link of our robot following the EBU courtyard path (small circle in the middle): Demo Video of car running on GPS

Slides are also published in this repository.

Authors

Andrew N, Daphne, Jose, and Rodolfo

Acknowledgments

Big thanks to Professor Jack Silberman and our TAs Alexander Haken and Winston Chou; you are super amazing and helpful! Thank you, Alexander, for the README template.

Contact
