- Team Members
- Abstract
- What We Promised
- Accomplishments
- Challenges
- Final Project Videos
- Hardware
- Software
- Gantt Chart
- Course Deliverables
- Project Reproduction
- Acknowledgements
- MingWei Yeoh - ECE - 2025
- Trevor
- Minh Quach
- Jose
A realistic-looking autonomous car that activates its lights when it detects a speeding car, then autonomously pulls it over or initiates a chase.
Past MAE projects haven't used car following for a real task, nor made their autonomous car LOOK like an actual car. Mechanically, it will be a challenge to mount all the electronics inside a small car body.
- LEDs
- Communication between LED controller and Jetson
- Body shell Mounting
- Chase the car
- YOLO running on the OAK-D camera
- Cool PCB
- Clean Electronics mounting
- A Siren
- Follows car
- LEDs work great
- 150A Soft start power switch works great
- Jetson not connecting to Wi-Fi
- ROS2 not configured correctly to run our nodes
- I2C reaching bugged state
Link to the OnShape CAD
- Green - Body shell mounting method. It uses 8x3mm magnets.
- Purple - Reference geometry from the car
- Black - Structural 3D Printed pieces
- Orange - 3D Printed Spacers
The design is super clean and allows the body shell to easily cover all the electronics.
Additional Hardware Necessary
- Custom LED driver PCB
- Custom LED module
- 8x3mm Magnets
- 3mm bolts
- Body shell
- Embedded ESP32-S3
- USB-C port
- Anti-spark power switch capable of 150 A
- Buzzer (for siren)
- 2x high-power LED drivers
- 3x WS2815 LED controllers
- Communication with Jetson via I2C
- 3 spare GPIOs (if needed)
- Onboard VBAT -> 3V3 buck regulator
Power
Snippet from the schematic. Three low-RDS(on) PMOSFETs act as the switch. When the on button is pressed, the 4S LiPo feeds the onboard buck regulator, which powers the microcontroller.
Hand assembled. There are a lot of parts.
I2C
The ESP32 acts as an I2C peripheral at address 0x55. It accepts two single-byte commands: 0x00 turns the LEDs on, and 0x11 turns them off.
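As a sketch, the Jetson side of this protocol could look like the following Python helper. Only the address (0x55) and the two command bytes come from the design above; the constant names and the smbus2 usage are our illustration, not the project's actual code.

```python
# Host-side sketch of the LED controller's I2C protocol (illustrative).
LED_ADDR = 0x55      # ESP32 peripheral address
CMD_LED_ON = 0x00    # turn LEDs on
CMD_LED_OFF = 0x11   # turn LEDs off

def led_command(on: bool) -> int:
    """Return the single command byte for the requested LED state."""
    return CMD_LED_ON if on else CMD_LED_OFF

# On the Jetson, the byte would be written with a library such as smbus2:
#   from smbus2 import SMBus
#   with SMBus(1) as bus:          # bus number depends on the Jetson header
#       bus.write_byte(LED_ADDR, led_command(True))
```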
A ROS2 package for autonomous vehicle pursuit using computer vision and DepthAI. This package enables a robocar to detect and follow another vehicle using visual tracking.
- Features
- Prerequisites
- Installation
- Nodes
- Topics
- Launch Files
- Parameter Tuning
- Troubleshooting
- Configuration Files
- Real-time car detection using YOLO and DepthAI
- Adaptive PID-based steering control
- Dynamic throttle management
- Real-time parameter tuning interface
- Parameter persistence across launches
- Robust tracking with lost-frame handling
- ROS2
- DepthAI
- OpenCV
- Python 3.8+
- Clone this repository into your ROS2 workspace:
  ```sh
  cd ~/ros2_ws/src
  git clone <repository_url> robocar_visual_pursuit_pkg
  ```
- Install dependencies:
  ```sh
  cd ~/ros2_ws
  rosdep install --from-paths src --ignore-src -r -y
  ```
- Build the package:
  ```sh
  colcon build --packages-select robocar_visual_pursuit_pkg
  ```
The car detection node uses YOLO and DepthAI to detect vehicles in the camera feed and calculate their position relative to the robot.
Parameters:
- `confidence_threshold` (0.0-1.0): Confidence threshold for car detection
- `camera_centerline` (0.0-1.0): Normalized position of the camera centerline
- `error_threshold` (0.0-1.0): Error threshold for steering control
- `iou_threshold` (0.0-1.0): IOU threshold for object detection
- `max_lost_frames` (0-30): Maximum number of frames to track a lost object
Topics:
- Subscribes: `/camera/color/image_raw` (sensor_msgs/Image)
- Publishes: `/centroid` (std_msgs/Float32)
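The `/centroid` value can be derived from a detection's bounding box. Here is a minimal sketch of that mapping, assuming normalized box coordinates and the `camera_centerline` parameter above (the function name is ours, not the package's):

```python
def centroid_error(xmin: float, xmax: float,
                   camera_centerline: float = 0.5) -> float:
    """Map a detection's horizontal center (normalized [0, 1] image
    coordinates) to a signed error in [-1, 1] relative to the
    camera centerline."""
    center = (xmin + xmax) / 2.0
    return max(-1.0, min(1.0, 2.0 * (center - camera_centerline)))
```

A car centered in the frame yields 0.0; a car toward the right edge approaches +1.0, which the lane guidance node then steers to null out.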
The lane guidance node implements PID control for steering and adaptive throttle management based on tracking error.
Parameters:
- `Kp_steering` (0.0-5.0): Proportional gain for steering control
- `Ki_steering` (0.0-2.0): Integral gain for steering control
- `Kd_steering` (0.0-2.0): Derivative gain for steering control
- `max_throttle` (0.0-1.0): Maximum throttle value
- `min_throttle` (0.0-0.5): Minimum throttle value
Topics:
- Subscribes: `/centroid` (std_msgs/Float32)
- Publishes: `/cmd_vel` (geometry_msgs/Twist)
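The control law described above can be sketched as follows. The class and function names are ours and the gains are illustrative defaults within the parameter ranges listed; this is a sketch of the approach, not the package's actual implementation:

```python
class SteeringPID:
    """PID on the /centroid error; output clamped to a [-1, 1] steering range."""
    def __init__(self, kp: float = 1.0, ki: float = 0.0, kd: float = 0.1):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error: float, dt: float) -> float:
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * deriv
        return max(-1.0, min(1.0, out))

def adaptive_throttle(error: float, max_throttle: float = 0.5,
                      min_throttle: float = 0.1) -> float:
    """Slow down as tracking error grows: max_throttle when centered,
    min_throttle when the target is at the edge of the frame."""
    scale = 1.0 - min(abs(error), 1.0)
    return min_throttle + (max_throttle - min_throttle) * scale
```

Clamping the PID output keeps the steering command in the servo's normalized range even when the integral term winds up.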
A dedicated node for real-time parameter tuning with persistent storage.
Features:
- Real-time parameter adjustment via rqt_reconfigure
- Parameter persistence across launches
- Automatic parameter forwarding to relevant nodes
| Topic | Type | Description |
|---|---|---|
| `/camera/color/image_raw` | sensor_msgs/Image | Raw camera feed |
| `/centroid` | std_msgs/Float32 | Normalized target position (-1.0 to 1.0) |
| `/cmd_vel` | geometry_msgs/Twist | Vehicle control commands |
Launch the complete visual pursuit system:
```sh
ros2 launch robocar_visual_pursuit_pkg car_pursuit_launch.py
```
This launches:
- Car detection node
- Lane guidance node
- Parameter tuner node
- VESC interface node
Parameters are managed through ROS2's rqt dynamic reconfigure system and YAML configuration files. The default parameters are stored in `config/default_params.yaml`.
- `camera_centerline` (float, default: 0.5): Normalized position of the camera centerline
- `confidence_threshold` (float, default: 0.2): Confidence threshold for car detection
- `debug_cv` (int, default: 0): Enable/disable debug visualization
- `error_threshold` (float, default: 0.5): Error threshold for steering control
- `iou_threshold` (float, default: 0.5): IOU threshold for object detection
- `max_lost_frames` (int, default: 30): Maximum number of frames to track a lost object
- `model_path` (string): Path to YOLO model weights
- `use_sim_time` (bool, default: false): Use simulation time
- Launch the visual pursuit node:
  ```sh
  ros2 launch robocar_visual_pursuit_pkg visual_pursuit.launch.py
  ```
- Open rqt_reconfigure:
  ```sh
  ros2 run rqt_reconfigure rqt_reconfigure
  ```
- Adjust parameters in real time using the rqt interface
- Export current parameters to YAML:
  ```sh
  ros2 param dump /visual_pursuit_node > config/my_params.yaml
  ```
- Load custom parameters at launch:
  ```sh
  ros2 launch robocar_visual_pursuit_pkg visual_pursuit.launch.py params_file:=config/my_params.yaml
  ```
Note: you may need to consolidate the YAML parameters from multiple files, since `ros2 param dump` is specific to a single node while all nodes read from `default_params.yaml` by default. This will be fixed in a future update.
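When consolidating, each node's parameter map has to end up under its own name in the one file, because dumped parameter files follow ROS2's `<node_name>: ros__parameters:` layout. A minimal sketch of that layout (the helper itself is illustrative, not part of the package):

```python
def dump_flat_params(params: dict, node_name: str) -> str:
    """Render a flat parameter dict in the layout `ros2 param dump`
    produces: the node name at the top, everything under ros__parameters."""
    lines = [f"{node_name}:", "  ros__parameters:"]
    for key, value in sorted(params.items()):
        lines.append(f"    {key}: {value}")
    return "\n".join(lines)

# Sections like this for each node can be concatenated into a single YAML
# file (e.g. default_params.yaml) that all nodes read.
```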
Common issues and solutions:
- Parameter Changes Not Saving
  - Verify rqt_reconfigure is running
  - Check write permissions in the config directory
  - Ensure the parameter dump command executed successfully
- Node Not Starting
  - Check that all required configuration files exist
  - Verify file paths in launch files
  - Ensure the ROS2 environment is properly sourced
- Detection Issues
  - Verify camera connection and permissions
  - Check the model path in the configuration
  - Adjust confidence and IOU thresholds
For additional help, check the ROS2 logs under `~/.ros/log`, or run a node with `--ros-args --log-level debug` for verbose output.
The package uses several configuration files in the `config` directory:
- `default_params.yaml`: Default parameters for the visual pursuit system
- `car_detection_node`: Configuration for car detection parameters
- `lane_guidence`: Lane guidance system configuration
- `ros_racer_calibration.yaml`: Calibration parameters for the robocar
Each configuration file serves a specific purpose:
- Visual Pursuit Parameters (`default_params.yaml`)
  - Core parameters for the visual pursuit system
  - Modified through rqt_reconfigure
  - Can be exported and loaded at runtime
- Car Detection Configuration (`car_detection_node`)
  - YOLO model configuration
  - Detection thresholds and parameters
  - Camera settings
- Lane Guidance Configuration (`lane_guidence`)
  - Lane detection parameters
  - Steering control settings
  - PID tuning values
- Calibration Parameters (`ros_racer_calibration.yaml`)
  - Hardware-specific calibration
  - Motor and servo settings
  - Sensor calibration values
To modify these configurations:
- Use rqt_reconfigure for real-time parameter tuning
- Export modified parameters to YAML files
- Load specific configuration files at launch time
Example of loading multiple configuration files:
```sh
ros2 launch robocar_visual_pursuit_pkg visual_pursuit.launch.py \
  params_file:=config/my_params.yaml \
  calibration_file:=config/ros_racer_calibration.yaml
```
Here are our autonomous laps as part of our class deliverables:
- DonkeyCar Reinforcement Laps: https://youtube.com/shorts/6msSXXdx4cQ?feature=share
- GPS Laps: https://youtu.be/pAPebhBlGDo
- Lane Following: https://youtu.be/QzcPgDxxi5c
- All Slides: https://drive.google.com/drive/folders/1qeRSDj4K1PtiiYxJ_8OacECK8fxUN1bn?usp=sharing
Special thanks to Alex and Winston!