[ICRA2025] Integrates vision, touch, and common-sense information from foundation models, tailored to the agent's perceptual needs.
Visuo-tactile dataset of YCB objects captured with a GelSight sensor and a depth camera.
Official implementation of Touch100k: A Large-Scale Touch-Language-Vision Dataset for Touch-Centric Multimodal Representation.
Tactile perception dataset comprising a DIGIT sensor sliding over YCB objects, with ground-truth poses.
PyBullet simulator for object shape exploration, supporting shape and pose recovery (work in progress).
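The last entry points at a common primitive in shape-recovery work: probing an object's surface in PyBullet to collect surface samples. Below is a minimal, hypothetical sketch of that idea using ray casting, assuming a standard PyBullet install; the `duck_vhacd.urdf` asset, the sampling strategy, and all names are illustrative and are not taken from the repository above.

```python
import numpy as np
import pybullet as p
import pybullet_data

# Headless physics server; a sketch only, not the repository's actual API.
p.connect(p.DIRECT)
p.setAdditionalSearchPath(pybullet_data.getDataPath())
body = p.loadURDF("duck_vhacd.urdf", basePosition=[0, 0, 0])

# Cast rays inward from a sphere of viewpoints toward the origin;
# each hit on the object yields one surface point, a crude stand-in
# for tactile exploration of an unknown shape.
points = []
for theta in np.linspace(0.1, np.pi - 0.1, 16):
    for phi in np.linspace(0, 2 * np.pi, 32, endpoint=False):
        src = 0.5 * np.array([np.sin(theta) * np.cos(phi),
                              np.sin(theta) * np.sin(phi),
                              np.cos(theta)])
        hit = p.rayTest(src.tolist(), [0, 0, 0])[0]
        if hit[0] == body:          # ray actually struck the object
            points.append(hit[3])   # hit position in world coordinates

print(f"Recovered {len(points)} surface samples")
p.disconnect()
```

The sampled points could then feed any downstream shape or pose estimator; a real tactile simulator would replace the rays with contact events from a sensor model.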