Giudici, Gabriele, et al. "Feeling good: Validation of bilateral tactile telemanipulation for a dexterous robot." Annual Conference Towards Autonomous Robotic Systems. Cham: Springer Nature Switzerland, 2023. [here].
Effective execution of long-horizon tasks with dexterous robotic hands remains a significant challenge in real-world settings. While learning from human demonstrations has shown encouraging results, it requires extensive data collection for training. Decomposing long-horizon tasks into reusable primitive skills is therefore a more efficient approach. To this end, we developed DexSkills, a novel supervised learning framework that addresses long-horizon dexterous manipulation tasks using primitive skills. DexSkills is trained on human demonstration data to recognize and replicate a select set of skills; it can then segment a demonstrated long-horizon dexterous manipulation task into a sequence of primitive skills, enabling one-shot execution by the robot. Notably, DexSkills operates solely on proprioceptive and tactile data, i.e., haptic data. Our real-world robotic experiments show that DexSkills segments skills accurately, thereby enabling autonomous robot execution of a diverse range of tasks.
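To illustrate the segmentation idea described above, here is a minimal sketch (not the authors' implementation): assuming a classifier has already assigned a primitive-skill label to each timestep of the haptic stream, consecutive identical labels can be merged into a sequence of skill segments for the robot to execute.

```python
def segment_skills(labels):
    """Collapse per-timestep skill labels into (skill, start, end) segments.

    `labels` is a hypothetical list of integer skill IDs, one per timestep,
    e.g. the output of a supervised classifier over haptic features.
    Each returned segment covers timesteps [start, end).
    """
    segments = []
    start = 0
    for t in range(1, len(labels) + 1):
        # Close the current segment at a label change or at the end of the stream.
        if t == len(labels) or labels[t] != labels[start]:
            segments.append((labels[start], start, t))
            start = t
    return segments

# Example: skill 0, then skill 2, then skill 1.
print(segment_skills([0, 0, 0, 2, 2, 1, 1, 1, 1]))
# → [(0, 0, 3), (2, 3, 5), (1, 5, 9)]
```

In practice, per-timestep predictions would typically be smoothed (e.g. by majority voting over a sliding window) before merging, so that spurious single-frame label flips do not produce tiny segments.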
@article{mao2024dexskills,
title={DexSkills: Skill Segmentation Using Haptic Data for Learning Autonomous Long-Horizon Robotic Manipulation Tasks},
author={Mao, Xiaofeng and Giudici, Gabriele and Coppola, Claudio and Althoefer, Kaspar and Farkhatdinov, Ildar and Li, Zhibin and Jamone, Lorenzo},
journal={arXiv preprint arXiv:2405.03476},
year={2024}
}