Imitation learning is a promising approach to dexterous hand manipulation, but its effectiveness is limited by the scarcity of large-scale, high-fidelity data. Existing data-collection pipelines suffer from inaccurate motion retargeting, low collection efficiency, and a lack of high-resolution fingertip tactile sensing. We address this gap with MILE, a mechanically isomorphic teleoperation and data-collection system co-designed across the human hand, exoskeleton, and robotic hand. The exoskeleton is anthropometrically derived from the human hand, and the robotic hand preserves one-to-one joint-position isomorphism, eliminating nonlinear retargeting and enabling precise, natural control. The exoskeleton achieves a multi-joint mean absolute angular error below one degree, and the robotic hand integrates compact fingertip visuotactile modules that provide high-resolution tactile observations. Built on this retargeting-free interface, we teleoperate complex, contact-rich in-hand manipulation tasks and efficiently collect a multimodal dataset comprising high-resolution fingertip visuotactile signals, RGB-D images, and joint positions. The teleoperation pipeline improves the mean task success rate by 64%. Incorporating fingertip tactile observations further raises the success rate by an average of 25% over a vision-only baseline, validating the fidelity and utility of the dataset. Further details are available at: https://sites.google.com/view/mile-system.