In this article, we propose a deep learning framework that provides a unified approach to the problem of leg contact detection in humanoid robot walking gaits. Our formulation accurately and robustly estimates the contact state probability for each leg (i.e., stable or slip/no contact). The proposed framework employs solely proprioceptive sensing and, although it relies on simulated ground-truth contact data for the classification process, we demonstrate that it generalizes across varying friction surfaces and different legged robotic platforms and, at the same time, is readily transferred from simulation to practice. The framework is quantitatively and qualitatively assessed in simulation via the use of ground-truth contact data and is contrasted against state-of-the-art methods with an ATLAS, a NAO, and a TALOS humanoid robot. Furthermore, its efficacy is demonstrated in base estimation with a real TALOS humanoid. To reinforce further research endeavors, our implementation is offered as an open-source ROS/Python package, coined Legged Contact Detection (LCD).
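To make the per-leg contact-state estimation concrete, the following is a minimal sketch in Python/PyTorch of a binary classifier that maps a window of proprioceptive measurements to a probability of stable contact. It is illustrative only: the window length, the choice of IMU and force/torque features, and the fully connected architecture are assumptions for this example and do not describe the actual LCD network.

```python
# Hypothetical sketch of a per-leg contact-state classifier; not the actual LCD architecture.
# Assumes the proprioceptive input is a fixed-size window of IMU and leg force/torque
# measurements, stacked into a (window, features) array per sample.
import torch
import torch.nn as nn


class ContactStateClassifier(nn.Module):
    """Maps a window of proprioceptive measurements to P(stable contact) for one leg."""

    def __init__(self, window: int = 50, features_per_step: int = 12, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                                    # (B, window, features) -> (B, window*features)
            nn.Linear(window * features_per_step, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),                            # single logit: stable vs. slip/no contact
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.net(x))                    # contact-state probability in [0, 1]


if __name__ == "__main__":
    model = ContactStateClassifier()
    # A batch of 8 simulated proprioceptive windows (50 time steps, 12 features each).
    x = torch.randn(8, 50, 12)
    p_stable = model(x)                                      # shape (8, 1)
    print(p_stable.squeeze(-1))
```

In this sketch, training would use binary cross-entropy against simulated ground-truth contact labels, consistent with the abstract's statement that the classifier is supervised with simulated contact data and then transferred to the real robot.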