In this paper, we tackle the estimation of contact forces in human whole-body interaction with the environment from motion capture only, without relying on physical force sensing for ground truth. Our method makes it possible to dispense with cumbersome force sensors when monitoring multi-contact motion together with force data. This problem is challenging: while a given force distribution uniquely determines the resulting kinematics, the converse is generally not true in multi-contact. In such scenarios, physics-based optimization alone may only capture force distributions that are physically compatible with the observed motion, rather than the forces actually applied. We address this indeterminacy by collecting a large-scale dataset of whole-body motions and the contact forces humans apply in multi-contact scenarios. We then train recurrent neural networks on real human force distribution patterns and complement them with a second-order cone program that ensures the physical validity of the predictions. Extensive validation on challenging dynamic and multi-contact scenarios shows that the proposed method can outperform physical force sensing in both accuracy and usability.
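To illustrate the physical-validity step described above, here is a minimal sketch (not the authors' implementation) of a second-order cone program that projects network-predicted contact forces onto the set of forces satisfying Coulomb friction cones and the linear momentum balance. It assumes the cvxpy library; the function name `project_forces`, a single friction coefficient shared by all contacts, and the omission of the angular momentum balance and center-of-pressure constraints are all simplifications for illustration.

```python
import numpy as np
import cvxpy as cp

def project_forces(f_pred, contact_normals, mu, net_force):
    """Find physically valid contact forces closest to a network prediction.

    f_pred          -- (K, 3) predicted force at each of K contacts (world frame)
    contact_normals -- (K, 3) unit surface normal at each contact
    mu              -- friction coefficient (assumed identical at all contacts)
    net_force       -- (3,) net contact force required by the equations of motion
    """
    K = f_pred.shape[0]
    f = cp.Variable((K, 3))
    # Linear momentum balance: contact forces must sum to the net force
    # implied by the observed motion (gravity plus inertial terms).
    constraints = [cp.sum(f, axis=0) == net_force]
    for k in range(K):
        n = contact_normals[k]
        f_n = f[k] @ n           # normal component (scalar)
        f_t = f[k] - f_n * n     # tangential component (vector)
        # Coulomb friction: ||f_t|| <= mu * f_n, a second-order cone.
        constraints.append(cp.norm(f_t, 2) <= mu * f_n)
    # Stay as close as possible to the learned force distribution.
    prob = cp.Problem(cp.Minimize(cp.sum_squares(f - f_pred)), constraints)
    prob.solve()
    return f.value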

Reference

[pdf]

@inproceedings{sii:pham:2016,
  TITLE = {Whole-Body Contact Force Sensing From Motion Capture},
  AUTHOR = {Pham, Tu-Hoa and Bufort, Adrien and Caron, St{\'e}phane and Kheddar, Abderrahmane},
  URL = {https://hal.archives-ouvertes.fr/hal-01372531},
  BOOKTITLE = {IEEE/SICE International Symposium on System Integration},
  ADDRESS = {Sapporo, Japan},
  YEAR = {2016},
  MONTH = dec,
}