A System for Learning Continuous Human-Robot Interactions from Human-Human Demonstrations
David Vogt, Simon Stepputtis, Steve Grehl, Bernhard Jung, Heni Ben Amor
IEEE International Conference on Robotics and Automation (ICRA), 2017
Conference Paper

We present a data-driven imitation learning system for learning human-robot interactions from human-human demonstrations. During training, the movements of two interaction partners are recorded through motion capture and an interaction model is learned. At runtime, the interaction model is used to continuously adapt the robot’s motion, both spatially and temporally, to the movements of the human interaction partner. We demonstrate the effectiveness of the approach on complex, sequential tasks by presenting two applications involving collaborative human-robot assembly. Experiments with varied object hand-over positions and task execution speeds confirm the system’s capability for spatio-temporal adaptation of the demonstrated behavior to the current situation.

@inproceedings{vogt2017system,
  title={A system for learning continuous human-robot interactions from human-human demonstrations},
  author={Vogt, David and Stepputtis, Simon and Grehl, Steve and Jung, Bernhard and Amor, Heni Ben},
  booktitle={2017 IEEE International Conference on Robotics and Automation (ICRA)},
  year={2017}
}