Imitation Learning of Robot Policies by Combining Language, Vision and Demonstration

Year: 2019
Author(s): Simon Stepputtis, Joseph Campbell, Mariano Phielipp, Chitta Baral, Heni Ben Amor
Source: NeurIPS 2019 Workshop on Robot Learning

In this work, we present a novel approach to generating robot policies from visual perception of the environment, a natural language task description, and a kinesthetic demonstration. By learning a joint embedding of these three modalities, we provide an intuitive and quick way of conveying tasks to the robot. In a further step, these embeddings are translated into context-specific low-level policies, allowing the robot to generalize across different tasks without requiring a separate controller for each one.
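To make the described architecture concrete, below is a minimal sketch of the two-stage design the abstract outlines: three modality encoders fused into a joint task embedding, followed by a decoder that conditions low-level actions on that embedding. This is not the authors' implementation; it assumes PyTorch, and all module names, tensor shapes (tokenized instruction, RGB image, 50-step 7-DoF demonstration), and layer sizes are illustrative choices.

```python
# A hedged sketch of a language/vision/demonstration policy pipeline.
# All shapes and names are illustrative, not from the paper.
import torch
import torch.nn as nn

class JointTaskEmbedding(nn.Module):
    """Fuses language, vision, and a demonstration into one embedding."""
    def __init__(self, vocab_size=1000, embed_dim=64):
        super().__init__()
        self.word_embed = nn.Embedding(vocab_size, embed_dim)
        self.lang_rnn = nn.GRU(embed_dim, embed_dim, batch_first=True)
        self.vision_cnn = nn.Sequential(          # encodes a 64x64 RGB image
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, embed_dim),
        )
        self.demo_mlp = nn.Sequential(            # encodes a flattened 50x7 trajectory
            nn.Linear(50 * 7, 128), nn.ReLU(), nn.Linear(128, embed_dim),
        )
        self.fuse = nn.Linear(3 * embed_dim, embed_dim)

    def forward(self, tokens, image, demo):
        _, h = self.lang_rnn(self.word_embed(tokens))
        lang = h.squeeze(0)                       # final GRU hidden state
        vis = self.vision_cnn(image)
        dem = self.demo_mlp(demo.flatten(1))
        return self.fuse(torch.cat([lang, vis, dem], dim=-1))

class LowLevelPolicy(nn.Module):
    """Maps the joint embedding plus the current robot state to an action,
    so one controller can serve different tasks."""
    def __init__(self, embed_dim=64, state_dim=7, action_dim=7):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(embed_dim + state_dim, 128), nn.ReLU(),
            nn.Linear(128, action_dim),
        )

    def forward(self, z, state):
        return self.net(torch.cat([z, state], dim=-1))

# Hypothetical usage with random data for a batch of two tasks.
enc, pol = JointTaskEmbedding(), LowLevelPolicy()
tokens = torch.randint(0, 1000, (2, 10))          # tokenized instruction
image = torch.randn(2, 3, 64, 64)                 # environment image
demo = torch.randn(2, 50, 7)                      # kinesthetic demonstration
z = enc(tokens, image, demo)                      # joint task embedding
action = pol(z, torch.randn(2, 7))                # task-conditioned action
```

The key design point the sketch reflects is that task identity lives in the embedding rather than in the controller: the same `LowLevelPolicy` weights are reused across tasks, with behavior switched by conditioning on `z`.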
