Towards Semantic Policies for Human-Robot Collaboration


As the application domain of robots moves closer to our daily lives, algorithms and methods are needed to ensure safe and meaningful human-machine interaction. Robots need to be able to understand human body movements, as well as the semantic meaning of these actions. However, implementing such capabilities that bridge low-level movement representations and high-level task abstractions by hand is a notoriously difficult challenge that is not properly supported by today's programming language interfaces. To overcome this challenge, this research aims to create novel ways of teaching complex tasks to a robot by combining traditional learning-from-demonstration with natural language processing and semantic analysis.

In Southwest Robotics Symposium