The Robohow framework represents control programs as concurrent, percept-guided manipulation plans. It uses websites, visual instructions, and haptic demonstrations as its primary information sources. These heterogeneous pieces of information are integrated and combined through an interface layer that provides an abstract machine for programming high-level robot manipulation plans. The interpreter for this abstract machine includes novel mechanisms for optimization, constraint-based movement specification, and percept-guided manipulation.
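To make the idea of a percept-guided, constraint-based plan more concrete, here is a minimal sketch in Python. All names (`MotionConstraint`, `MotionPhase`, `run_plan`) are illustrative assumptions, not Robohow's actual plan language or API; the point is only the structure: each phase of a plan carries task-space constraints on the motion and stays active until a terminating percept is observed.

```python
from dataclasses import dataclass


@dataclass
class MotionConstraint:
    """A single task-space constraint, e.g. keep a tool feature in a range.

    (Hypothetical representation; real constraint-based controllers use
    richer geometric feature relations.)
    """
    feature: str   # controlled feature, e.g. "tool_tip_height"
    lower: float   # lower bound on the feature value
    upper: float   # upper bound on the feature value

    def satisfied(self, value: float) -> bool:
        return self.lower <= value <= self.upper


@dataclass
class MotionPhase:
    """One phase of a manipulation plan: active until its exit percept fires."""
    name: str
    constraints: list
    exit_percept: str  # percept that terminates this phase


def run_plan(phases, percept_stream):
    """Advance through the phases as their exit percepts are observed."""
    executed = []
    percepts = iter(percept_stream)
    for phase in phases:
        executed.append(phase.name)
        # Consume percepts until this phase's termination condition arrives.
        for percept in percepts:
            if percept == phase.exit_percept:
                break
    return executed
```

A stirring plan might then be two phases, "approach" ending on a contact percept and "stir" ending when the texture is perceived as smooth; execution is driven by the percept stream rather than a fixed trajectory.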
The framework is designed to cope with the many challenges of working in knowledge-intensive, open-world settings. Consider, for example, a robot that needs to know how to make a pancake. Widely available resources for human-scale activities, such as wikihow.com, typically consist of vague, incomplete, and ambiguous instructions. An instruction such as “stir the batter until the texture is smooth” does not specify what the batter should be stirred with, what exact trajectory to follow, or with what force or speed. The robot itself must therefore decide, for each task, which actions to perform, on which objects, and in what manner.
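One way to picture this completion step is as merging an underspecified instruction with task knowledge drawn from other sources. The following sketch is a deliberate simplification under stated assumptions: the knowledge base is a hard-coded dictionary and all keys (`tool`, `trajectory`, `speed_hz`, `termination`) are hypothetical, whereas a real system would query learned, web-derived, or demonstrated knowledge.

```python
# Hypothetical task knowledge for "stir": concrete defaults for the
# parameters a web instruction typically leaves open.
STIRRING_KNOWLEDGE = {
    "tool": "whisk",
    "trajectory": "circular",
    "speed_hz": 2.0,                 # stirring cycles per second
    "termination": "texture_smooth",  # percept that ends the action
}


def complete_instruction(instruction: dict, knowledge: dict) -> dict:
    """Fill in the parameters an instruction leaves unspecified.

    Explicit values in the instruction win; missing or None-valued
    slots fall back to the task knowledge defaults.
    """
    completed = dict(knowledge)  # start from the task defaults
    completed.update({k: v for k, v in instruction.items() if v is not None})
    return completed
```

Given the parsed instruction `{"action": "stir", "object": "batter", "tool": None}`, the completed action inherits the whisk, the circular trajectory, and the perceptual termination condition from the knowledge base while keeping the stated object.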
The Intelligent Autonomous Systems group at the University of Bremen is responsible for coordinating this collaborative project. We create knowledge representation methods for integrating all of the knowledge sources (web, vision, simulation, teaching), develop plan-based control methods for executing actions with a constraint-based controller, and investigate plan transformations.
We believe in taking personal robotics to the next level through the active and free sharing of knowledge. The methodology and software for the (self-)improvement of robots under open-world conditions developed by Robohow will be made available to the community. We will also refactor and extend existing open-source robotics software to support this work and make our results easily accessible.