# Robotics-Specific Machine Learning

This project will develop robotics-specific machine learning methods. The need for such methods follows directly from the no-free-lunch theorems (Wolpert, 1996), which prove that no machine learning method performs better than random guessing when averaged over all possible problems. The only way to improve over random guessing is to restrict the problem space and incorporate prior knowledge about this problem space into the learning method.

Of course, there are machine learning methods that apply to a wide range of real world problems by incorporating fairly general priors, e.g. parsimony, smoothness, hierarchical structure, or distributed representation. However, even for solving relatively simple problems, such methods already require huge amounts of data and computation. The overall problem of robotics—learning behavior that maps a stream of high-dimensional sensory input to a stream of high-dimensional motor output from sparse feedback—is too complex to be solved by generic machine learning methods using realistic amounts of data and computation.

To tailor machine learning towards the problem space of robotics, we have to do two things: a) discover robotics-specific prior knowledge and b) incorporate these priors into machine learning methods. Since robots interact with the physical world, physics is the most direct source of prior knowledge. To incorporate such priors, we will relate them to state representations, which are an intermediate result of the mapping from the robot's sensory input to its motor output. The intuition is the following: since intermediate state representations must reflect properties of the world, the same physical laws that apply to the real world must also apply to these internal state representations. Knowledge about these laws therefore allows us to find state representations that facilitate learning of robot behavior.
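As a minimal illustration of this idea, the sketch below scores candidate state representations with one hypothetical physics-derived prior: temporal coherence (physical quantities change smoothly over time, so a good internal state sequence should too). The function name, data, and loss form are our own assumptions for illustration, not part of the project description.

```python
import numpy as np

def slowness_loss(states: np.ndarray) -> float:
    """Temporal-coherence ("slowness") prior: penalize large jumps
    between consecutive internal states, since physical quantities
    in the world change smoothly over time.

    states: array of shape (T, d), one d-dimensional state per time step.
    """
    diffs = np.diff(states, axis=0)                 # state change per step
    return float(np.mean(np.sum(diffs ** 2, axis=1)))

# Hypothetical data: two candidate state sequences for the same episode.
rng = np.random.default_rng(0)
smooth = np.cumsum(rng.normal(scale=0.01, size=(200, 3)), axis=0)  # drifts slowly
jumpy = rng.normal(size=(200, 3))                                  # i.i.d. noise

# The prior prefers the representation consistent with smooth physical change.
assert slowness_loss(smooth) < slowness_loss(jumpy)
```

In a learning system, a term like this would be added to the training objective of the encoder that maps sensory input to states, steering it toward representations that obey the prior.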

The project is funded by the Deutsche Forschungsgemeinschaft (DFG).

Contact persons: Adrian Sieler, Oliver Brock