Abstraction by Structure: Learning Representations of Objects and Actions
Carl Henrik Ek, KTH
Abstract:
For humans and robots alike, objects in the environment provide context for interaction, tools for executing tasks and means for grounding semantics. In robotics, an important open problem is to detect, recognise and model objects from sensory data. Central to solving this problem is representing and parametrising sensory data so as to provide fast, robust and scalable solutions. In this talk, we will discuss representations that capture global, or structural, information. We argue that, rather than focusing on building models capable of representing ever larger portions of the variance in the sensory input, we should carefully consider what information is actually relevant for the problem at hand. We will motivate and present this approach through several different applications concerned with generalisation at a level where global structure is the dominant discriminating factor, and we will present a set of scenarios in which structural representations and models are key to achieving fast and robust performance.