On the Sensory Commutativity of Action Sequences for Embodied Agents

13 Feb 2020 · Hugo Caselles-Dupré, Michael Garcia-Ortiz, David Filliat

Perception of artificial agents is one of the grand challenges of AI research. Deep Learning and data-driven approaches succeed on constrained problems where perception can be learned with supervision, but they do not scale to open worlds. In such cases, for autonomous embodied agents with first-person sensors, perception can be learned end-to-end to solve particular tasks. However, the literature shows that perception is not a purely passive compression mechanism, and that actions play an important role in the formation of abstract representations. We propose to study perception for these embodied agents under the mathematical formalism of group theory, in order to link perception and action. In particular, we consider the commutative properties of continuous action sequences with respect to the sensory information perceived by such an embodied agent. We introduce the Sensory Commutativity Probability (SCP) criterion, which measures how much an agent's degree of freedom affects the environment in embodied scenarios. We show how to compute this criterion in different environments, including realistic robotic setups. We empirically illustrate how SCP and the commutative properties of action sequences can be used to learn about objects in the environment and to improve sample efficiency in Reinforcement Learning.
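To make the SCP idea concrete, here is a minimal Monte-Carlo sketch of the general intuition: estimate how often reordering a sequence of actions on one degree of freedom leaves the final sensory observation (approximately) unchanged. It assumes a Gym-style continuous-control environment with a reproducible `reset(seed=...)` and a Box action vector with one entry per degree of freedom; the function and parameter names, the tolerance, and the exact reordering scheme are illustrative placeholders, not the authors' definition or reference implementation.

```python
import numpy as np

# Illustrative sketch only: `env` is assumed to be a Gym-style environment whose
# reset(seed=...) restores a reproducible initial state and whose Box action
# vector has one entry per degree of freedom. All names are placeholders.

def final_obs(env, seed, actions):
    """Reset to a fixed initial state, play `actions`, return the last observation."""
    env.reset(seed=seed)
    obs = None
    for a in actions:
        obs = env.step(a)[0]  # observation is the first element in both 4- and 5-tuple step APIs
    return np.asarray(obs, dtype=np.float32)

def estimate_scp(env, dof, n_trials=200, seq_len=10, base_seed=0, tol=1e-3):
    """Monte-Carlo proxy for the Sensory Commutativity Probability of `dof`:
    the fraction of random action sequences on that degree of freedom whose
    shuffled version leads to (approximately) the same final observation."""
    rng = np.random.default_rng(base_seed)
    hits = 0
    for t in range(n_trials):
        # Random magnitudes on the chosen DOF; all other DOFs stay at zero.
        seq = []
        for _ in range(seq_len):
            a = np.zeros(env.action_space.shape, dtype=np.float32)
            a[dof] = rng.uniform(env.action_space.low[dof], env.action_space.high[dof])
            seq.append(a)
        shuffled = list(seq)
        rng.shuffle(shuffled)
        start_seed = base_seed + t  # both rollouts start from the same state
        if np.allclose(final_obs(env, start_seed, seq),
                       final_obs(env, start_seed, shuffled),
                       atol=tol):
            hits += 1
    return hits / n_trials
```

Under this sketch, a degree of freedom whose actions rarely commute (order changes what the agent ends up sensing, e.g. because it moves or displaces objects) gets a low estimate, while one whose actions almost always commute gets an estimate near 1.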
