Markerless Perspective Taking for Humanoid Robots in Unconstrained Environments

Tobias Fischer and Yiannis Demiris
Imperial College London
Personal Robotics Lab

Framework overview

System overview
Overall flow of the proposed method. The inputs to the perspective taking pipeline are images acquired from an RGB-D camera and the iCub's eye cameras. In the first step, the robot recognizes objects, estimates the head pose of surrounding humans, and maps the environment. Two separate processes are then employed for level 1 and level 2 perspective taking, allowing the robot to infer which objects are visible to the human, where these objects are located in the human's reference frame, and how the world appears from the human's viewpoint.
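
To illustrate the level 1 mechanism, the sketch below traces the line of sight from the human's eyes to an object through a voxelized map of the environment. This is a minimal, self-contained example: the names (VoxelGrid, objectVisible), the 5 cm resolution, and the half-voxel ray sampling are our own illustrative assumptions, not the actual WYSIWYD implementation.

```cpp
// Level 1 perspective taking via line-of-sight tracing (illustrative sketch).
#include <array>
#include <cmath>
#include <cstdio>
#include <set>

using Vec3 = std::array<double, 3>;
using VoxelKey = std::array<int, 3>;

// Toy occupancy map: a set of occupied voxel indices at a fixed resolution.
struct VoxelGrid {
    double resolution = 0.05;  // 5 cm voxels (assumed)
    std::set<VoxelKey> occupied;

    VoxelKey toKey(const Vec3& p) const {
        return {static_cast<int>(std::floor(p[0] / resolution)),
                static_cast<int>(std::floor(p[1] / resolution)),
                static_cast<int>(std::floor(p[2] / resolution))};
    }
    bool isOccupied(const Vec3& p) const { return occupied.count(toKey(p)) > 0; }
};

// Returns true if the straight line from the human's eyes to the object is
// free of obstacles, i.e. the object is visible to the human (level 1 PT).
// The ray is sampled at half-voxel steps; the endpoints are skipped so that
// the object's own voxel does not occlude itself.
bool objectVisible(const VoxelGrid& map, const Vec3& eye, const Vec3& object) {
    Vec3 d = {object[0] - eye[0], object[1] - eye[1], object[2] - eye[2]};
    double len = std::sqrt(d[0] * d[0] + d[1] * d[1] + d[2] * d[2]);
    double step = map.resolution * 0.5;
    for (double t = step; t < len - step; t += step) {
        Vec3 p = {eye[0] + d[0] * t / len,
                  eye[1] + d[1] * t / len,
                  eye[2] + d[2] * t / len};
        if (map.isOccupied(p)) return false;  // line of sight blocked
    }
    return true;
}

int main() {
    VoxelGrid map;
    map.occupied.insert(map.toKey({0.5, 0.0, 1.0}));  // an obstacle
    Vec3 eye = {0.0, 0.0, 1.0}, object = {1.0, 0.0, 1.0};
    std::printf("visible: %s\n", objectVisible(map, eye, object) ? "yes" : "no");
}
```

The same test can be run once per recognized object to obtain, for each human, the set of objects that lie within their line of sight.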

Abstract

Perspective taking enables humans to imagine the world from another viewpoint. This allows reasoning about the state of other agents, which in turn is used to more accurately predict their behavior. In this paper, we equip an iCub humanoid robot with the ability to perform visuospatial perspective taking (PT) using a single depth camera mounted above the robot. Our approach has the distinct benefit that the robot can be used in unconstrained environments, as opposed to previous works, which employ marker-based motion capture systems. Prior to and during the PT, the iCub learns the environment, recognizes objects within the environment, and estimates the gaze of surrounding humans. We propose a new head pose estimation algorithm which achieves a performance boost by normalizing the depth data so that it is aligned with the human head. Inspired by psychological studies, we employ two separate mechanisms for the two different types of PT. We implement line-of-sight tracing to determine whether an object is visible to the humans (level 1 PT). For more complex PT tasks (level 2 PT), the acquired point cloud is mentally rotated, which allows algorithms to reason as if the input data had been acquired from an egocentric perspective. We show that this can be used to better judge where objects are in relation to the humans. The multifaceted improvements to the PT pipeline advance the state of the art and move PT in robots to markerless, unconstrained environments.
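
For level 2 PT, the core operation is to re-express the acquired point cloud in the human's reference frame. The sketch below, assuming Eigen is available, applies the inverse of the estimated head pose to every point. The function name mentallyRotate and the x-forward/y-left axis convention are illustrative assumptions rather than the exact conventions of the paper's implementation.

```cpp
// Level 2 perspective taking as a "mental rotation" of the point cloud
// (illustrative sketch, assuming Eigen).
#include <Eigen/Geometry>
#include <iostream>
#include <vector>

// Transform points from the camera/world frame into the human's egocentric
// frame: p_human = R^T * (p_world - t), where (R, t) is the estimated head
// pose of the human expressed in the world frame.
std::vector<Eigen::Vector3d> mentallyRotate(
        const std::vector<Eigen::Vector3d>& cloud,
        const Eigen::Isometry3d& headPose) {
    const Eigen::Isometry3d worldToHuman = headPose.inverse();
    std::vector<Eigen::Vector3d> egocentric;
    egocentric.reserve(cloud.size());
    for (const auto& p : cloud)
        egocentric.push_back(worldToHuman * p);
    return egocentric;
}

int main() {
    // Toy example: a human head one metre in front of the camera, rotated
    // 180 degrees about z, i.e. facing back towards the camera.
    Eigen::Isometry3d headPose = Eigen::Isometry3d::Identity();
    headPose.translate(Eigen::Vector3d(1.0, 0.0, 0.0));
    headPose.rotate(Eigen::AngleAxisd(EIGEN_PI, Eigen::Vector3d::UnitZ()));

    // A single object to the camera's left, at the human's height.
    std::vector<Eigen::Vector3d> cloud = {{0.5, 0.3, 0.0}};
    auto ego = mentallyRotate(cloud, headPose);

    // With the assumed x-forward / y-left axes, the object now lies in front
    // of the human (x > 0) and on the human's right (y < 0).
    std::cout << "object in human frame: " << ego[0].transpose() << std::endl;
}
```

Once the cloud is expressed in the egocentric frame, spatial relations such as "in front of" or "to the left of" the human reduce to simple sign tests on the transformed coordinates.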

Paper

Open Access Download
Link to IEEE Xplore

Video

Code

Source Code Download

To use the source code, you need to clone the WYSIWYD repository.

Bibtex

BibTeX Download

Poster

ICRA2016 poster

Acknowledgements

This research was funded in part by the EU project WYSIWYD (grant FP7-ICT-612139). We would like to thank the members of the Personal Robotics Lab for their continued support!