AI-HRI 2020 — 2020 AAAI Fall Symposium on Artificial Intelligence for Human-Robot Interaction (AI-HRI)

Projection Mapping Implementation: Enabling Direct Externalization of Perception Results and Action Intent to Improve Robot Explainability

Zhao Han, Alexander Wilkinson, Jenna Parrillo, Jordan Allspaw, and Holly A. Yanco

Augmented reality (AR) tabletop projection mapping.
News
  • Sep 10, 2020

    Our paper has been accepted to the AAAI Fall Symposium on Artificial Intelligence for Human-Robot Interaction (AI-HRI '20)!

Abstract

Non-verbal cues studied in existing research, e.g., eye gaze or arm movement, may not accurately convey a robot's internal states, such as its perception results and action intent.

Projecting these states directly onto the robot's operating environment has the advantage of being direct, accurate, and more salient, eliminating the need to mentally infer the robot's intent.

However, robotics lacks tools for projection mapping comparable to its established motion planning libraries (e.g., MoveIt).

In this paper, we detail an implementation of projection mapping to enable researchers and practitioners to push the boundaries of interaction between robots and humans.

We also provide practical documentation and code for a sample projection mapping for a manipulation task on GitHub: github.com/uml-robotics/projection_mapping.
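To make the core idea concrete, below is a minimal, self-contained sketch (not the repository's actual code) of how projection mapping can externalize a perception result: a detected object's 3D position in the robot's world frame is transformed into projector pixel coordinates using calibrated projector intrinsics and extrinsics, and a highlight is rendered at that pixel on a full-screen canvas. All names and values here (K, rvec, tvec, object_xyz, the 1920x1080 resolution) are illustrative assumptions; a real system would obtain them from a camera-projector calibration routine and the robot's perception pipeline.

# Minimal projection mapping sketch (illustrative only, not the authors' code).
# A 3D point in the world/table frame is projected into the projector's image
# plane, then a highlight is drawn there so the projector overlays it onto the
# physical object.
import cv2
import numpy as np

# Assumed projector intrinsics (focal lengths, principal point) from calibration.
K = np.array([[1400.0,    0.0, 960.0],
              [   0.0, 1400.0, 540.0],
              [   0.0,    0.0,   1.0]])

# Assumed extrinsics: rotation (Rodrigues vector) and translation mapping
# world-frame points into the projector frame.
rvec = np.array([0.2, 0.0, 0.0])
tvec = np.array([0.0, -0.3, 1.2])

# A detected object's 3D position in the world frame (e.g., from perception).
object_xyz = np.array([[0.10, 0.05, 0.0]])

# Project the 3D point into projector pixel coordinates.
pixels, _ = cv2.projectPoints(object_xyz, rvec, tvec, K, None)
u, v = pixels[0, 0]

# Render a green circle on a black canvas and show it full screen on the
# projector's display, highlighting the object to be manipulated.
canvas = np.zeros((1080, 1920, 3), dtype=np.uint8)
cv2.circle(canvas, (int(round(u)), int(round(v))), 40, (0, 255, 0), 4)
cv2.namedWindow("projector", cv2.WINDOW_NORMAL)
cv2.setWindowProperty("projector", cv2.WND_PROP_FULLSCREEN, cv2.WINDOW_FULLSCREEN)
cv2.imshow("projector", canvas)
cv2.waitKey(0)

In the paper's setting, this kind of pixel-level highlight is what lets an observer see which detected object the robot intends to manipulate without inferring it from motion alone.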

Figures

Hardware platform: the Fetch mobile manipulator.

Videos

(Accompanying video). The projection of perception results: the detected objects (white and green) and the object to be manipulated (green).
Whiteboard projection.