In this challenge, we proposed an augmented reality (AR) project titled “Human-Aware Projection Mapping for Robot Intention”. Leading a team of three, we were awarded $2,500 in seed funding, and the project went on to produce two scientific publications.
Competition Proposal
Robots are increasingly deployed in settings ranging from factories and warehouses to private homes. Externalizing a robot's internal state to reveal its intentions improves people's understanding of the robot and builds trust. Displaying this information directly on the environment with a robot-mounted projector enables accurate, in-place externalization, improving both productivity and safety. However, the surface the projector projects onto may not lie within a person's line of sight. By adding human awareness, the system instead projects onto surfaces that are visible to nearby people.
We plan to fuse data from a 3D time-of-flight (ToF) LiDAR and an infrared light angle sensor from Analog Devices, both mounted on the robot's actuated head, with the RGB-D sensor that already serves as the robot's eyes. A projector will also be mounted on the head or shoulder area to project the robot's internal state, for example detected objects or its navigation path. The LiDAR will detect human pose in real time, especially eye gaze. Based on the estimated gaze, the projector's pose will be adjusted to project onto a surface the person can actually see, such as a wall or a divider. To supplement the LiDAR-based pose detection and increase accuracy and responsiveness, the light angle sensor will track an infrared emitter attached to a worker's safety helmet, or to an ordinary hat for other robot users.
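To make the surface-selection step concrete, here is a minimal Python sketch, assuming the perception pipeline already yields a head position and gaze direction in the robot's base frame. The surface set, scoring weights, and all names are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np

# Hypothetical candidate surfaces: each is a (center, unit normal) pair in the
# robot's base frame, e.g. a tabletop, a wall, and a divider panel.
SURFACES = {
    "tabletop": (np.array([1.0, 0.0, 0.8]),  np.array([0.0, 0.0, 1.0])),
    "wall":     (np.array([0.0, 2.5, 1.5]),  np.array([0.0, -1.0, 0.0])),
    "divider":  (np.array([-1.5, 1.0, 1.3]), np.array([1.0, 0.0, 0.0])),
}

def visibility_score(surface_center, surface_normal, head_pos, gaze_dir):
    """Score how visible a surface is to the person.

    Combines two cues: (1) how closely the surface center aligns with the
    person's gaze ray, and (2) whether the surface faces the person.
    The 0.5 weight on the facing cue is an arbitrary illustrative choice.
    """
    to_surface = surface_center - head_pos
    dist = np.linalg.norm(to_surface)
    if dist < 1e-6:
        return -np.inf
    to_surface_dir = to_surface / dist
    # Cue 1: alignment between gaze direction and direction to the surface.
    gaze_alignment = float(np.dot(gaze_dir, to_surface_dir))
    # Cue 2: the surface must face the viewer (front side visible).
    facing = float(np.dot(surface_normal, -to_surface_dir))
    if facing <= 0.0:
        return -np.inf  # person would see the back of the surface
    return gaze_alignment + 0.5 * facing

def pick_projection_surface(head_pos, gaze_dir):
    """Return the name and center of the surface most visible to the person."""
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    best = max(
        SURFACES.items(),
        key=lambda kv: visibility_score(kv[1][0], kv[1][1], head_pos, gaze_dir),
    )
    return best[0], best[1][0]

if __name__ == "__main__":
    # Example: a person standing at x = 2 m, looking toward the wall.
    head = np.array([2.0, 0.0, 1.7])
    gaze = np.array([-0.5, 1.0, -0.1])
    name, target = pick_projection_surface(head, gaze)
    print(f"Project onto '{name}' at {target}")
```

The projector's actuators would then be commanded to aim at the returned target point; a full system would additionally handle occlusion checks and projection distortion, which this sketch omits.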
Outcome
Participation in this competition led to two publications:
- Tabletop projection published at AI-HRI in 2020
- Navigation intent projection published at HRI in 2022