AI-HRI 2022

Mixed-Reality Robot Behavior Replay: A System Implementation

As robots become increasingly complex, they must explain their behaviors to gain trust and acceptance. However, it may be difficult through verbal explanation alone to fully convey information about past behavior, especially regarding objects no longer present due to robots’ or humans’ actions. Humans often try to physically mimic past movements to accompany verbal…


HRI 2022

Projecting Robot Navigation Paths: Hardware and Software for Projected AR

Navigation intent, projection mapping

For mobile robots, mobile manipulators, and autonomous vehicles to safely navigate around populous places such as streets and warehouses, human observers must be able to understand their navigation intent. One way to enable such understanding is by visualizing this intent through projections onto the surrounding environment. But despite the demonstrated effectiveness of such projections…


AI-HRI 2020

Projection Mapping Implementation: Enabling Direct Externalization of Perception Results and Action Intent to Improve Robot Explainability

Augmented reality (AR), tabletop projection mapping

Existing research on non-verbal cues, e.g., eye gaze or arm movement, may not accurately present a robot’s internal states such as perception results and action intent. Projecting the states directly onto a robot’s operating environment has the advantages of being direct, accurate, and more salient, eliminating mental inference about the robot’s intention. However, there…
