THRI, 2023

Communicating Missing Causal Information to Explain a Robot’s Past Behavior

Robots need to explain their behavior to gain trust. Existing research has focused on explaining a robot's current behavior, yet it remains an open challenge to explain past actions in an environment that may change after the robot acts, leading to critical missing causal information about moved objects. We conducted…

THRI, 2023

Best of Both Worlds? Combining Different Forms of Mixed Reality Deictic Gestures

Mixed Reality provides a powerful medium for transparent and effective human-robot communication, especially for robots with significant physical limitations (e.g., those without arms). To enhance the nonverbal capabilities of armless robots, this paper presents two studies exploring two categories of mixed-reality deictic gestures: a virtual arrow positioned over a target…

HRI, 2023

Crossing Reality: Comparing Physical and Virtual Robot Deixis

We investigate referring behavior at the intersection of physical and AR worlds: physical/virtual (AR) arm × physical/virtual (AR) referent.

Augmented Reality (AR) technologies present an exciting new medium for human-robot interactions, enabling new opportunities for both implicit and explicit human-robot communication. For example, these technologies enable physically-limited robots to execute non-verbal interaction patterns such as deictic gestures despite lacking the physical morphology necessary to do so. However, a wealth of HRI research has…

AI-HRI, 2022

Mixed-Reality Robot Behavior Replay: A System Implementation

As robots become increasingly complex, they must explain their behaviors to gain trust and acceptance. However, verbal explanation alone may not fully convey information about past behavior, especially regarding objects no longer present due to robots' or humans' actions. Humans often physically mimic past movements to accompany verbal…

VAM-HRI, 2022

Towards an Understanding of Physical vs Virtual Robot Appendage Design

Artist's rendering. One of the four conditions, AR→P: a physical robot with an AR virtual arm pointing to a physical referent. See Figure 1 for all four conditions.

Augmented Reality (AR) and Mixed Reality (MR) enable innovative interactions by overlaying virtual imagery onto the physical world. For roboticists, this creates new opportunities to apply proven non-verbal interaction patterns, like gesture, to physically limited robots. However, a wealth of HRI research has demonstrated that there are real benefits to physical embodiment (compared, e.g., to…

HRI, 2022

Projecting Robot Navigation Paths: Hardware and Software for Projected AR

Navigation Intent, projection mapping

For mobile robots, mobile manipulators, and autonomous vehicles to navigate safely around populated places such as streets and warehouses, human observers must be able to understand their navigation intent. One way to enable such understanding is to visualize this intent through projections onto the surrounding environment. But despite the demonstrated effectiveness of such projections,…

Paladyn, 2021

Design Guidelines for Human-Robot Interaction with Assistive Robot Manipulation Systems

Assistive robotics

This paper presents a set of design guidelines for improving user interfaces for assistive robot systems. As an example application, it presents two contrasting user interface designs for an assistive manipulation robot system and explores the design considerations behind them.…

AI-HRI, 2020

Projection Mapping Implementation: Enabling Direct Externalization of Perception Results and Action Intent to Improve Robot Explainability

Augmented reality (AR), tabletop projection mapping

Existing research on non-verbal cues, e.g., eye gaze or arm movement, may not accurately present a robot's internal states such as perception results and action intent. Projecting these states directly onto the robot's operating environment is direct, accurate, and more salient, eliminating mental inference about the robot's intention. However, there…
