Paladyn, Journal of Behavioral Robotics, 12(1), September 2021

Design Guidelines for Human-Robot Interaction with Assistive Robot Manipulation Systems

Alexander Wilkinson, Michael Gonzales, Patrick Hoey, David Kontak, Dian Wang, Noah Torname, Sam Laderoute, Zhao Han, Jordan Allspaw, Robert Platt, and Holly A. Yanco


Abstract

The design of user interfaces for assistive robot systems can be improved using the set of design guidelines presented in this paper.

As an example of their use, the paper presents two different user interface designs for an assistive manipulation robot system and explores the design considerations behind these two contrasting interfaces.

The first, referred to as the graphical user interface (GUI), is operated entirely through a touchscreen and represents the state of the art.

The second is a novel design referred to as the tangible user interface (TUI), which makes use of devices in the real world, such as laser pointers and a projector-camera system that enables augmented reality.

Each of these interfaces is designed to allow the system to be operated by an untrained user in an open environment such as a grocery store.

Our goal is for these guidelines to aid researchers in the design of human-robot interaction for assistive robot systems, particularly when designing multiple interaction methods for direct comparison.

Figures

Assistive robot manipulation system

The assistive robot system, with a Universal Robots UR5 robot arm outfitted with a Robotiq 2F-85 gripper mounted on a Merits Pioneer 10 mobility scooter.
First-person point of view from the assistive robot system, showing the touchscreen for control of the GUI and the button panel for control of the TUI.

Projection Mapping

Grasp radius visualization through projection mapping. Objects that can be grasped are highlighted green, and objects that are out of reach of the arm are highlighted red.
Projector being used to highlight an object in the world. Here it is shown highlighting an object that has been selected by the user.
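The grasp radius visualization above amounts to a per-object reachability test: each detected object is colored by whether its position lies within the arm's reach. A minimal sketch of this logic is below; the reach radius value and the 2D centroid representation are assumptions for illustration, not taken from the paper.

```python
import math

# Assumed horizontal reach of the arm from its base, in meters.
# The actual radius used by the system is not specified here.
REACH_RADIUS_M = 0.85

def highlight_color(object_xy, arm_base_xy=(0.0, 0.0)):
    """Return the projection highlight color for an object centroid:
    green if it lies within the arm's grasp radius, red otherwise."""
    dx = object_xy[0] - arm_base_xy[0]
    dy = object_xy[1] - arm_base_xy[1]
    return "green" if math.hypot(dx, dy) <= REACH_RADIUS_M else "red"

# One object within reach, one beyond it.
print(highlight_color((0.4, 0.3)))   # → green
print(highlight_color((1.2, 0.0)))   # → red
```

In the real system, the projector-camera calibration would then map each object's 3D position to projector pixels so the colored highlight lands on the physical object.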

Graphical User Interface (GUI)

Main menu of the GUI. From here the user can either drive the scooter or begin a pick process.
Interactive point cloud representing the real world. The user can tap on an area to zoom in, and then tap on an object to select it.
The GUI showing the selected object in green and asking the user to confirm that the correct object is selected. This is the Confirmation state in Figure 3.

Tangible User Interface (TUI)

Joystick and panel of buttons used to control the TUI. The buttons can light up or flash to indicate actions that the user can take.
Laser device mounted to the end effector of the UR5. The user can use a joystick to move the lasers to point at objects.

Video

