Learning to Guide Human Attention on Mobile Telepresence Robots with 360-degree Vision

Kishan Chandan, Jack Albertson, Xiaohan Zhang, Shiqi Zhang


IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2021

Abstract

Mobile telepresence robots (MTRs) allow people to navigate and interact with an environment in a place other than their own physical location. Thanks to recent advances in 360-degree vision, many MTRs are now equipped with panoramic visual perception. However, a person's visual field horizontally spans only about 120 degrees of the panorama captured by the robot. To bridge this observability gap toward human-MTR shared autonomy, we have developed a framework, called GHAL360, that enables the MTR to learn a goal-oriented policy via reinforcement learning for guiding human attention using visual indicators. Three telepresence environments were constructed using datasets extracted from Matterport3D and collected from a real robot. Experimental results show that GHAL360 outperformed baselines from the literature in the efficiency of a human-MTR team completing target search tasks.
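To make the idea of learning an attention-guidance policy concrete, below is a minimal, illustrative sketch (not the authors' GHAL360 implementation): a tabular Q-learning agent chooses which visual indicator to display so that a simulated human, who sees only a ~120-degree window of the robot's 360-degree panorama, turns toward a hidden target. The state discretization, action set, human-response model, and reward values are all assumptions made for this example.

```python
# Hedged sketch: tabular Q-learning for guiding a simulated human's limited
# field of view toward a target in a 360-degree panorama. All definitions
# (sectors, indicator actions, rewards, human model) are illustrative only.
import random

SECTORS = 12                        # 360 degrees split into 30-degree sectors
ACTIONS = ["left", "right", "none"] # hypothetical visual-indicator choices
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1

# State: target's sector relative to the center of the human's current view.
Q = {(s, a): 0.0 for s in range(SECTORS) for a in ACTIONS}

def step(rel_sector, action):
    """Toy human model: the viewer usually follows the displayed indicator."""
    if action == "left" and random.random() < 0.8:
        rel_sector = (rel_sector + 1) % SECTORS
    elif action == "right" and random.random() < 0.8:
        rel_sector = (rel_sector - 1) % SECTORS
    in_view = rel_sector <= 2 or rel_sector >= SECTORS - 2  # ~120-degree FOV
    reward = 10.0 if in_view else -1.0  # small step cost encourages fast guidance
    return rel_sector, reward, in_view

for episode in range(2000):
    s = random.randrange(SECTORS)
    for _ in range(30):
        # epsilon-greedy action selection over the indicator choices
        if random.random() < EPS:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: Q[(s, x)])
        s2, r, done = step(s, a)
        best_next = max(Q[(s2, x)] for x in ACTIONS)
        Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])
        s = s2
        if done:
            break
```

After training, the learned table maps each relative target position to the indicator that, under this toy human model, brings the target into the human's view fastest; the actual framework learns such goal-oriented guidance behavior in richer telepresence environments.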

Car with Controller

GHAL360 Framework


Presentation Video