Evaluating Multimodal Feedback for Assembly Tasks in a Virtual Environment

Abstract

Operating power tools over extended periods of time can pose significant risks to humans, due to the strong forces and vibrations they impart to the limbs. Telemanipulation systems can be employed to minimize these risks, but may impede effective task performance due to the reduced sensory cues they typically convey. To address this shortcoming, we explore the benefits of augmenting these cues with auditory, vibration, and force feedback, and evaluate their effect on users’ performance in a VR mechanical assembly task employing a simulated impact wrench. Our research focuses on the utility of vibrotactile feedback, rendered as a simplified and attenuated version of the vibrations experienced while operating an actual impact wrench. We investigate whether such feedback can enhance the operator’s awareness of the state of the tool and serve as a proxy for the forces experienced during collisions and coupling while operating the tool. Results from our user study comparing feedback modalities confirm that the introduction of vibrotactile feedback, in addition to auditory feedback, can significantly improve user performance as assessed by completion time. However, the addition of force feedback to these two modalities did not further improve performance.
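
The abstract describes the vibrotactile cue as a simplified, attenuated version of a real impact wrench's vibrations. The sketch below (not from the paper) illustrates one way such a signal could be synthesized: a train of exponentially decaying sinusoidal bursts at an assumed impact rate, scaled down before being sent to a vibrotactile actuator. All parameter values (sample rate, impact rate, ring frequency, decay, attenuation) are hypothetical placeholders.

```python
# Illustrative sketch only: a simplified, attenuated impact-wrench-like
# vibration waveform for a vibrotactile actuator. Parameters are assumed.
import numpy as np

SAMPLE_RATE = 8000     # Hz, assumed actuator output rate
IMPACT_RATE = 20.0     # impacts per second (hypothetical hammering rate)
RING_FREQ = 250.0      # Hz, dominant vibration frequency at the grip (assumed)
DECAY = 40.0           # 1/s, exponential decay of each impact transient
ATTENUATION = 0.2      # render a weaker version than the real tool


def impact_wrench_vibration(duration_s: float, trigger_pressed: bool) -> np.ndarray:
    """Return a mono vibration waveform in [-1, 1] for the given duration.

    While the (simulated) trigger is pressed, the signal is a train of
    exponentially decaying sinusoidal bursts at IMPACT_RATE; otherwise silence.
    """
    n = int(duration_s * SAMPLE_RATE)
    if not trigger_pressed:
        return np.zeros(n)

    t = np.arange(n) / SAMPLE_RATE
    # Time elapsed since the most recent impact, for every sample.
    t_since_impact = np.mod(t, 1.0 / IMPACT_RATE)
    # Each impact rings at RING_FREQ with an exponential envelope that
    # restarts at every impact (a deliberate simplification of the real tool).
    waveform = np.exp(-DECAY * t_since_impact) * np.sin(2 * np.pi * RING_FREQ * t)
    return ATTENUATION * waveform


if __name__ == "__main__":
    sig = impact_wrench_vibration(0.5, trigger_pressed=True)
    print(f"{len(sig)} samples, peak amplitude {np.abs(sig).max():.3f}")
```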

Publication
Proceedings of the ACM on Human-Computer Interaction