Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/44301
Title: Interface Clarity in Human-Robot Interaction across Physical and Virtual Realities
Authors: VAN DEURZEN, Bram 
Advisors: Luyten, Kris
Ramakers, Raf
Issue Date: 2024
Abstract: Interaction with virtual interfaces, such as those in virtual reality (VR) and augmented reality (AR), is becoming increasingly prevalent. Users frequently switch between physical interfaces (e.g., keyboards, mice, and control panels) and virtual ones (e.g., hand gestures and eye tracking). This constant switching is challenging because it requires users to master various types of devices, especially VR controllers, which require users to learn the abstraction between the physical interaction with the controller and the corresponding action in the virtual environment. Harmonizing these interactions is particularly vital in human-robot interaction, where understanding robot capabilities and user responsibilities is essential. Virtual interactions with robots can significantly enhance training, design processes, and remote operations. With robots becoming increasingly prevalent in manufacturing, hospitality, healthcare, and domestic settings, understanding their behavior is essential for smooth interactions and for building trust. Facilitating knowledge transfer between virtual and physical interactions enables seamless human-robot interaction, and enhancing the intelligibility of these interactions is crucial for effective human-robot communication, allowing robots to convey their behaviors and actions clearly. The first part of this thesis classifies human-robot interactions to identify suitable instances for intelligibility and develops a framework and visual design space for creating intelligible visualizations. In the second part, I explore VR interface panels combined with realistic haptic feedback and conduct an in-depth study of button recognition to identify a set of physical buttons that can represent all virtual buttons. In Part I, Choreobot is presented as a framework and online dashboard for classifying human-robot interactions and providing suggestions for intelligibility and feedforward. Additionally, RobotPixels is introduced as a visual design framework for human-robot interaction intelligibility visualizations, accompanied by a design space with example visualizations. Part II introduces HapticPanel, an open-source system for creating realistic haptic feedback for VR interfaces, enabling users to interact with a digital representation of a robot’s control panel as if it were physical. This research culminates in Substitute Buttons, a comprehensive investigation into the characteristics of physical buttons and their influence on touch-based recognizability. The study identifies six distinct physical buttons that effectively represent a broader set of virtual buttons, ensuring users do not perceive a difference during interaction.
Document URI: http://hdl.handle.net/1942/44301
Category: T1
Type: Theses and Dissertations
Appears in Collections: Research publications

Files in This Item:
File: PhDThesis_UHasselt_Bram_digital.pdf (Published version, 69.52 MB, Adobe PDF)
Embargoed until 2029-09-27

