Please use this identifier to cite or link to this item:
Title: Touch-based Interaction and Collaboration in Walk-up-and-use and Multi-user Environments
Authors: VANACKEN, Davy 
Advisors: CONINX, Karin
Issue Date: 2012
Abstract: Touch-based interfaces are becoming increasingly ubiquitous: they can be found on a variety of hardware platforms, ranging from mobile phones to large public displays, and they allow a wide variety of applications, from single-user casual games to multi-user tools that facilitate brainstorming. Although touch-based interaction is supposed to be intuitive and "natural", it nonetheless imposes specific requirements on the accessibility of a user interface. Due to a lack of common conventions and consistency across applications, gestures are often difficult to discover and learn. Furthermore, since most touch-sensitive hardware supports multi-touch input nowadays, it allows multiple people to interact simultaneously on a shared surface. This not only brings about new opportunities with regard to collaboration, but also new research challenges on how to support this kind of collaboration effectively.

In walk-up-and-use environments, accessibility of the user interface is of great importance, as the limited interaction time and need for immediate use of the system do not allow for much training or exploration. Therefore, we explore the concept of making touch-based interfaces in walk-up-and-use environments self-explanatory, for both single-user and multi-user settings. With TouchGhosts, we propose a help system that demonstrates interaction techniques to the users, for instance through visual "guides" such as animated virtual hands. The graphical nature of our approach allows a clear view on the synchronization of multiple inputs, which are typical for multi-touch interaction. User studies indicate that animated help allows users to quickly discover the available interaction possibilities, with a positive effect on collaboration, as users work together to learn the application.

When multiple users interact simultaneously in a highly collaborative setting, additional challenges emerge, especially if the environment can include both co-located and remote participants.
Collaboration may cause conflicts and misconduct, for instance. Therefore, collaborative environments are in need of floor control policies that resolve and prevent such problems gracefully, without interrupting the dynamic workflow. We apply a Focus+Roles approach to a digital meeting system, iConnect: a user's roles define that user's access rights and privileges during particular activities, while tracking the users' focus provides a means of handling problems associated with the typical lack of mutual awareness.

Continuing our research on collaborative systems, we investigate the requirements of a storyboarding tool to support the various disciplines in a multidisciplinary team. Storyboards are well suited to attain a common understanding during user-centered software design and development, because they allow each team member to contribute to the decision-making process. To take full advantage of the different viewpoints and approaches that members of a multidisciplinary team bring to the table, we explore how such teams create storyboards through an observational study. Based on the lessons learned from this study, we formulate a set of requirements to inform the design of a tabletop tool for collaborative storyboarding.

Throughout our exploration of collaborative systems, we repeatedly encountered the need for identification of the different users around a shared surface. Multi-touch hardware can track multiple inputs simultaneously, but the majority of those systems are unable to associate contact points with specific users. Therefore, we present Carpus, a non-intrusive identification technique for mapping touches to their corresponding user by analyzing the back of the users' hands. This feature can improve a multi-user interface in a number of ways, for instance by customizing help on a per-user basis, or by tracking a user's activities to prevent conflicts or unequal contributions.
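The touch-to-user association that Carpus provides can be pictured, as a minimal illustrative sketch, as assigning each contact point to the user whose identified hand is nearest on the surface. All names and coordinates below are hypothetical; the actual system identifies users from back-of-hand imagery rather than simple proximity.

```python
# Illustrative sketch: associating touch points with identified users.
# Hypothetical data and names; not the actual Carpus implementation.
import math

def associate_touches(touches, hands):
    """Map each touch point to the user whose detected hand is closest.

    touches: {touch_id: (x, y)} -- contact points on the surface
    hands:   {user_id: (x, y)}  -- identified hand positions
    """
    mapping = {}
    for touch_id, (tx, ty) in touches.items():
        user = min(hands, key=lambda u: math.dist((tx, ty), hands[u]))
        mapping[touch_id] = user
    return mapping

touches = {1: (100, 100), 2: (400, 120)}
hands = {"alice": (110, 90), "bob": (390, 130)}
print(associate_touches(touches, hands))  # {1: 'alice', 2: 'bob'}
```

Once every touch carries a user identity, per-user features such as customized help or contribution tracking become straightforward lookups on that mapping.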
Another aspect to consider is the design and evaluation of touch-based and multi-user interaction. We investigate how model-based design can facilitate the development process by modeling environments through the use of high-level diagrams. In this context, we discuss NiMMiT, a graphical notation for expressing and evaluating multimodal user interaction. Because NiMMiT is presently focused on single-user 3D virtual environments, we explore how NiMMiT can be extended beyond these boundaries by reflecting on its current limitations with regard to touch-based and multi-user interaction.

In summary, our main contributions include making touch-based interfaces self-explanatory in single-user and multi-user walk-up-and-use environments, and providing conflict-free multi-user environments that can non-intrusively identify the user behind each action, for instance to facilitate collaborative storyboarding. Furthermore, we present a model-based approach to design and evaluate multimodal interaction techniques.
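NiMMiT itself is a graphical notation, but the kind of interaction technique it expresses can be approximated, purely as an analogy, by a state machine in which events trigger tasks and state changes. The states, events, and tasks below are hypothetical examples, not drawn from any actual NiMMiT diagram.

```python
# Analogous sketch of an interaction technique as a state machine:
# events trigger a task and move the technique to a next state.
# All states, events, and task names are hypothetical examples.

TRANSITIONS = {
    # (current state, event) -> (task to execute, next state)
    ("idle", "touch_down"): ("select_object", "selected"),
    ("selected", "drag"): ("move_object", "selected"),
    ("selected", "touch_up"): ("release_object", "idle"),
}

def step(state, event):
    """Look up the task triggered by an event and the resulting state."""
    task, next_state = TRANSITIONS[(state, event)]
    return task, next_state

state = "idle"
task, state = step(state, "touch_down")
print(task, state)  # select_object selected
```

Extending such a model toward multi-user interaction would, among other things, require the state to distinguish which user produced each event, which hints at why per-user identification matters for the notation as well.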
Document URI:
Category: T1
Type: Theses and Dissertations
Appears in Collections:PhD theses
Research publications

Files in This Item:
File: Touch-based Interaction and Collaboration in Walk-up-and-use and Multi-user Environments.pdf
Description: Vanacken Davy PhD
Size: 38.11 MB
Format: Adobe PDF

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.