Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/14245
Title: Carpus: A Non-Intrusive User Identification Technique for Interactive Surfaces
Authors: RAMAKERS, Raf 
VANACKEN, Davy 
LUYTEN, Kris 
SCHOENING, Johannes 
CONINX, Karin 
Issue Date: 2012
Publisher: ACM
Source: Miller, Rob; Benko, Hrvoje; Latulipe, Celine (Eds.). Proceedings of UIST 2012, the 25th ACM Symposium on User Interface Software and Technology, p. 35-44
Abstract: Interactive surfaces have great potential for co-located collaboration because of their ability to track multiple inputs simultaneously. However, the multi-user experience on these devices could be enriched significantly if touch points could be associated with a particular user. Existing approaches to user identification are intrusive, require users to stay in a fixed position, or suffer from poor accuracy. We present a non-intrusive, high-accuracy technique for mapping touches to their corresponding user in a collaborative environment. By mounting a high-resolution camera above the interactive surface, we are able to identify touches reliably without any extra instrumentation, and users are able to move around the surface at will. Our technique, which leverages the back of users’ hands as identifiers, supports walk-up-and-use situations in which multiple people interact on a shared surface.
Keywords: Interactive tabletops; surface computing; multi-touch interaction; multi-user applications; user identification.
Document URI: http://hdl.handle.net/1942/14245
ISBN: 978-1-4503-1580-7
DOI: 10.1145/2380116.2380123
ISI #: 000324815300005
Category: C1
Type: Proceedings Paper
Validations: ecoom 2014
Appears in Collections:Research publications

Files in This Item:
File: Carpus.pdf (Description: Carpus; Size: 2.51 MB; Format: Adobe PDF)

Web of Science™ citations: 17 (checked on May 21, 2022)
Page view(s): 32 (checked on May 20, 2022)
Download(s): 8 (checked on May 20, 2022)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.