Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/12014
Full metadata record
dc.contributor.author: OCTAVIA, Johanna
dc.contributor.author: RAYMAEKERS, Chris
dc.contributor.author: CONINX, Karin
dc.date.accessioned: 2011-06-23T07:15:50Z
dc.date.available: NO_RESTRICTION
dc.date.available: 2011-06-23T07:15:50Z
dc.date.issued: 2011
dc.identifier.citation: MULTIMEDIA TOOLS AND APPLICATIONS, 54(1), p. 121-142
dc.identifier.issn: 1380-7501
dc.identifier.uri: http://hdl.handle.net/1942/12014
dc.description.abstract: When interacting in a virtual environment, users are confronted with a number of interaction techniques. These interaction techniques may complement each other, but in some circumstances can be used interchangeably. Because of this, it is difficult for the user to determine which interaction technique to use. Furthermore, the use of multimodal feedback, such as haptics and sound, has proven beneficial for some, but not all, users. This complicates the development of such a virtual environment, as designers are unsure about the implications of adding interaction techniques and multimodal feedback. A promising approach to solving this problem lies in the use of adaptation and personalization. By incorporating knowledge of a user's preferences and habits, the user interface should adapt to the current context of use. This could mean that only a subset of all possible interaction techniques is presented to the user. Alternatively, the interaction techniques themselves could be adapted, e.g. by changing the sensitivity or the nature of the feedback. In this paper, we propose a conceptual framework for realizing adaptive personalized interaction in virtual environments. We also discuss how to establish, verify and apply a user model, which forms the first and important step in implementing the proposed conceptual framework. This study results in general and individual user models, which are then verified to benefit users interacting in virtual environments. Furthermore, we conduct an investigation to examine how users react to a specific type of adaptation in virtual environments (i.e. switching between interaction techniques). When an adaptation is integrated into a virtual environment, users respond positively to it: their performance significantly improves and their level of frustration decreases.
dc.language.iso: en
dc.publisher: SPRINGER
dc.subject.other: Virtual environments; Adaptation; Framework; User model
dc.title: Adaptation in virtual environments: conceptual framework and user models
dc.type: Journal Contribution
dc.identifier.epage: 142
dc.identifier.issue: 1
dc.identifier.spage: 121
dc.identifier.volume: 54
local.format.pages: 22
local.bibliographicCitation.jcat: A1
dc.description.notes: [Octavia, Johanna Renny; Raymaekers, Chris; Coninx, Karin] Hasselt Univ TUL IBBT, Expertise Ctr Digital Media, B-3590 Diepenbeek, Belgium
local.type.refereed: Refereed
local.type.specified: Article
dc.bibliographicCitation.oldjcat: A1
dc.identifier.doi: 10.1007/s11042-010-0525-z
dc.identifier.isi: 000291061100007
item.accessRights: Open Access
item.fullcitation: OCTAVIA, Johanna; RAYMAEKERS, Chris & CONINX, Karin (2011) Adaptation in virtual environments: conceptual framework and user models. In: MULTIMEDIA TOOLS AND APPLICATIONS, 54(1), p. 121-142.
item.contributor: OCTAVIA, Johanna
item.contributor: RAYMAEKERS, Chris
item.contributor: CONINX, Karin
item.fulltext: With Fulltext
item.validation: ecoom 2012
crisitem.journal.issn: 1380-7501
crisitem.journal.eissn: 1573-7721
Appears in Collections:Research publications
Files in This Item:
File: conceptual framework.pdf (Peer-reviewed author version, 1.17 MB, Adobe PDF)

SCOPUS™ Citations: 9 (checked on Sep 3, 2020)
Web of Science™ Citations: 4 (checked on Apr 22, 2024)
Page view(s): 104 (checked on Sep 6, 2022)
Download(s): 208 (checked on Sep 6, 2022)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.