Please use this identifier to cite or link to this item:
http://hdl.handle.net/1942/10264
Title: Efficient distribution of emotion-related data through a networked virtual environment architecture
Authors: QUAX, Peter; DI FIORE, Fabian; LAMOTTE, Wim; VAN REETH, Frank
Issue Date: 2009
Publisher: JOHN WILEY & SONS LTD
Source: COMPUTER ANIMATION AND VIRTUAL WORLDS, 20(5-6), p. 501-510
Abstract: In this paper we describe how emotion-related data can be exchanged efficiently between participants in a large-scale networked virtual environment. This type of metadata is extracted from video streams captured in real time with off-the-shelf webcams and applied to a two-dimensional (2D) stylised avatar, thereby improving the immersion the user experiences while navigating and communicating in the virtual world. Because emotion-related data, once processed through the system, can be considered a specific type of state information, a generic networked virtual environment architecture can be used to distribute the information between participants. We have opted to extend the in-house developed Architecture for Large-scale Virtual Interactive Communities (ALVIC-NG) to process the new information flows. We will show that the inclusion of this new type of information does not have a detrimental effect on the scalability of the system. Copyright (C) 2009 John Wiley & Sons, Ltd.
Notes: [Di Fiore, Fabian] Hasselt Univ, tUL IBBT, Expertise Ctr Digital Media, BE-3590 Diepenbeek, Belgium. [Di Fiore, Fabian] Univ Antwerp, Antwerp, Belgium.
Keywords: networked virtual environments; immersive communication; scalability; facial animation; emotions; avatars; MPEG-4
Document URI: http://hdl.handle.net/1942/10264
ISSN: 1546-4261
e-ISSN: 1546-427X
DOI: 10.1002/cav.278
ISI #: 000271559700003
Category: A1
Type: Journal Contribution
Validations: ecoom 2010
Appears in Collections: Research publications