Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/13872
Title: Immersive GPU-driven biological adaptive stereoscopic rendering
Authors: ROGMANS, Sammy 
DUMONT, Maarten 
LAFRUIT, Gauthier 
BEKAERT, Philippe 
Issue Date: 2011
Source: Proceedings of 3D Stereo Media
Abstract: In this paper, we want to sensitize 3D content developers and researchers to broadening the scope of parameters they take into account when generating 3D content. State-of-the-art perceptual research has already shown that monocular visual cues contribute substantially to the very fundamentals of 3D perception, and that binocular cues are merely linked to them in order to create a rich depth experience. In this context, we present an overview of the research concerning our teleconferencing system, which is able to recreate biological stereoscopic input without losing consistency in all related monocular cues such as accommodation, occlusion, size (gradient), motion parallax, texture gradient and linear perspective. The system adapts in real time by performing both GPU-driven analysis and rendering, based on the physical parameters of the system user.
Document URI: http://hdl.handle.net/1942/13872
Category: C2
Type: Proceedings Paper
Appears in Collections: Research publications
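
Note: the record above does not include implementation details. As a rough, hypothetical illustration of the kind of viewer-adaptive stereoscopic rendering the abstract describes, the sketch below derives off-axis (asymmetric-frustum) projection parameters for each eye from a viewer's physical measurements (interpupillary distance and viewing distance), so that the zero-parallax plane stays on the physical screen and monocular cues such as size and linear perspective remain consistent with binocular disparity. All names, values, and the frustum construction are assumptions for illustration only and are not taken from the paper.

// Minimal sketch (not from the paper): off-axis stereoscopic projection
// parameters derived from a viewer's physical measurements.
#include <cstdio>

struct Frustum {
    double left, right, bottom, top, near_plane, far_plane;
};

struct Viewer {
    double ipd_m;       // interpupillary distance in metres (hypothetical)
    double distance_m;  // viewer-to-screen distance in metres (hypothetical)
};

struct Screen {
    double width_m;     // physical display width in metres
    double height_m;    // physical display height in metres
};

// Build the asymmetric (off-axis) frustum for one eye by projecting the
// physical screen edges, as seen from that eye, onto the near plane.
Frustum eyeFrustum(const Viewer& v, const Screen& s, double eye_sign,
                   double near_plane, double far_plane) {
    // Horizontal eye offset from the head centre (+0.5 IPD right, -0.5 left).
    const double eye_x = eye_sign * 0.5 * v.ipd_m;
    // Scale factor that maps screen-plane coordinates to the near plane.
    const double scale = near_plane / v.distance_m;
    Frustum f;
    f.left   = (-0.5 * s.width_m  - eye_x) * scale;
    f.right  = ( 0.5 * s.width_m  - eye_x) * scale;
    f.bottom = (-0.5 * s.height_m) * scale;
    f.top    = ( 0.5 * s.height_m) * scale;
    f.near_plane = near_plane;
    f.far_plane  = far_plane;
    return f;
}

int main() {
    Viewer viewer{0.063, 0.80};   // hypothetical user measurements
    Screen screen{0.52, 0.32};    // hypothetical display size
    Frustum left  = eyeFrustum(viewer, screen, -1.0, 0.1, 100.0);
    Frustum right = eyeFrustum(viewer, screen, +1.0, 0.1, 100.0);
    std::printf("left eye frustum:  l=%.4f r=%.4f\n", left.left, left.right);
    std::printf("right eye frustum: l=%.4f r=%.4f\n", right.left, right.right);
    return 0;
}

In a system like the one described in the abstract, such parameters would presumably be updated in real time as the tracked viewer moves, but the tracking and GPU analysis pipeline itself is not reconstructed here.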

Files in This Item:
File: srogmans-3dsm2010.pdf (3.8 MB, Adobe PDF)