Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/41266
Title: Incorporating view-dependent effects into point clouds
Authors: Zoomers, Brent 
Advisors: MICHIELS, Nick
JORISSEN, Lode
Issue Date: 2023
Publisher: tUL
Abstract: This thesis explores the challenge of achieving truly realistic and immersive experiences in Extended Reality (XR) environments. We address this by combining the photorealism of images with the geometric accuracy of laser-scanned point clouds. Current techniques, such as real-time ray tracing, have limitations and often require expensive prerequisites. Our proposed method uses readily available images that can be captured with handheld devices and estimates accurate poses for the uncalibrated frames. From these posed images we compute a blending field based on the objectives of the Unstructured Lumigraph paper, which is then used to render view-dependent lighting effects on top of a point cloud. Our final implementation renders view-dependent effects on point clouds in real time, at framerates high enough to be usable in XR applications. We also analyse the individual steps of our approach and motivate the choices made.
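For illustration only, the sketch below (Python/NumPy, not taken from the thesis) shows the angle-only blending weights described in the Unstructured Lumigraph Rendering paper that the abstract refers to. The function name, the choice of k, and the restriction to an angular penalty are assumptions; the thesis' full blending field, additional penalty terms, and GPU implementation are not reproduced here.

    import numpy as np

    def ulr_blend_weights(point, view_pos, cam_positions, k=4):
        """Angle-only Unstructured Lumigraph blending weights for one surface point.

        point         : (3,)   world-space position on the point cloud
        view_pos      : (3,)   position of the novel (desired) viewpoint
        cam_positions : (N, 3) positions of the captured source cameras
        k             : number of source cameras blended per point (assumed)
        Returns (N,) weights summing to 1; non-selected cameras get weight 0.
        """
        to_view = view_pos - point
        to_view /= np.linalg.norm(to_view)

        to_cams = cam_positions - point                       # (N, 3)
        to_cams /= np.linalg.norm(to_cams, axis=1, keepdims=True)

        # Angular penalty: angle between the desired ray and each source-camera ray.
        penalty = np.arccos(np.clip(to_cams @ to_view, -1.0, 1.0))

        order = np.argsort(penalty)
        selected = order[:k]
        # Threshold at the (k+1)-th best penalty so weights fade to zero smoothly
        # as cameras enter or leave the selected set.
        if len(penalty) > k:
            thresh = penalty[order[k]]
        else:
            thresh = penalty[order[-1]] + 1e-6

        w = np.zeros_like(penalty)
        w[selected] = (1.0 - penalty[selected] / thresh) / np.maximum(penalty[selected], 1e-6)
        return w / w.sum()

The 1/penalty factor makes a source camera dominate as the desired ray approaches it, which is what produces convincing view-dependent highlights when the weighted camera images are blended over the point cloud.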
Notes: master in de informatica (Master in Informatics)
Document URI: http://hdl.handle.net/1942/41266
Category: T2
Type: Theses and Dissertations
Appears in Collections:Master theses

Files in This Item:
File: dc45d3a2-d1fc-41be-8b00-8de9beec39f3.pdf
Size: 85.23 MB
Format: Adobe PDF