Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/46140
Full metadata record
DC Field                                     Value
dc.contributor.author                        VANHERCK, Joni
dc.contributor.author                        ZOOMERS, Brent
dc.contributor.author                        MERTENS, Tom
dc.contributor.author                        JORISSEN, Lode
dc.contributor.author                        MICHIELS, Nick
dc.date.accessioned                          2025-06-12T08:48:21Z
dc.date.available                            2025-06-12T08:48:21Z
dc.date.issued                               2025
dc.date.submitted                            2025-05-21T07:22:46Z
dc.identifier.citation                       The Eurographics Association
dc.identifier.isbn                           9783038682684
dc.identifier.issn                           1017-4656
dc.identifier.uri                            http://hdl.handle.net/1942/46140
dc.description.abstract                      Static LiDAR scanners produce accurate, dense, colored point clouds, but these often contain obtrusive artifacts that make them ill-suited for direct display. We propose an efficient method to render perceptually realistic images of such scans without any expensive preprocessing or training of a scene-specific model. A naive projection of the point cloud to the output view using 1×1 pixels is fast and retains the available detail, but it also results in unintelligible renderings, as background points leak between the foreground pixels. The key insight is that these projections can be transformed into a more realistic result using a deep convolutional model in the form of a U-Net, together with a depth-based heuristic that prefilters the data. The U-Net also handles LiDAR-specific problems such as missing parts due to occlusion, color inconsistencies, and varying point densities. We also describe a method to generate synthetic training data to deal with imperfectly aligned ground-truth images. Our method achieves real-time rendering rates on an off-the-shelf GPU and outperforms the state of the art in both speed and quality.
dc.language.iso                              en
dc.publisher                                 The Eurographics Association
dc.subject.other                             Rendering
dc.subject.other                             Neural networks
dc.title                                     Real-time Neural Rendering of LiDAR Point Clouds
dc.type                                      Proceedings Paper
local.bibliographicCitation.conferencedate   2025, May 12-16
local.bibliographicCitation.conferencename   Eurographics
local.bibliographicCitation.conferenceplace  London, UK
local.format.pages                           4
local.bibliographicCitation.jcat             C1
local.type.refereed                          Refereed
local.type.specified                         Proceedings Paper
local.type.programme                         horizonEurope
dc.identifier.doi                            10.2312/egs.20251041
dc.identifier.url                            https://diglib.eg.org/handle/10.2312/egs20251041
local.provider.type                          Pdf
local.uhasselt.international                 no
local.relation.horizonEurope                 101070072
item.contributor                             VANHERCK, Joni
item.contributor                             ZOOMERS, Brent
item.contributor                             MERTENS, Tom
item.contributor                             JORISSEN, Lode
item.contributor                             MICHIELS, Nick
item.fullcitation                            VANHERCK, Joni; ZOOMERS, Brent; MERTENS, Tom; JORISSEN, Lode & MICHIELS, Nick (2025) Real-time Neural Rendering of LiDAR Point Clouds. In: The Eurographics Association.
item.fulltext                                With Fulltext
item.accessRights                            Open Access
Appears in Collections: Research publications

Files in This Item:
File              Description        Size      Format
egs20251041.pdf   Published version  52.06 MB  Adobe PDF
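The abstract describes projecting the point cloud to the output view with 1×1 pixels before a depth-based heuristic and a U-Net refine the result. As a rough illustration only (this is a hypothetical helper, not the paper's actual pipeline, and `project_zbuffer`, its parameters, and the simple nearest-point z-buffer are assumptions), such a naive projection might look like:

```python
import numpy as np

def project_zbuffer(points, colors, K, R, t, h, w):
    """Project a colored point cloud to an h x w image using 1x1 pixels,
    keeping only the nearest point per pixel (simple z-buffer).
    Hypothetical sketch; background leakage between foreground pixels,
    which the paper's heuristic and U-Net address, is not handled here."""
    cam = (R @ points.T + t[:, None]).T           # world -> camera coordinates
    z = cam[:, 2]
    valid = z > 1e-6                              # keep points in front of the camera
    uv = (K @ cam[valid].T).T                     # apply intrinsics
    u = (uv[:, 0] / uv[:, 2]).astype(int)
    v = (uv[:, 1] / uv[:, 2]).astype(int)
    z, col = z[valid], colors[valid]
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    u, v, z, col = u[inside], v[inside], z[inside], col[inside]

    depth = np.full((h, w), np.inf)
    image = np.zeros((h, w, 3))
    # Write far-to-near so the last write per pixel is the nearest point.
    order = np.argsort(-z)
    depth[v[order], u[order]] = z[order]
    image[v[order], u[order]] = col[order]
    return image, depth
```

Sorting far-to-near and relying on NumPy's last-write-wins assignment for duplicate indices gives a compact z-buffer without an explicit per-pixel comparison loop.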