Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/43033
Full metadata record
DC Field: Value
dc.contributor.author: MICHIELS, Nick
dc.contributor.author: JORISSEN, Lode
dc.contributor.author: PUT, Jeroen
dc.contributor.author: LIESENBORGS, Jori
dc.contributor.author: Vandebroeck, Isjtar
dc.contributor.author: Joris, Eric
dc.contributor.author: VAN REETH, Frank
dc.date.accessioned: 2024-06-03T07:37:29Z
dc.date.available: 2024-06-03T07:37:29Z
dc.date.issued: 2024
dc.date.submitted: 2024-05-28T11:48:39Z
dc.identifier.citation: VIRTUAL REALITY, 28 (2) (Art N° 106)
dc.identifier.issn: 1359-4338
dc.identifier.uri: http://hdl.handle.net/1942/43033
dc.description.abstract: Extended reality (XR) experiences are on the verge of becoming widely adopted in diverse application domains. An essential part of the technology is accurate tracking and localization of the headset to create an immersive experience. A subset of the applications require perfect co-location between the real and the virtual world, where virtual objects are aligned with real-world counterparts. Current headsets support co-location for small areas, but suffer from drift when scaling up to larger ones such as buildings or factories. This paper proposes tools and solutions for this challenge by splitting up the simultaneous localization and mapping (SLAM) into separate mapping and localization stages. In the pre-processing stage, a feature map is built for the entire tracking area. A global optimizer is applied to correct the deformations caused by drift, guided by a sparse set of ground truth markers in the point cloud of a laser scan. Optionally, further refinement is applied by matching features between the ground truth keyframe images and their rendered-out SLAM estimates of the point cloud. In the second, real-time stage, the rectified feature map is used to perform localization and sensor fusion between the global tracking and the headset. The results show that the approach achieves robust co-location between the virtual and the real 3D environment for large and complex tracking environments.
dc.description.sponsorship: This research was funded by the European Union (HORIZON MAX-R, Mixed Augmented and Extended Reality Media Pipeline, 101070072) and the Flanders Make's XRTwin SBO project (R-12528).
dc.language.iso: en
dc.publisher: SPRINGER LONDON LTD
dc.rights: © The Author(s) 2024. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
dc.subject.other: Tracking
dc.subject.other: Colocation
dc.subject.other: Large-area
dc.subject.other: Point cloud
dc.subject.other: Registration
dc.title: Tracking and co-location of global point clouds for large-area indoor environments
dc.type: Journal Contribution
dc.identifier.issue: 2
dc.identifier.volume: 28
local.format.pages: 16
local.bibliographicCitation.jcat: A1
dc.description.notes: Michiels, N (corresponding author), Hasselt Univ Flanders Make, Expertise Ctr Digital Media, Wetenschapspk 2, B-3590 Diepenbeek, Belgium.
dc.description.notes: nick.michiels@uhasselt.be; lode.jorissen@uhasselt.be; jeroen.put@uhasselt.be; jori.liesenborgs@uhasselt.be; isjtar@crew.brussels; eric.joris@crew.brussels; frank.vanreeth@uhasselt.be
local.publisher.place: 236 GRAYS INN RD, 6TH FLOOR, LONDON WC1X 8HL, ENGLAND
local.type.refereed: Refereed
local.type.specified: Article
local.bibliographicCitation.artnr: 106
local.type.programme: horizonEurope
dc.identifier.doi: 10.1007/s10055-024-01004-0
dc.identifier.isi: 001221464700001
dc.identifier.eissn: 1434-9957
local.provider.type: wosris
local.description.affiliation: [Michiels, Nick; Jorissen, Lode; Put, Jeroen; Liesenborgs, Jori; Van Reeth, Frank] Hasselt Univ Flanders Make, Expertise Ctr Digital Media, Wetenschapspk 2, B-3590 Diepenbeek, Belgium.
local.description.affiliation: [Vandebroeck, Isjtar; Joris, Eric] CREW Brussels, CREW, Vandernootstr 23-8, B-1080 Sint Jans Molenbeek, Belgium.
local.uhasselt.international: no
local.relation.horizonEurope: 101070072
item.contributor: MICHIELS, Nick
item.contributor: JORISSEN, Lode
item.contributor: PUT, Jeroen
item.contributor: LIESENBORGS, Jori
item.contributor: Vandebroeck, Isjtar
item.contributor: Joris, Eric
item.contributor: VAN REETH, Frank
item.fullcitation: MICHIELS, Nick; JORISSEN, Lode; PUT, Jeroen; LIESENBORGS, Jori; Vandebroeck, Isjtar; Joris, Eric & VAN REETH, Frank (2024) Tracking and co-location of global point clouds for large-area indoor environments. In: VIRTUAL REALITY, 28 (2) (Art N° 106).
item.fulltext: With Fulltext
item.accessRights: Open Access
crisitem.journal.issn: 1359-4338
crisitem.journal.eissn: 1434-9957
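The abstract above describes correcting drift-induced deformation by aligning the SLAM-built feature map to sparse ground truth markers from a laser scan. As a much-simplified illustration of that idea (not the paper's actual global optimizer, which handles non-rigid deformation), the sketch below estimates a least-squares rigid transform between matched marker positions using the Kabsch algorithm; all names and data here are hypothetical.

```python
# Hypothetical sketch: rigidly re-align a drifted SLAM map onto ground-truth
# marker positions. A stand-in for the paper's global optimization stage,
# assuming matched 3D marker coordinates in both frames.
import numpy as np

def kabsch_align(slam_pts, truth_pts):
    """Least-squares rigid transform (R, t) mapping slam_pts onto truth_pts."""
    mu_s = slam_pts.mean(axis=0)
    mu_t = truth_pts.mean(axis=0)
    H = (slam_pts - mu_s).T @ (truth_pts - mu_t)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_t - R @ mu_s
    return R, t

# Example: markers whose SLAM estimates drifted by a small rotation + offset.
rng = np.random.default_rng(0)
truth = rng.uniform(0.0, 10.0, size=(6, 3))
theta = 0.1
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
drifted = truth @ Rz.T + np.array([0.5, -0.2, 0.1])

R, t = kabsch_align(drifted, truth)
realigned = drifted @ R.T + t        # markers snap back onto ground truth
```

A single rigid transform cannot undo the accumulated, spatially varying drift the paper targets, which is why its pipeline deforms the map globally and optionally refines with keyframe feature matching; this sketch only shows the registration primitive such a pipeline builds on.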
Appears in Collections: Research publications
Files in This Item:
File: s10055-024-01004-0.pdf (Published version, 2.42 MB, Adobe PDF)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.