Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/47321
Full metadata record
DC Field: Value
dc.contributor.author: Mijnendonckx, Yara
dc.contributor.author: OVERDULVE, Kristof
dc.contributor.author: MICHIELS, Nick
dc.date.accessioned: 2025-09-15T13:49:18Z
dc.date.available: 2025-09-15T13:49:18Z
dc.date.issued: 2025
dc.date.submitted: 2025-08-29T09:08:31Z
dc.identifier.citation: Extended Reality: Proceedings, p. 58-71
dc.identifier.isbn: 978-3-031-97762-6
dc.identifier.isbn: 978-3-031-97763-3
dc.identifier.issn: 0302-9743
dc.identifier.issn: 1611-3349
dc.identifier.uri: http://hdl.handle.net/1942/47321
dc.description.abstract: Speech Supported by Gestures (in Dutch, Spreken met Ondersteuning van Gebaren; SMOG) is a Belgian sign system that enhances verbal communication for individuals with communicative disabilities through specific gestures. Although effective, SMOG training is labor-intensive and typically requires one-on-one instruction. Virtual reality (VR) training can make the practice of SMOG gestures more scalable, playful, and cost-effective. For such VR training to be effective, trainees should receive accurate and timely automated feedback on whether they perform the correct gestures. However, since SMOG and many other specialized motor skills are niche problems, there are no large annotated datasets from which to train machine learning models for this task from scratch, and collecting large amounts of data for such niche tasks is infeasible. We therefore propose using transfer learning to fine-tune pre-trained Mobile Video Networks (MoViNets) on a small dataset of RGB videos showing SMOG gestures. Through this workflow, we demonstrate recognition accuracies exceeding 99% using only two to five samples per gesture for training. This work not only advances accessible SMOG training through autonomous VR practice but also establishes a highly data- and computation-efficient machine-learning framework for recognizing other niche sign systems or motor skills using limited amounts of training data.
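The abstract describes fine-tuning a pretrained video network on only two to five clips per gesture. As an illustration of that general few-shot transfer-learning pattern (not the authors' actual code), here is a minimal numpy-only sketch: a frozen random projection stands in for the pretrained MoViNet backbone, and only a small softmax classification head is trained on the few labeled samples. All sizes and the synthetic "gesture" data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen pretrained backbone (a real setup would use a
# pretrained MoViNet feature extractor): a fixed projection from raw
# clip features to an embedding, never updated during training.
D_RAW, D_FEAT, N_CLASSES, SHOTS = 512, 64, 5, 3  # hypothetical sizes
W_backbone = rng.normal(size=(D_RAW, D_FEAT))    # frozen weights

def extract_features(clips):
    """Frozen feature extraction; only the head below is trained."""
    return np.tanh(clips @ W_backbone)

# Tiny synthetic dataset: SHOTS clips per class, each class centred on
# its own prototype (mimicking 2-5 recorded samples per gesture).
prototypes = rng.normal(size=(N_CLASSES, D_RAW))
X = np.vstack([p + 0.1 * rng.normal(size=(SHOTS, D_RAW)) for p in prototypes])
y = np.repeat(np.arange(N_CLASSES), SHOTS)

# Trainable classification head: multinomial logistic regression.
F = extract_features(X)
W_head = np.zeros((D_FEAT, N_CLASSES))

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

for _ in range(200):                       # plain gradient descent
    probs = softmax(F @ W_head)
    onehot = np.eye(N_CLASSES)[y]
    grad = F.T @ (probs - onehot) / len(y)
    W_head -= 0.5 * grad                   # only the head is updated

preds = np.argmax(F @ W_head, axis=1)
print("train accuracy:", (preds == y).mean())
```

Because the backbone stays frozen, the number of trained parameters is just `D_FEAT × N_CLASSES`, which is why a handful of samples per class can suffice; actual fine-tuning (as in the paper) would additionally update some backbone layers with a small learning rate.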
dc.description.sponsorship: We thank Yasmine Wauthier from the Expertise Center for Lifelong Learning and Innovation (OLLI), the Orthopedagogical Guidance Graduate Program of the AP University of Applied Arts and Sciences, and SMOG vzw for their support. This work was made possible with the support of MAXVR-INFRA, a scalable and flexible infrastructure that facilitates the transition to digital-physical work environments.
dc.language.iso: en
dc.rights: The Author(s), under exclusive license to Springer Nature Switzerland AG 2026
dc.subject.other: Sign System Recognition
dc.subject.other: VR training
dc.subject.other: Machine learning
dc.subject.other: Transfer Learning
dc.subject.other: SMOG
dc.title: Leveraging Transfer Learning for Niche Sign System Recognition in VR Training with Limited Data
dc.type: Proceedings Paper
local.bibliographicCitation.conferencedate: 2025, June 17-20
local.bibliographicCitation.conferencename: International Conference on Extended Reality (XR Salento)
local.bibliographicCitation.conferenceplace: Otranto, Italy
dc.identifier.epage: 71
dc.identifier.spage: 58
local.bibliographicCitation.jcat: C1
local.type.refereed: Refereed
local.type.specified: Proceedings Paper
local.relation.ispartofseriesnr: 15737
dc.identifier.doi: 10.1007/978-3-031-97763-3_5
local.provider.type: CrossRef
local.bibliographicCitation.btitle: Extended Reality: Proceedings
local.uhasselt.international: no
item.fullcitation: Mijnendonckx, Yara; OVERDULVE, Kristof & MICHIELS, Nick (2025) Leveraging Transfer Learning for Niche Sign System Recognition in VR Training with Limited Data. In: Extended Reality: Proceedings, p. 58-71.
item.fulltext: With Fulltext
item.contributor: Mijnendonckx, Yara
item.contributor: OVERDULVE, Kristof
item.contributor: MICHIELS, Nick
item.accessRights: Restricted Access
Appears in Collections: Research publications
Files in This Item:
File: 978-3-031-97763-3 kopie copy.pdf (Restricted Access)
Description: Published version
Size: 4.9 MB
Format: Adobe PDF