Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/45763
Full metadata record
DC Field | Value | Language
dc.contributor.author | VAN DEN BOGAART, Maud | -
dc.contributor.author | JACOBS, Nina | -
dc.contributor.author | Hallemans, Ann | -
dc.contributor.author | MEYNS, Pieter | -
dc.date.accessioned | 2025-03-31T12:52:09Z | -
dc.date.available | 2025-03-31T12:52:09Z | -
dc.date.issued | 2025 | -
dc.date.submitted | 2025-03-21T13:54:14Z | -
dc.identifier.citation | Applied Sciences, 15 (7) (Art N° 3428) | -
dc.identifier.issn | 2076-3417 | -
dc.identifier.uri | http://hdl.handle.net/1942/45763 | -
dc.description.abstract | Proprioceptive deficits can lead to impaired motor performance. It is therefore important to measure proprioceptive function accurately in order to identify deficits as early as possible. Techniques based on deep learning that track body landmarks in simple video recordings are promising for assessing proprioception (joint position sense) during joint position reproduction (JPR) tests in clinical settings, outside the laboratory and without the need to attach markers. Fifteen typically developing children participated in 90 knee JPR trials and 21 typically developing children participated in 126 hip JPR trials. The concurrent validity of two-dimensional deep-learning-based motion capture (DeepLabCut) for measuring the Joint Reproduction Error (JRE) was assessed against laboratory-based optoelectronic three-dimensional motion capture (Vicon motion capture system, the gold standard). There was no significant difference between the hip and knee JRE measured with DeepLabCut and with Vicon. Two-dimensional deep-learning-based motion capture (DeepLabCut) is thus valid for assessing proprioception in typically developing children with respect to the gold standard. Deep-learning-based tools such as DeepLabCut make it possible to accurately measure joint angles, and thereby assess proprioception, without a laboratory or attached markers and with a high level of automation. | -
dc.description.sponsorship | Funding: NJ and this work were supported by the Research Foundation—Flanders (FWO) (grant number 92836, 2021) and the Special Research Fund (BOF) for Small Research Project—Hasselt University (BOF19KP08), respectively. Acknowledgments: The authors would like to thank all children and parents who volunteered and participated in this study, as well as the school and master’s students who collaborated and assisted with the recruitment of the children. | -
dc.language.iso | en | -
dc.rights | © 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). | -
dc.subject.other | proprioception | -
dc.subject.other | joint position sense | -
dc.subject.other | validity | -
dc.subject.other | deep-learning-based motion capture | -
dc.subject.other | pose estimation | -
dc.subject.other | DeepLabCut | -
dc.title | Validity of Deep Learning-Based Motion Capture Using DeepLabCut to Assess Proprioception in Children | -
dc.type | Journal Contribution | -
dc.identifier.issue | 7 | -
dc.identifier.spage | 3428 | -
dc.identifier.volume | 15 | -
local.bibliographicCitation.jcat | A1 | -
local.type.refereed | Refereed | -
local.type.specified | Article | -
local.bibliographicCitation.artnr | 3428 | -
dc.identifier.doi | 10.3390/app15073428 | -
dc.identifier.isi | 001463693900001 | -
local.provider.type | Pdf | -
local.dataset.url | https://www.mdpi.com/article/10.3390/app15073428/s1 | -
local.uhasselt.international | no | -
item.contributor | VAN DEN BOGAART, Maud | -
item.contributor | JACOBS, Nina | -
item.contributor | Hallemans, Ann | -
item.contributor | MEYNS, Pieter | -
item.fullcitation | VAN DEN BOGAART, Maud; JACOBS, Nina; Hallemans, Ann & MEYNS, Pieter (2025) Validity of Deep Learning-Based Motion Capture Using DeepLabCut to Assess Proprioception in Children. In: Applied Sciences, 15 (7) (Art N° 3428). | -
item.fulltext | With Fulltext | -
item.accessRights | Open Access | -
crisitem.journal.eissn | 2076-3417 | -
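Note on the method described in the abstract: joint angles are derived from landmarks tracked in video, and the Joint Reproduction Error (JRE) is the difference between a target joint position and the child's reproduction of it. The following is a minimal Python sketch of that kind of post-processing, not code from the article itself; the keypoint coordinates, function name, and use of plain NumPy are illustrative assumptions.

    import numpy as np

    def joint_angle_deg(proximal, joint, distal):
        # Angle (degrees) at `joint` formed by the 2D points proximal-joint-distal,
        # e.g. hip-knee-ankle keypoints tracked by a pose-estimation model.
        v1 = np.asarray(proximal, dtype=float) - np.asarray(joint, dtype=float)
        v2 = np.asarray(distal, dtype=float) - np.asarray(joint, dtype=float)
        cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

    # Hypothetical normalized image coordinates for a knee JPR trial.
    target = joint_angle_deg((0.42, 0.30), (0.45, 0.55), (0.45, 0.80))      # target position
    reproduced = joint_angle_deg((0.42, 0.30), (0.45, 0.55), (0.50, 0.78))  # reproduced position
    jre = abs(target - reproduced)  # Joint Reproduction Error in degrees
    print(f"JRE = {jre:.1f} deg")

In the study, such angles would be computed per trial from both the DeepLabCut and Vicon data and the resulting JREs compared; the sketch only illustrates the angle-difference step.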
Appears in Collections: Research publications
Files in This Item:
File | Description | Size | Format
applsci-15-03428.pdf | Published version | 1.65 MB | Adobe PDF