Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/32780
Title: Age and sex affect deep learning prediction of cardiometabolic risk factors from retinal images
Authors: Gerrits, Nele
Elen, Bart
Van Craenendonck, Toon
Triantafyllidou, Danai
Petropoulos, Ioannis N.
Malik, Rayaz A.
De Boever, Patrick
Issue Date: 2020
Publisher: NATURE PUBLISHING GROUP
Source: Scientific Reports, 10 (1) (Art N° 9432)
Abstract: Deep neural networks can extract clinical information, such as diabetic retinopathy status and individual characteristics (e.g. age and sex), from retinal images. Here, we report the first study to train deep learning models with retinal images from 3,000 Qatari citizens participating in the Qatar Biobank study. We investigated whether fundus images can predict cardiometabolic risk factors, such as age, sex, blood pressure, smoking status, glycaemic status, total lipid panel, sex steroid hormones and bioimpedance measurements. Additionally, the role of age and sex as mediating factors when predicting cardiometabolic risk factors from fundus images was studied. Person-level predictions were made by combining information from an optic disc-centred and a macula-centred image of both eyes with deep learning models using the MobileNet-V2 architecture. An accurate prediction was obtained for age (mean absolute error (MAE): 2.78 years) and sex (area under the curve: 0.97), while acceptable performance was achieved for systolic blood pressure (MAE: 8.96 mmHg), diastolic blood pressure (MAE: 6.84 mmHg), haemoglobin A1c (MAE: 0.61%), relative fat mass (MAE: 5.68 units) and testosterone (MAE: 3.76 nmol/L). We discovered that age and sex were mediating factors when predicting cardiometabolic risk factors from fundus images: deep learning models indirectly predict sex when trained for testosterone, and an influence of age and sex was observed for blood pressure, haemoglobin A1c and relative fat mass. However, the achieved performance cannot be fully explained by the influence of age and sex. In conclusion, we confirm that age and sex can be predicted reliably from a fundus image and that unique information relating to blood pressure, haemoglobin A1c and relative fat mass is stored in the retina. Future research should focus on stratification when predicting person characteristics from a fundus image.
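Method sketch: the person-level fusion described in the abstract can be illustrated with a minimal Keras example. The sketch below is not the authors' released code; the 224 x 224 input size, the single linear regression head (e.g. for age in years) and the simple averaging of the four per-image outputs (optic disc-centred and macula-centred images of both eyes) are illustrative assumptions.

# Minimal sketch (illustrative assumptions, not the authors' code) of
# person-level prediction: four retinal images per participant are each
# passed through a MobileNet-V2 backbone and the per-image outputs averaged.
import numpy as np
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import MobileNetV2

def build_regressor(input_shape=(224, 224, 3)):
    """MobileNet-V2 backbone with a single linear output, e.g. age in years."""
    backbone = MobileNetV2(input_shape=input_shape, include_top=False, weights="imagenet")
    x = layers.GlobalAveragePooling2D()(backbone.output)
    out = layers.Dense(1, activation=None)(x)  # linear head for a continuous risk factor
    return Model(backbone.input, out)

model = build_regressor()

# Four images per person: optic-disc- and macula-centred, left and right eye.
# Placeholder data; real images would be preprocessed with
# tensorflow.keras.applications.mobilenet_v2.preprocess_input.
person_images = np.random.rand(4, 224, 224, 3).astype("float32")
per_image_preds = model.predict(person_images, verbose=0)  # shape (4, 1)
person_level_pred = per_image_preds.mean()                 # simple average fusion
print(f"Person-level prediction: {person_level_pred:.2f}")

For a binary target such as sex, the head would instead be a single sigmoid unit and performance would be reported as area under the ROC curve, as with the AUC of 0.97 in the abstract; continuous targets such as blood pressure and haemoglobin A1c are reported as mean absolute error.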
Notes: Gerrits, N (corresponding author), VITO NV, Unit Hlth, Mol, Belgium. nele.gerrits@vito.be
Document URI: http://hdl.handle.net/1942/32780
ISSN: 2045-2322
e-ISSN: 2045-2322
DOI: 10.1038/s41598-020-65794-4
ISI #: WOS:000560478900048
Rights: The Author(s) 2020. Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.
Category: A1
Type: Journal Contribution
Validations: ecoom 2021
Appears in Collections:Research publications

Files in This Item:
File: s41598-020-65794-4.pdf (Published version, 2.09 MB, Adobe PDF)

Web of Science™ Citations: 29 (checked on Apr 15, 2024)
Page view(s): 18 (checked on Sep 7, 2022)
Download(s): 8 (checked on Sep 7, 2022)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.