Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/36204
Full metadata record
dc.contributor.author: MORALES HERNANDEZ, Alejandro
dc.contributor.author: NAPOLES RUIZ, Gonzalo
dc.contributor.author: Jastrzebska, Agnieszka
dc.contributor.author: Salgueiro, Yamisleydi
dc.contributor.author: VANHOOF, Koen
dc.date.accessioned: 2021-12-15T11:11:41Z
dc.date.available: 2021-12-15T11:11:41Z
dc.date.issued: 2021
dc.date.submitted: 2021-12-09T10:08:11Z
dc.identifier.citation: EXPERT SYSTEMS WITH APPLICATIONS, 205, (Art. N° 117721)
dc.identifier.issn: 0957-4174
dc.identifier.uri: http://hdl.handle.net/1942/36204
dc.description.abstract: Forecasting windmill time series is often the basis of other processes such as anomaly detection, health monitoring, or maintenance scheduling. The amount of data generated by windmill farms makes online learning the most viable strategy to follow. Such settings require retraining the model each time a new batch of data is available. However, updating the model with new information is often very expensive when using traditional Recurrent Neural Networks (RNNs). In this paper, we use Long Short-term Cognitive Networks (LSTCNs) to forecast windmill time series in online settings. These recently introduced neural systems consist of chained Short-term Cognitive Network blocks, each processing a temporal data chunk. The learning algorithm of these blocks is based on a very fast, deterministic learning rule that makes LSTCNs suitable for online learning tasks. The numerical simulations using a case study involving four windmills showed that our approach reported the lowest forecasting errors with respect to a simple RNN, a Long Short-term Memory, a Gated Recurrent Unit, and a Hidden Markov Model. What is perhaps more important is that the LSTCN approach is significantly faster than these state-of-the-art models. (An illustrative sketch of the chained-block idea follows the metadata listing below.)
dc.language.iso: en
dc.publisher: PERGAMON-ELSEVIER SCIENCE LTD
dc.rights: This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
dc.subject.other: long short-term cognitive network
dc.subject.other: recurrent neural network
dc.subject.other: multivariate time series
dc.subject.other: forecasting
dc.title: Online learning of windmill time series using Long Short-term Cognitive Networks
dc.type: Journal Contribution
dc.identifier.volume: 205
local.bibliographicCitation.jcat: A1
local.publisher.place: THE BOULEVARD, LANGFORD LANE, KIDLINGTON, OXFORD OX5 1GB, ENGLAND
local.type.refereed: Refereed
local.type.specified: Article
local.bibliographicCitation.artnr: 117721
dc.identifier.doi: 10.1016/j.eswa.2022.117721
dc.identifier.arxiv: https://arxiv.org/abs/2107.00425
dc.identifier.isi: 000832961500005
dc.identifier.eissn: 1873-6793
local.provider.type: Pdf
local.uhasselt.uhpub: yes
local.dataset.url: https://opendata-renewables.engie.com/explore/index
local.uhasselt.international: yes
item.contributor: MORALES HERNANDEZ, Alejandro
item.contributor: NAPOLES RUIZ, Gonzalo
item.contributor: Jastrzebska, Agnieszka
item.contributor: Salgueiro, Yamisleydi
item.contributor: VANHOOF, Koen
item.validation: ecoom 2023
item.fullcitation: MORALES HERNANDEZ, Alejandro; NAPOLES RUIZ, Gonzalo; Jastrzebska, Agnieszka; Salgueiro, Yamisleydi & VANHOOF, Koen (2021) Online learning of windmill time series using Long Short-term Cognitive Networks. In: EXPERT SYSTEMS WITH APPLICATIONS, 205, (Art. N° 117721).
item.accessRights: Open Access
item.fulltext: With Fulltext
crisitem.journal.issn: 0957-4174
crisitem.journal.eissn: 1873-6793
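
The abstract above outlines the modelling idea: chained Short-term Cognitive Network blocks, each handling one temporal data chunk, fitted with a fast, deterministic learning rule so that online updates stay cheap. The Python sketch below is only a minimal illustration of that general scheme and is not the authors' implementation; the function names, the tanh transformation, the ridge-regularised closed-form fit, and the toy data are all assumptions introduced here.

```python
# Minimal, hypothetical sketch of block-chained online learning with a
# deterministic (closed-form) learning rule. Illustrative only; not the
# LSTCN implementation from the paper.
import numpy as np


def fit_output_weights(H, Y, ridge=1e-2):
    """Ridge-regularised least squares: solve (H^T H + ridge*I) W = H^T Y."""
    return np.linalg.solve(H.T @ H + ridge * np.eye(H.shape[1]), H.T @ Y)


def process_chunk(chunk, W_prior, ridge=1e-2):
    """Fit one block on a single temporal chunk and return its forecasts.

    W_prior plays the role of knowledge carried over from earlier chunks;
    only the output weights are recomputed, so no gradient-based training
    is needed for the update.
    """
    X, Y = chunk[:-1], chunk[1:]              # one-step-ahead pairs inside the chunk
    H = np.tanh(X @ W_prior)                  # fixed transformation using prior knowledge
    W_out = fit_output_weights(H, Y, ridge)   # deterministic fit for this block
    return H @ W_out, W_out


# Online loop: each newly arrived chunk is processed by a fresh block that
# reuses the weights produced for the previous chunk.
rng = np.random.default_rng(0)
stream = [rng.normal(size=(64, 4)) for _ in range(3)]  # toy multivariate chunks
W_prior = rng.normal(scale=0.1, size=(4, 4))
for chunk in stream:
    preds, W_prior = process_chunk(chunk, W_prior)
    print("chunk MSE:", float(np.mean((preds - chunk[1:]) ** 2)))
```

In this toy loop the weights fitted on one chunk seed the block for the next chunk, which mirrors, at a very coarse level, the chained-block and cheap-update behaviour described in the abstract.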
Appears in Collections: Research publications
Files in This Item:
File | Description | Size | Format
revised_manuscript_CLEAN.pdf | Non Peer-reviewed author version | 1.32 MB | Adobe PDF
1-s2.0-S0957417422010065-main.pdf | Published version | 1.42 MB | Adobe PDF
Web of Science™ citations: 1 (checked on Apr 24, 2024)
Page views: 42 (checked on Aug 18, 2022)
Downloads: 4 (checked on Aug 18, 2022)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.