Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/31139
Full metadata record
dc.contributor.author: VAN HOUDT, Greg
dc.contributor.author: Mosquera, Carlos
dc.contributor.author: NAPOLES RUIZ, Gonzalo
dc.date.accessioned: 2020-05-18T11:38:03Z
dc.date.available: 2020-05-18T11:38:03Z
dc.date.issued: 2020
dc.date.submitted: 2020-05-15T09:05:26Z
dc.identifier.citation: ARTIFICIAL INTELLIGENCE REVIEW, 53, p. 5929-5955
dc.identifier.issn: 0269-2821
dc.identifier.uri: http://hdl.handle.net/1942/31139
dc.description.abstract: Long Short-Term Memory (LSTM) has transformed both the machine learning and neurocomputing fields. According to several online sources, this model has improved Google's speech recognition, greatly improved machine translation on Google Translate, and improved the answers given by Amazon's Alexa. This neural system is also employed by Facebook, reaching over 4 billion LSTM-based translations per day as of 2017. Interestingly, recurrent neural networks had shown rather modest performance until LSTM appeared. One reason for the success of this recurrent network lies in its ability to handle the exploding/vanishing gradient problem, a difficult issue to circumvent when training recurrent or very deep neural networks. In this paper, we present a comprehensive review that covers LSTM's formulation and training, relevant applications reported in the literature, and code resources implementing this model for a toy example (a minimal sketch of the cell equations appears after this metadata record).
dc.language.iso: en
dc.publisher: Springer Nature
dc.rights: © Springer Nature B.V. 2020
dc.subject.other: Recurrent neural networks
dc.subject.other: Vanishing/exploding gradient
dc.subject.other: Long short-term memory
dc.subject.other: Deep learning
dc.title: A review on the long short-term memory model
dc.type: Journal Contribution
dc.identifier.epage: 5955
dc.identifier.spage: 5929
dc.identifier.volume: 53
local.bibliographicCitation.jcat: A1
local.publisher.place: VAN GODEWIJCKSTRAAT 30, 3311 GZ DORDRECHT, NETHERLANDS
local.type.refereed: Refereed
local.type.specified: Review
dc.identifier.doi: 10.1007/s10462-020-09838-1
dc.identifier.isi: WOS:000532798400001
dc.identifier.eissn: 1573-7462
local.provider.type: CrossRef
local.uhasselt.uhpub: yes
item.contributor: VAN HOUDT, Greg
item.contributor: Mosquera, Carlos
item.contributor: NAPOLES RUIZ, Gonzalo
item.accessRights: Open Access
item.fullcitation: VAN HOUDT, Greg; Mosquera, Carlos & NAPOLES RUIZ, Gonzalo (2020) A review on the long short-term memory model. In: ARTIFICIAL INTELLIGENCE REVIEW, 53, p. 5929-5955.
item.fulltext: With Fulltext
item.validation: ecoom 2021
crisitem.journal.issn: 0269-2821
crisitem.journal.eissn: 1573-7462
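
The abstract above credits LSTM's success to its handling of the exploding/vanishing gradient problem and mentions code resources implementing the model for a toy example. As a minimal sketch of the standard LSTM cell equations the review covers (not the paper's own code resources; the function name lstm_step, the gate packing order f, i, g, o, and all shapes are illustrative assumptions), a single forward step in NumPy could look like this:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One forward step of a standard LSTM cell (illustrative sketch).

    x_t:    input at time t, shape (input_dim,)
    h_prev: previous hidden state, shape (hidden_dim,)
    c_prev: previous cell state, shape (hidden_dim,)
    W:      stacked gate weights, shape (4 * hidden_dim, input_dim + hidden_dim)
    b:      stacked gate biases, shape (4 * hidden_dim,)
    """
    n = h_prev.shape[0]
    z = W @ np.concatenate([x_t, h_prev]) + b  # all four gate pre-activations at once
    f = sigmoid(z[:n])               # forget gate
    i = sigmoid(z[n:2 * n])          # input gate
    g = np.tanh(z[2 * n:3 * n])      # candidate cell state
    o = sigmoid(z[3 * n:])           # output gate
    c_t = f * c_prev + i * g         # additive cell-state update
    h_t = o * np.tanh(c_t)           # new hidden state
    return h_t, c_t

# Toy example: random weights, one time step.
rng = np.random.default_rng(0)
input_dim, hidden_dim = 3, 4
W = rng.standard_normal((4 * hidden_dim, input_dim + hidden_dim))
b = np.zeros(4 * hidden_dim)
h, c = np.zeros(hidden_dim), np.zeros(hidden_dim)
h, c = lstm_step(rng.standard_normal(input_dim), h, c, W, b)

The additive update c_t = f * c_prev + i * g is the mechanism the abstract alludes to: because the cell state is carried forward by gated addition rather than repeated matrix multiplication, gradients can flow across many time steps without vanishing or exploding as readily as in a plain recurrent network.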
Appears in Collections: Research publications

Files in This Item:

File: A_review_on_the_Long_short_term_memory_model.pdf
Description: Peer-reviewed author version
Size: 351 kB
Format: Adobe PDF

File: VanHoudt2020_Article_AReviewOnTheLongShort-termMemo.pdf (Restricted Access)
Description: Published version
Size: 506.22 kB
Format: Adobe PDF
Web of Science citations: 455 (checked on Apr 23, 2024)
Page view(s): 104 (checked on Sep 7, 2022)
Download(s): 1,030 (checked on Sep 7, 2022)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.