Please use this identifier to cite or link to this item:
http://hdl.handle.net/1942/29330
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | NAPOLES RUIZ, Gonzalo | - |
dc.contributor.author | VAN HOUDT, Greg | - |
dc.date.accessioned | 2019-09-17T08:27:31Z | - |
dc.date.available | 2019-09-17T08:27:31Z | - |
dc.date.issued | 2019 | - |
dc.identifier.uri | http://hdl.handle.net/1942/29330 | - |
dc.description.abstract | Long Short-Term Memory (LSTM) has transformed both the machine learning and neurocomputing fields. According to the webpage of one of LSTM's fathers, Prof. Jürgen Schmidhuber, this model improved speech recognition on over 2 billion Android phones, greatly improved machine translation through Google Translate, and improved the answers given by Amazon's Alexa. Interestingly, recurrent neural networks had shown rather modest performance until LSTM showed up. One reason for the success of this recurrent network lies in its ability to handle the exploding and vanishing gradient problem, which stands as a difficult issue to circumvent when training recurrent or very deep neural networks. In this paper, we present a comprehensible review that covers both theory and practice. To start, LSTM's formulation and training are briefly described. However, as this theory was recently reviewed in the literature, the second part is the most elaborate, covering relevant applications reported in the literature. From this study, we learned that LSTM is a very suitable model to tackle, among others, time series prediction, text recognition and natural language processing. The applications also showed how LSTM can work together with other models to create hybrid deep learning architectures. This is done to increase performance, as vanilla LSTM is not always the ideal answer. Finally, we conclude with code resources implementing the neural system for a toy example, showing how easy it is to run the model on a home computer. | - |
dc.format.mimetype | application/pdf | - |
dc.language | nl | - |
dc.publisher | UHasselt | - |
dc.title | Deep Learning: the power behind the Long Short-term Memory model | - |
dc.type | Theses and Dissertations | - |
local.format.pages | 0 | - |
local.bibliographicCitation.jcat | T2 | - |
dc.description.notes | Master of Applied Economic Sciences: Business Engineering in Business Informatics | - |
local.type.specified | Master thesis | - |
item.fulltext | With Fulltext | - |
item.accessRights | Open Access | - |
item.contributor | VAN HOUDT, Greg | - |
item.fullcitation | VAN HOUDT, Greg (2019) Deep Learning: the power behind the Long Short-term Memory model. | - |
Appears in Collections: Master theses
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
ccb6397d-f73f-4512-8b9a-a4113ed8a209.pdf | | 627.32 kB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
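The abstract closes by mentioning code resources that run LSTM on a toy example. The thesis's own code is not part of this record, but a minimal NumPy sketch of a single LSTM cell's forward pass illustrates the gate structure the abstract alludes to; all variable names, sizes, and initializations here are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM forward step: four gates computed from input x and previous state."""
    z = W @ x + U @ h_prev + b           # pre-activations for all four gates, stacked
    H = h_prev.shape[0]
    i = 1.0 / (1.0 + np.exp(-z[0:H]))       # input gate
    f = 1.0 / (1.0 + np.exp(-z[H:2 * H]))   # forget gate
    o = 1.0 / (1.0 + np.exp(-z[2 * H:3 * H]))  # output gate
    g = np.tanh(z[3 * H:4 * H])             # candidate cell update
    c = f * c_prev + i * g               # additive cell update eases gradient flow
    h = o * np.tanh(c)                   # new hidden state
    return h, c

rng = np.random.default_rng(0)
D, H = 3, 4                              # toy input and hidden sizes
W = rng.standard_normal((4 * H, D)) * 0.1
U = rng.standard_normal((4 * H, H)) * 0.1
b = np.zeros(4 * H)

h, c = np.zeros(H), np.zeros(H)
for t in range(5):                       # run a short random input sequence
    h, c = lstm_step(rng.standard_normal(D), h, c, W, U, b)
print(h.shape)  # (4,)
```

Because the hidden state is an output gate times a tanh of the cell state, every entry of `h` stays in (-1, 1); the cell state `c`, updated additively, is what carries information across long sequences.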