Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/39031
Full metadata record
dc.contributor.author: NAPOLES RUIZ, Gonzalo
dc.contributor.author: Grau, I.
dc.contributor.author: CONCEPCION PEREZ, Leonardo
dc.contributor.author: Salgueiro, Yamisleydi
dc.date.accessioned: 2022-12-14T14:45:59Z
dc.date.available: 2022-12-14T14:45:59Z
dc.date.issued: 2021
dc.date.submitted: 2022-12-06T14:19:37Z
dc.identifier.citation: Proceedings of the 11th International Conference on Pattern Recognition Systems, Institution of Engineering and Technology, p. 25-30
dc.identifier.isbn: 978-1-83953-430-0
dc.identifier.uri: http://hdl.handle.net/1942/39031
dc.description.abstract: Long-term Cognitive Networks (LTCNs) are recurrent neural networks for modeling and simulation. Such networks can be trained in a synaptic or non-synaptic mode according to their goal. Non-synaptic learning refers to adjusting the transfer function parameters while preserving the weights connecting the neurons. In that regard, the Non-synaptic Backpropagation (NSBP) algorithm has proven successful in training LTCN-based models. Despite NSBP's success, a question worthy of investigation is whether the backpropagation process is necessary when training these recurrent neural networks. This paper investigates this issue and presents three non-synaptic learning methods that modify the original algorithm. In addition, we perform a sensitivity analysis of both the NSBP's hyperparameters and the LTCNs' learnable parameters. The main conclusions of our study are (i) the backward process attached to the NSBP algorithm is not necessary to train these recurrent neural systems, and (ii) there is a non-synaptic learnable parameter that does not contribute significantly to the LTCNs' performance.
dc.language.iso: en
dc.publisher: Institution of Engineering and Technology
dc.subject.other: NSBP algorithm
dc.subject.other: recurrent neural systems
dc.subject.other: nonsynaptic learnable parameter
dc.subject.other: synaptic mode
dc.subject.other: nonsynaptic mode
dc.subject.other: transfer function parameters
dc.subject.other: nonsynaptic backpropagation algorithm
dc.subject.other: LTCN based models
dc.subject.other: backpropagation process
dc.subject.other: recurrent neural networks
dc.subject.other: nonsynaptic learning
dc.subject.other: hyperparameters
dc.subject.other: long-term cognitive networks
dc.subject.other: learnable parameters
dc.title: On the Performance of the Nonsynaptic Backpropagation for Training Long-term Cognitive Networks
dc.type: Proceedings Paper
local.bibliographicCitation.conferencedate: 17/03/21 → 19/03/21
local.bibliographicCitation.conferencename: 11th International Conference on Pattern Recognition Systems (ICPRS 2021)
local.bibliographicCitation.conferenceplace: Curico, Chile
dc.identifier.epage: 30
dc.identifier.spage: 25
local.format.pages: 6
local.bibliographicCitation.jcat: C1
local.type.refereed: Refereed
local.type.specified: Proceedings Paper
dc.identifier.doi: 10.1049/icp.2021.1434
local.provider.type: CrossRef
local.bibliographicCitation.btitle: Proceedings of the 11th International Conference on Pattern Recognition Systems
local.uhasselt.international: yes
item.fulltext: With Fulltext
item.accessRights: Restricted Access
item.fullcitation: NAPOLES RUIZ, Gonzalo; Grau, I.; CONCEPCION PEREZ, Leonardo & Salgueiro, Yamisleydi (2021) On the Performance of the Nonsynaptic Backpropagation for Training Long-term Cognitive Networks. In: Proceedings of the 11th International Conference on Pattern Recognition Systems, Institution of Engineering and Technology, p. 25-30.
item.contributor: NAPOLES RUIZ, Gonzalo
item.contributor: Grau, I.
item.contributor: CONCEPCION PEREZ, Leonardo
item.contributor: Salgueiro, Yamisleydi
Appears in Collections: Research publications
Files in This Item:
File: On_the_Performance_of_the_Nonsynaptic_Backpropagation_for_Training_Long-term_Cognitive_Networks.pdf (Restricted Access)
Description: Published version
Size: 164.02 kB
Format: Adobe PDF
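The abstract above describes non-synaptic learning: tuning the transfer-function parameters of a recurrent network while the synaptic weights stay fixed, and without a backward pass. A minimal sketch of that idea follows; the parametric sigmoid with per-neuron slope `lam` and offset `h`, the random weight matrix `W`, and the derivative-free coordinate search are all illustrative assumptions, not the paper's NSBP algorithm or the authors' LTCN formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

n, T = 4, 5                       # neurons, reasoning steps
W = rng.uniform(-1, 1, (n, n))    # synaptic weights: fixed, never updated

# Non-synaptic learnable parameters (hypothetical parameterization):
# per-neuron slope `lam` and offset `h` of a sigmoid transfer function.
lam = np.ones(n)
h = np.zeros(n)

def forward(a, lam, h):
    """Unroll the recurrent network for T steps with fixed weights W."""
    for _ in range(T):
        a = 1.0 / (1.0 + np.exp(-lam * (a @ W.T - h)))
    return a

def mse(lam, h):
    """Error between the final activations and the desired ones."""
    return float(np.mean((forward(a0, lam, h) - target) ** 2))

a0 = rng.uniform(0, 1, n)         # initial activations
target = rng.uniform(0, 1, n)     # desired final activations

# Non-synaptic tuning without any backward pass: a greedy coordinate
# search that keeps a parameter change only when the error decreases.
err0 = mse(lam, h)
for _ in range(100):
    for p in (lam, h):
        for i in range(n):
            for step in (0.05, -0.05):
                before = mse(lam, h)
                p[i] += step
                if mse(lam, h) >= before:
                    p[i] -= step  # revert: no improvement

err = mse(lam, h)                 # err <= err0 by construction
```

The accept-or-revert rule makes the final error never worse than the initial one, which loosely mirrors the paper's finding that gradient backpropagation through time is not the only viable route to adjusting these parameters.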