Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/34695
Full metadata record
dc.contributor.author: NAPOLES RUIZ, Gonzalo
dc.contributor.author: BELLO GARCIA, Marilyn
dc.contributor.author: Salgueiro, Yamisleydi
dc.date.accessioned: 2021-08-20T13:08:44Z
dc.date.available: 2021-08-20T13:08:44Z
dc.date.issued: 2021
dc.date.submitted: 2021-08-17T23:56:11Z
dc.identifier.citation: Neural networks (Print), 140, p. 39-48
dc.identifier.issn: 0893-6080
dc.identifier.uri: http://hdl.handle.net/1942/34695
dc.description.abstract: This paper presents a neural system for multi-label classification problems that may involve sparse features. The model's architecture comprises three sequential blocks with well-defined functions. The first block is a multilayered feed-forward structure that extracts hidden features, thus reducing the problem's dimensionality; this block is useful when dealing with sparse problems. The second block is a Long-term Cognitive Network-based model that operates on the features extracted by the first block. The activation rule of this recurrent neural network is modified to prevent the input signal from vanishing during the recurrent inference process: the modified rule combines the neurons' state in the previous abstract layer (iteration) with their initial state. Moreover, we add a bias component to shift the transfer functions as needed to obtain good approximations. Finally, the third block is an output layer that adapts the second block's outputs to the label space. To train this network, we propose a backpropagation learning algorithm that uses a squared hinge loss function to maximize the margins between labels. The results show that our model outperforms state-of-the-art algorithms on most datasets.
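The abstract's two key ingredients, a recurrent activation rule that mixes in the initial state, and a squared hinge loss over label margins, can be sketched as follows. This is an illustrative NumPy sketch based only on the abstract's wording, not the paper's exact formulation: the weight matrices `W` and `W0`, the `tanh` transfer function, and the {-1, +1} label encoding are all assumptions.

```python
import numpy as np

def ltcn_step(a_prev, a0, W, W0, b):
    """One recurrent iteration (abstract layer) of the second block.

    Combines the previous layer's state a_prev with the initial state a0
    and a bias b, so the input signal does not vanish across iterations.
    The tanh transfer function and the linear mixing are assumptions.
    """
    return np.tanh(a_prev @ W + a0 @ W0 + b)

def squared_hinge(y_true, y_pred):
    """Squared hinge loss averaged over labels, with y_true in {-1, +1}.

    Predictions whose margin y_true * y_pred is at least 1 incur no loss,
    which is how the loss pushes the margins between labels apart.
    """
    return np.mean(np.maximum(0.0, 1.0 - y_true * y_pred) ** 2)

# Example: every label meets the unit margin, so the loss is zero.
y = np.array([1.0, -1.0, 1.0])
pred = np.array([2.0, -1.5, 1.0])
loss = squared_hinge(y, pred)  # 0.0
```

A prediction of 0 for a true label of +1 leaves a margin deficit of 1 and contributes 1^2 to the loss, so the squared variant penalizes large violations more sharply than the plain hinge.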
dc.description.sponsorship: The authors would like to sincerely thank Isel Grau from the Vrije Universiteit Brussel, Belgium, who pointed out the advantages of using the squared hinge function instead of the mean squared error. This paper was partially supported by the Program CONICYT FONDECYT de Postdoctorado, Chile through the project 3200284.
dc.language.iso: en
dc.publisher: PERGAMON-ELSEVIER SCIENCE LTD
dc.rights: 2021 The Author(s). Published by Elsevier Ltd. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
dc.subject.other: Long-term cognitive networks
dc.subject.other: Recurrent neural networks
dc.subject.other: Backpropagation
dc.subject.other: Multi-label classification
dc.title: Long-term Cognitive Network-based architecture for multi-label classification
dc.type: Journal Contribution
dc.identifier.epage: 48
dc.identifier.spage: 39
dc.identifier.volume: 140
local.bibliographicCitation.jcat: A1
local.publisher.place: THE BOULEVARD, LANGFORD LANE, KIDLINGTON, OXFORD OX5 1GB, ENGLAND
local.type.refereed: Refereed
local.type.specified: Article
dc.identifier.doi: 10.1016/j.neunet.2021.03.001
dc.identifier.isi: WOS:000652749900004
dc.identifier.eissn: 1879-2782
local.provider.type: Pdf
local.uhasselt.uhpub: yes
local.uhasselt.international: yes
item.validation: ecoom 2022
item.contributor: NAPOLES RUIZ, Gonzalo
item.contributor: BELLO GARCIA, Marilyn
item.contributor: Salgueiro, Yamisleydi
item.fulltext: With Fulltext
item.accessRights: Open Access
item.fullcitation: NAPOLES RUIZ, Gonzalo; BELLO GARCIA, Marilyn & Salgueiro, Yamisleydi (2021) Long-term Cognitive Network-based architecture for multi-label classification. In: Neural networks (Print), 140, p. 39-48.
crisitem.journal.issn: 0893-6080
crisitem.journal.eissn: 1879-2782
Appears in Collections: Research publications

Files in This Item:
File | Description | Size | Format
1-s2.0-S0893608021000812-main.pdf | Published version | 881.68 kB | Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.