Please use this identifier to cite or link to this item: http://hdl.handle.net/1942/38990
Full metadata record
dc.contributor.author: GEERTS, Floris
dc.contributor.author: STEEGMANS, Jasper
dc.contributor.author: VAN DEN BUSSCHE, Jan
dc.date.accessioned: 2022-12-05T12:48:50Z
dc.date.available: 2022-12-05T12:48:50Z
dc.date.issued: 2022
dc.date.submitted: 2022-11-30T14:16:12Z
dc.identifier.citation: Varzinczak, Ivan (Ed.). Foundations of Information and Knowledge Systems: 12th International Symposium, FoIKS 2022, Helsinki, Finland, June 20–23, 2022, Proceedings, SPRINGER INTERNATIONAL PUBLISHING AG, p. 20–34
dc.identifier.isbn: 978-3-031-11320-8
dc.identifier.isbn: 978-3-031-11321-5
dc.identifier.issn: 0302-9743
dc.identifier.uri: http://hdl.handle.net/1942/38990
dc.description.abstract: We investigate the power of message-passing neural networks (MPNNs) in their capacity to transform the numerical features stored in the nodes of their input graphs. Our focus is on global expressive power, uniformly over all input graphs, or over graphs of bounded degree with features from a bounded domain. Accordingly, we introduce the notion of a global feature map transformer (GFMT). As a yardstick for expressiveness, we use a basic language for GFMTs, which we call MPLang. Every MPNN can be expressed in MPLang, and our results clarify to which extent the converse inclusion holds. We consider exact versus approximate expressiveness; the use of arbitrary activation functions; and the case where only the ReLU activation function is allowed.
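As context for the abstract, a minimal sketch of one message-passing round transforming a node feature map (illustrative only: the sum aggregation, linear combination, and weights here are assumptions for exposition, not the paper's definitions of MPLang or GFMTs; only the ReLU activation matches a case the abstract names):

```python
# Illustrative one-round message passing on a small graph.
# The aggregation (sum over neighbours) and update (ReLU of a linear
# combination) are assumptions for illustration, not the paper's
# constructions.

def mpnn_round(adj, features, w_self=1.0, w_msg=1.0):
    """One synchronous message-passing round.

    adj: dict mapping node -> list of neighbour nodes
    features: dict mapping node -> numeric feature
    Returns the new feature map (one step of a global feature
    map transformation).
    """
    new_features = {}
    for v in adj:
        msg = sum(features[u] for u in adj[v])  # aggregate neighbour features
        # ReLU update combining the node's own feature with the messages
        new_features[v] = max(0.0, w_self * features[v] + w_msg * msg)
    return new_features

# Example: a path graph 0 - 1 - 2 with unit features;
# the middle node receives two messages.
adj = {0: [1], 1: [0, 2], 2: [1]}
x = {0: 1.0, 1: 1.0, 2: 1.0}
print(mpnn_round(adj, x))  # -> {0: 2.0, 1: 3.0, 2: 2.0}
```

Because the same weights are applied at every node, the transformation is uniform over all input graphs, which is the sense of "global" expressiveness the abstract studies.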
dc.language.iso: en
dc.publisher: SPRINGER INTERNATIONAL PUBLISHING AG
dc.relation.ispartofseries: Lecture Notes in Computer Science
dc.subject: Computer Science - Artificial Intelligence
dc.subject: Computer Science - Learning
dc.subject.other: Closure under concatenation
dc.subject.other: Semiring provenance semantics for modal logic
dc.subject.other: Query languages for numerical data
dc.title: On the Expressive Power of Message-Passing Neural Networks as Global Feature Map Transformers
dc.type: Proceedings Paper
local.bibliographicCitation.authors: Varzinczak, Ivan
local.bibliographicCitation.conferencedate: June 20–23, 2022
local.bibliographicCitation.conferencename: Foundations of Information and Knowledge Systems: 12th International Symposium, FoIKS 2022
local.bibliographicCitation.conferenceplace: Helsinki, Finland
dc.identifier.epage: 34
dc.identifier.spage: 20
local.bibliographicCitation.jcat: C1
local.publisher.place: GEWERBESTRASSE 11, CHAM, CH-6330, SWITZERLAND
local.type.refereed: Refereed
local.type.specified: Proceedings Paper
dc.identifier.doi: 10.1007/978-3-031-11321-5_2
dc.identifier.arxiv: 2203.09555
dc.identifier.isi: WOS:000883026400002
dc.identifier.eissn: 1611-3349
local.provider.type: Web of Science
local.bibliographicCitation.btitle: Foundations of Information and Knowledge Systems: 12th International Symposium, FoIKS 2022, Helsinki, Finland, June 20–23, 2022, Proceedings
local.uhasselt.international: no
item.validation: ecoom 2023
item.accessRights: Open Access
item.fullcitation: GEERTS, Floris; STEEGMANS, Jasper & VAN DEN BUSSCHE, Jan (2022) On the Expressive Power of Message-Passing Neural Networks as Global Feature Map Transformers. In: Varzinczak, Ivan (Ed.). Foundations of Information and Knowledge Systems: 12th International Symposium, FoIKS 2022, Helsinki, Finland, June 20–23, 2022, Proceedings, SPRINGER INTERNATIONAL PUBLISHING AG, p. 20–34.
item.fulltext: With Fulltext
item.contributor: GEERTS, Floris
item.contributor: STEEGMANS, Jasper
item.contributor: VAN DEN BUSSCHE, Jan
Appears in Collections: Research publications
Files in This Item:
Pages from 978-3-031-11321-5.pdf: Published version, 272.06 kB, Adobe PDF, Restricted Access (Request a copy)
helsinki.pdf: Peer-reviewed author version, 273.91 kB, Adobe PDF, View/Open
Web of Science citations: 2 (checked on May 10, 2024)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.