Beyond graph neural networks with lifted relational neural networks


Authors: Sourek, Gustav; Zelezny, Filip; Kuzelka, Ondrej

Affiliation: Czech Tech Univ, Fac Elect Engn, Dept Comp Sci, Prague, Czech Republic

Published in: MACHINE LEARNING

Year/Volume/Issue: 2021, Vol. 110, No. 7

Pages: 1695-1738


Subject classification: 08 [Engineering]; 0812 [Engineering - Computer Science and Technology (degrees awardable in Engineering or Science)]

Funding: Czech Science Foundation [20-19104Y, 20-29260S]; CERIT Scientific Cloud under the programme "Projects of Large Research, Development, and Innovations Infrastructures" [LM2015085]

Keywords: Graph neural networks; Lifted relational neural networks; Symmetries; Datalog; Differentiable programming; Relational learning; Molecule classification

Abstract: We introduce a declarative differentiable programming framework, based on the language of Lifted Relational Neural Networks, where small parameterized logic programs are used to encode deep relational learning scenarios through the underlying symmetries. When presented with relational data, such as various forms of graphs, the logic program interpreter dynamically unfolds differentiable computation graphs to be used for the program parameter optimization by standard means. Following from the declarative, relational logic-based encoding, this results in a unified representation of a wide range of neural models in the form of compact and elegant learning programs, in contrast to the existing procedural approaches operating directly on the computational graph level. We illustrate how this idea can be used for a concise encoding of existing advanced neural architectures, with the main focus on Graph Neural Networks (GNNs). Importantly, using the framework, we also show how the contemporary GNN models can be easily extended towards higher expressiveness in various ways. In the experiments, we demonstrate correctness and computation efficiency through comparison against specialized GNN frameworks, while shedding some light on the learning performance of the existing GNN models.
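The GNN message passing that the abstract says can be encoded as a small logic program can be sketched procedurally; this is a hypothetical illustration (not the authors' code or the LRNN interpreter), showing the operation a Datalog-style rule such as `h1(X) :- W * h0(Y), edge(X, Y).` would unfold into: each node aggregates the linearly transformed features of its neighbours, followed by a nonlinearity.

```python
# Hypothetical sketch: one GNN propagation step over an edge relation,
# the kind of computation a single parameterized logic rule unfolds into.

def gnn_layer(features, edges, weight):
    """h1[x] = relu(sum over neighbours y of x:  weight @ h0[y])."""
    out = {}
    for x in features:
        acc = [0.0] * len(weight)          # accumulator for node x
        for (u, v) in edges:               # edge(u, v): message from v to u
            if u == x:
                for i, row in enumerate(weight):
                    acc[i] += sum(w * f for w, f in zip(row, features[v]))
        out[x] = [max(0.0, a) for a in acc]  # ReLU nonlinearity
    return out

# Tiny example graph: node 1 has neighbours 2 and 3.
features = {1: [1.0, 0.0], 2: [0.0, 1.0], 3: [1.0, 1.0]}
edges = [(1, 2), (1, 3)]
weight = [[1.0, 0.0], [0.0, 1.0]]  # identity weights, for illustration
h1 = gnn_layer(features, edges, weight)  # node 1 sums neighbours' features
```

In the declarative encoding, the graph (`edge` facts) and the rule are stated separately, and the interpreter builds this computation graph automatically for each input structure, so the same compact program covers graphs of any shape.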
