Self-attention Dual Embedding for Graphs with Heterophily
2023-05-28
Status: Published
Abstract

Graph Neural Networks (GNNs) have been highly successful for the node classification task. GNNs typically assume graphs are homophilic, i.e. neighboring nodes are likely to belong to the same class. However, a number of real-world graphs are heterophilic, and this leads to much lower classification accuracy using standard GNNs. In this work, we design a novel GNN which is effective for both heterophilic and homophilic graphs. Our work is based on three main observations. First, we show that node features and graph topology provide different amounts of informativeness in different graphs, and therefore they should be encoded independently and prioritized in an adaptive manner. Second, we show that allowing negative attention weights when propagating graph topology information improves accuracy. Finally, we show that asymmetric attention weights between nodes are helpful. We design a GNN which makes use of these observations through a novel self-attention mechanism. We evaluate our algorithm on real-world graphs containing thousands to millions of nodes and show that we achieve state-of-the-art results compared to existing GNNs. We also analyze the effectiveness of the main components of our design on different graphs.
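The abstract's second and third observations, that attention weights should be allowed to go negative and need not be symmetric between a pair of nodes, can be illustrated with a minimal NumPy sketch. This is not the paper's actual architecture; the projection matrices `Wq` and `Wk` and the `tanh` scoring are illustrative assumptions. Using separate query and key projections makes the score for edge (i, j) differ from the score for (j, i), and replacing softmax with `tanh` permits negative weights so that dissimilar neighbors can contribute with a repelling sign, which is useful on heterophilic graphs.

```python
import numpy as np

def signed_attention(H, Wq, Wk, adj):
    """Propagate node features with attention weights that may be
    negative and asymmetric (weight(i, j) != weight(j, i)).

    H:   (n, d) node feature matrix
    Wq:  (d, d) query projection (hypothetical parameter)
    Wk:  (d, d) key projection (hypothetical parameter)
    adj: (n, n) binary adjacency matrix
    """
    Q = H @ Wq           # per-node queries
    K = H @ Wk           # per-node keys
    scores = Q @ K.T     # asymmetric in general, since Wq != Wk
    # tanh bounds weights in [-1, 1] but, unlike softmax, allows
    # negative values; masking by adj restricts to graph edges.
    weights = np.tanh(scores) * adj
    return weights @ H   # aggregate neighbor features with signed weights

# Small usage example on a random 4-node graph.
rng = np.random.default_rng(0)
n, d = 4, 3
H = rng.normal(size=(n, d))
Wq = rng.normal(size=(d, d))
Wk = rng.normal(size=(d, d))
adj = np.array([[0, 1, 1, 0],
                [1, 0, 0, 1],
                [1, 0, 0, 1],
                [0, 1, 1, 0]], dtype=float)
out = signed_attention(H, Wq, Wk, adj)
```

The sketch omits the paper's dual (independent) encoding of features and topology; it only shows why signed, asymmetric edge weights are structurally different from the softmax-normalized, similarity-based weights of a standard graph attention layer.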

DOI: arXiv:2305.18385
Related URL: View full text
Source: arXiv
WOS Record No.: PPRN:72763947
WOS Categories: Computer Science, Artificial Intelligence; Computer Science, Information Systems
Document Type: Preprint
Item Identifier: https://kms.shanghaitech.edu.cn/handle/2MSLDSTB/348075
Collections: School of Information Science and Technology
School of Information Science and Technology_PI Research Group_Fan Rui Group
School of Information Science and Technology_Master's Students
Author Affiliation
Shanghai Tech Univ, Sch Informat Sci & Technol, Shanghai, Peoples R China
Recommended Citation (GB/T 7714)
Lai, Yurui,Zhang, Taiyan,Fan, Rui. Self-attention Dual Embedding for Graphs with Heterophily. 2023.

Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.