Dynamic self-training with less uncertainty for graph imbalance learning

Authors: Juan, Xin; Peng, Meixin; Wang, Xin

Affiliations: Jilin Univ, Coll Artificial Intelligence, Changchun 130012, Peoples R China; Naval Univ Engn, Natl Key Lab Sci & Technol Vessel Integrated Power, Wuhan, Peoples R China

Publication: EXPERT SYSTEMS WITH APPLICATIONS (Expert Sys Appl)

Year/Volume: 2025, Vol. 271


Subject classification: 1201 [Management - Management Science and Engineering (degrees in Management or Engineering)]; 0808 [Engineering - Electrical Engineering]; 08 [Engineering]; 0812 [Engineering - Computer Science and Technology (degrees in Engineering or Science)]

Funding: National Natural Science Foundation of China [62372211, 62272191]; National Key Research and Development Program of China [2021ZD0112500]; International Science and Technology Cooperation Program of Jilin Province [20230402076GH, 20240402067GH]; Science and Technology Development Program of Jilin Province [20220201153GX]

Keywords: Imbalanced node classification; Bayesian graph neural networks; Pseudo-labels; Uncertainty estimation

Abstract: Graph Neural Networks (GNNs) have drawn much attention in essential graph-structured applications. Most prevailing models are founded on the assumption that the class distribution within the training set is balanced. However, data collected from real-world scenarios often exhibit long-tailed distributions. Since the loss in the training objective is dominated by majority-class nodes, and minority-class nodes have less engagement in the message-passing mechanism, GNNs' performance on imbalanced datasets is undoubtedly unsatisfactory. To alleviate this issue, this paper focuses on graph imbalance learning from the quantitative and topological perspectives, and correspondingly proposes a novel dynamic self-training with less uncertainty framework, DeLU-BGNN. Specifically, a self-training mechanism combined with a Bayesian Graph Neural Network is adopted in the DeLU-BGNN framework to assign high-confidence pseudo-labels to minority-class nodes in the unlabeled set. To address quantity imbalance, DeLU-BGNN augments the labeled set with minority-class nodes and dynamically re-balances the class distribution of the training set. For topology imbalance, topology optimization is utilized in DeLU-BGNN to facilitate the propagation of minority nodes during the message-passing process. Extensive experiments conducted on various real-world datasets demonstrate the superiority of the proposed DeLU-BGNN framework in handling the imbalanced node classification problem.
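The core mechanism the abstract describes — using uncertainty estimates from a Bayesian GNN to gate which unlabeled nodes receive pseudo-labels, while dynamically re-balancing per-class counts — can be sketched as below. This is a minimal illustration, not the authors' implementation: the function name `select_pseudo_labels`, the confidence threshold `tau`, and the use of predictive entropy over Monte-Carlo softmax samples are all assumptions standing in for the paper's Bayesian GNN posterior.

```python
import numpy as np

def select_pseudo_labels(probs_mc, labeled_counts, tau=0.9):
    """Pick high-confidence pseudo-labels for under-represented classes.

    probs_mc:       (S, N, C) array of S Monte-Carlo softmax samples for
                    N unlabeled nodes over C classes (a stand-in for
                    posterior draws from a Bayesian GNN).
    labeled_counts: length-C array of per-class counts in the labeled set.
    tau:            minimum predictive confidence to accept a pseudo-label.
    Returns a list of (node_index, class) pairs to add to the labeled set.
    """
    mean_probs = probs_mc.mean(axis=0)             # predictive mean
    preds = mean_probs.argmax(axis=1)              # hard predictions
    confidence = mean_probs.max(axis=1)
    # predictive entropy as a simple uncertainty estimate
    entropy = -(mean_probs * np.log(mean_probs + 1e-12)).sum(axis=1)

    majority = labeled_counts.max()
    selected = []
    for c in range(mean_probs.shape[1]):
        deficit = majority - labeled_counts[c]     # nodes class c lacks
        if deficit <= 0:
            continue                               # class already balanced
        # candidates: predicted as class c with confidence above tau
        cand = np.where((preds == c) & (confidence >= tau))[0]
        # prefer the least-uncertain candidates, up to the deficit
        cand = cand[np.argsort(entropy[cand])][:deficit]
        selected.extend((int(i), c) for i in cand)
    return selected
```

In a self-training loop, the returned pairs would be appended to the labeled set and the model retrained, repeating until the class distribution is balanced or no confident candidates remain.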
