Author affiliations: Department of Computer Science and Information Engineering, National Taiwan University, Taipei 10617, Taiwan; Graduate Institute of Networking and Multimedia, National Taiwan University, Taipei 10617, Taiwan
Publication: Information (Switzerland)
Year/Volume/Issue: 2023, Vol. 14, No. 4
Pages: 234
Subject classification: 0710 [Science - Biology]; 0817 [Engineering - Chemical Engineering and Technology]; 08 [Engineering]; 0703 [Science - Chemistry]; 0835 [Engineering - Software Engineering]; 0714 [Science - Statistics (degrees in science or economics)]; 0836 [Engineering - Bioengineering]; 0701 [Science - Mathematics]; 0812 [Engineering - Computer Science and Technology (degrees in engineering or science)]
Funding: This research was funded by the Ministry of Science and Technology, Taiwan (MOST 109-2218-E-002-015 and MOST 111-2221-E-002-134-MY3). The APC was funded by MDPI.
Keywords: Distillation
Abstract: Recently, federated learning (FL) has gradually become an important research topic in machine learning and information theory. FL emphasizes that clients jointly engage in solving learning tasks. In addition to data security issues, fundamental challenges in this type of learning include imbalanced and non-IID data across clients and unreliable connections between devices due to limited communication bandwidth. These issues pose intractable problems for FL. This study starts from the uncertainty analysis of deep neural networks (DNNs) to evaluate the effectiveness of FL, and proposes a new architecture for model aggregation. The scheme improves FL's performance by applying knowledge distillation and DNN uncertainty quantification methods. A series of experiments on an image classification task confirms that the proposed model aggregation scheme can effectively mitigate the problem of non-IID data, especially when affordable transmission costs are limited. © 2023 by the authors.
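To make the aggregation idea in the abstract concrete, the following is a minimal sketch of server-side model aggregation in FL. It shows standard FedAvg (size-weighted parameter averaging) alongside a hypothetical uncertainty-weighted variant that down-weights clients whose models report higher predictive uncertainty; the function names, the inverse-uncertainty weighting rule, and the scalar uncertainty inputs are illustrative assumptions, not the authors' actual method from the paper.

```python
import numpy as np


def federated_average(client_params, client_sizes):
    """Classic FedAvg: average each parameter tensor across clients,
    weighted by the size of each client's local dataset."""
    total = float(sum(client_sizes))
    n_tensors = len(client_params[0])
    return [
        sum(p[i] * (n / total) for p, n in zip(client_params, client_sizes))
        for i in range(n_tensors)
    ]


def uncertainty_weighted_average(client_params, client_uncertainties):
    """Hypothetical variant: weight each client by the inverse of a scalar
    uncertainty estimate (e.g., mean predictive entropy on local data),
    so more confident client models contribute more to the global model."""
    conf = 1.0 / (np.asarray(client_uncertainties, dtype=float) + 1e-8)
    conf = conf / conf.sum()  # normalize weights to sum to 1
    n_tensors = len(client_params[0])
    return [
        sum(p[i] * c for p, c in zip(client_params, conf))
        for i in range(n_tensors)
    ]


if __name__ == "__main__":
    # Two clients, each holding one parameter vector of the same shape.
    clients = [[np.array([1.0, 2.0])], [np.array([3.0, 4.0])]]
    print(federated_average(clients, client_sizes=[1, 3]))
    print(uncertainty_weighted_average(clients, client_uncertainties=[0.5, 0.5]))
```

With dataset sizes [1, 3], FedAvg weights the second client three times as heavily; with equal uncertainties, the variant reduces to a plain average.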