Distributed differentially-private learning with communication efficiency

Authors: Phuong, Tran Thi; Phong, Le Trieu

Affiliations: Meiji Univ, Kanagawa 214-8571, Japan; Natl Inst Informat & Commun Technol (NICT), Tokyo 184-8795, Japan

Published in: JOURNAL OF SYSTEMS ARCHITECTURE

Year/Volume: 2022, Vol. 128

Subject classification: 08 [Engineering]; 0835 [Engineering - Software Engineering]; 0812 [Engineering - Computer Science and Technology (Engineering or Science degrees)]

Funding: JST CREST, Japan [JPMJCR21M1]

Keywords: Data privacy in IoT; Distributed stochastic gradient descent; Communication efficiency

Abstract: In this paper, we propose a new algorithm for learning over distributed data, such as in the IoT environment, in a privacy-preserving way. Our algorithm is a differentially private variant of the distributed synchronous stochastic gradient descent method with multiple workers and one parameter server, and has the following two features: (1) each distributed worker only needs to send as few as O(1) gradients in each iteration, so that the communication from worker to server is modest; (2) the dataset of each worker is protected quantitatively by differential privacy. We mathematically prove the convergence of our algorithm, and experimentally verify that it converges and reaches standard testing results on a benchmark dataset while simultaneously maintaining reasonable privacy budgets. Our results address two equally important issues in the IoT environment, communication efficiency and differential privacy, and potentially help reduce the tension between them.
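The two ingredients the abstract names, per-worker differential privacy via clipped and noised gradients, and O(1) worker-to-server communication via sparse messages, can be illustrated with the following minimal sketch. This is an assumption-laden illustration, not the paper's actual algorithm: the function names, the top-k selection rule, and all parameter values (`clip_norm`, `sigma`, `k`) are hypothetical.

```python
import numpy as np

def worker_update(grad, clip_norm=1.0, sigma=1.0, k=1, rng=None):
    """Illustrative worker step: clip the gradient to bound sensitivity,
    add Gaussian noise for differential privacy, then send only the k
    largest-magnitude coordinates (O(1) communication for constant k)."""
    rng = rng or np.random.default_rng(0)
    # Clipping bounds each worker's contribution (controls DP sensitivity).
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / max(norm, 1e-12))
    # Gaussian noise scaled to the clip norm; the DP guarantee depends on
    # sigma and the number of iterations (accounting omitted here).
    noisy = clipped + rng.normal(0.0, sigma * clip_norm, size=grad.shape)
    # Keep only the k largest-magnitude coordinates -> sparse message.
    idx = np.argsort(np.abs(noisy))[-k:]
    return idx, noisy[idx]

def server_aggregate(dim, messages):
    """Average the sparse worker messages into one dense update."""
    agg = np.zeros(dim)
    for idx, vals in messages:
        agg[idx] += vals
    return agg / len(messages)
```

With `sigma=0` the sketch reduces to plain top-k gradient sparsification; the privacy guarantee comes entirely from the noise term, whose calibration in the actual paper follows a formal differential-privacy analysis.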
