ShanghaiTech University Knowledge Management System
Wyner-Ziv Gradient Compression for Federated Learning
2021-11-16
Status | Published
Abstract | Due to limited communication resources at the clients and the massive number of model parameters, large-scale distributed learning tasks suffer from a communication bottleneck. Gradient compression is an effective method to reduce the communication load by transmitting compressed gradients. Motivated by the fact that, in stochastic gradient descent, gradients between adjacent rounds may be highly correlated since they aim to learn the same model, this paper proposes a practical gradient compression scheme for federated learning that uses historical gradients to compress current gradients and is based on Wyner-Ziv coding without any probabilistic assumptions. We also implement our gradient quantization method on a real dataset, and its performance is better than that of previous schemes.
Keywords | federated learning; side information; gradient compression
arXiv ID | arXiv:2111.08277
Related URL | View Full Text
Source | arXiv
WOS Record Number | PPRN:11958153
WOS Category | Computer Science, Artificial Intelligence
Funding Project | National Natural Science Foundation of China (NSFC) under Grant 61901267
Document Type | Preprint
Item Identifier | https://kms.shanghaitech.edu.cn/handle/2MSLDSTB/348424
Collections | School of Information Science and Technology_PhD Students; School of Information Science and Technology_PI Research Groups_Youlong Wu Group; School of Information Science and Technology_Master's Students
Author Affiliations | 1. ShanghaiTech Univ, Shanghai, Peoples R China; 2. Chinese Acad Sci, Shanghai Inst Microsyst & Informat Technol, Shanghai, Peoples R China; 3. Univ Chinese Acad Sci, Beijing, Peoples R China
Recommended Citation (GB/T 7714) | Liang, Kai, Zhong, Huiru, Chen, Haoning, et al. Wyner-Ziv Gradient Compression for Federated Learning. 2021.
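
The record carries no technical detail beyond the abstract, but the abstract's central idea, treating the previous round's gradient as side information available to the decoder in the spirit of Wyner-Ziv coding, can be illustrated with a short sketch. The Python snippet below is a hypothetical illustration, not the authors' scheme: the uniform scalar quantizer with modulo binning, the step size, the bin count, and every function name are assumptions made for this example.

import numpy as np

# Illustrative parameters; not values from the paper.
STEP = 0.05      # uniform quantizer step size
NUM_BINS = 16    # bin labels per coordinate, i.e. log2(16) = 4 bits sent

def encode(grad):
    # Encoder (client): quantize each coordinate, then transmit only the
    # quantizer index modulo NUM_BINS. As in Wyner-Ziv coding, the
    # encoder never accesses the side information.
    idx = np.round(grad / STEP).astype(np.int64)
    return idx % NUM_BINS

def decode(bins, side_info):
    # Decoder (server): among all quantizer indices consistent with the
    # received bin label, pick the one closest to the side information
    # (the historical gradient already held at the server).
    base = np.round(side_info / STEP).astype(np.int64)
    offset = (bins - base) % NUM_BINS
    offset = np.where(offset > NUM_BINS // 2, offset - NUM_BINS, offset)
    return (base + offset) * STEP

# Toy round: adjacent-round gradients are highly correlated, so the
# 4-bit bin label alone lets the server recover the full-range gradient.
rng = np.random.default_rng(0)
side_info = rng.normal(size=1000)                # historical gradient
grad = side_info + 0.05 * rng.normal(size=1000)  # correlated current gradient

recon = decode(encode(grad), side_info)
print("max abs error:", np.max(np.abs(recon - grad)))  # <= STEP/2 on success

The modulo binning is what makes this Wyner-Ziv-like: the client sends log2(NUM_BINS) bits per coordinate regardless of the gradient's dynamic range, and decoding succeeds as long as the current and historical quantizer indices differ by fewer than NUM_BINS/2, which the adjacent-round correlation noted in the abstract is meant to ensure.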