FedTP: Federated Learning by Transformer Personalization
Year: 2023
Journal: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (IF: 10.2 [JCR-2023], 10.4 [5-Year])
ISSN: 2162-237X
EISSN: 2162-2388
Volume: PP; Issue: 99; Pages: 1-15
Publication Status: Published
DOI: 10.1109/TNNLS.2023.3269062
Abstract

Federated learning is an emerging learning paradigm where multiple clients collaboratively train a machine learning model in a privacy-preserving manner. Personalized federated learning extends this paradigm to overcome heterogeneity across clients by learning personalized models. Recently, there have been some initial attempts to apply transformers to federated learning. However, the impacts of federated learning algorithms on self-attention have not yet been studied. In this article, we investigate this relationship and reveal that federated averaging (FedAvg) algorithms actually have a negative impact on self-attention in cases of data heterogeneity, which limits the capabilities of the transformer model in federated learning settings. To address this issue, we propose FedTP, a novel transformer-based federated learning framework that learns personalized self-attention for each client while aggregating the other parameters among the clients. Instead of using a vanilla personalization mechanism that maintains personalized self-attention layers of each client locally, we develop a learn-to-personalize mechanism to further encourage cooperation among clients and to increase the scalability and generalization of FedTP. Specifically, we achieve this by learning a hypernetwork on the server that outputs the personalized projection matrices of self-attention layers to generate clientwise queries, keys, and values. Furthermore, we present the generalization bound for FedTP with the learn-to-personalize mechanism. Extensive experiments verify that FedTP with the learn-to-personalize mechanism yields state-of-the-art performance in non-IID scenarios. Our code is available online at https://github.com/zhyczy/FedTP.
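To make the hypernetwork-based learn-to-personalize mechanism concrete, below is a minimal PyTorch sketch of the idea described in the abstract: a server-side hypernetwork maps a learnable per-client embedding to the projection matrices of a self-attention layer, so each client receives personalized queries, keys, and values while the remaining transformer parameters are shared. All names and dimensions here (ClientHypernet, client_dim, attn_dim, the two-layer MLP) are illustrative assumptions, not the authors' implementation; the official code is available at https://github.com/zhyczy/FedTP.

import torch
import torch.nn as nn

class ClientHypernet(nn.Module):
    """Server-side hypernetwork: client embedding -> personalized W_q, W_k, W_v."""
    def __init__(self, n_clients, client_dim, hidden_dim, attn_dim):
        super().__init__()
        # One learnable embedding per client, kept on the server.
        self.client_embeddings = nn.Embedding(n_clients, client_dim)
        self.mlp = nn.Sequential(
            nn.Linear(client_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        # Heads that emit flattened (attn_dim x attn_dim) projection matrices.
        self.to_q = nn.Linear(hidden_dim, attn_dim * attn_dim)
        self.to_k = nn.Linear(hidden_dim, attn_dim * attn_dim)
        self.to_v = nn.Linear(hidden_dim, attn_dim * attn_dim)
        self.attn_dim = attn_dim

    def forward(self, client_id):
        h = self.mlp(self.client_embeddings(client_id))
        d = self.attn_dim
        # Reshape the flat outputs into square projection matrices.
        return (self.to_q(h).view(d, d),
                self.to_k(h).view(d, d),
                self.to_v(h).view(d, d))

# Usage sketch: the server generates client-specific attention projections,
# sends them with the shared transformer weights to the selected client, and
# later updates the hypernetwork from the parameter changes the client returns.
hnet = ClientHypernet(n_clients=100, client_dim=32, hidden_dim=128, attn_dim=64)
w_q, w_k, w_v = hnet(torch.tensor(3))   # personalized matrices for client 3
print(w_q.shape, w_k.shape, w_v.shape)  # each torch.Size([64, 64])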

Keywords: Data privacy; Job analysis; Learning algorithms; Learning systems; Machine learning; Federated learning; Hypernetwork; Learn+; Learn-to-personalize; Personalizations; Personalized federated learning; Self-attention; Task analysis; Transformer
URL: View original text
Indexed By: SCI; EI
Language: English
Funding Project: Shanghai Sailing Program[
WOS Research Areas: Computer Science; Engineering
WOS Categories: Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods; Engineering, Electrical & Electronic
WOS Accession Number: WOS:001005747100001
Publisher: Institute of Electrical and Electronics Engineers Inc.
EI Accession Number: 20232314197561
EI Keywords: Scalability
EI Classification Codes: 723.4 Artificial Intelligence; 723.4.2 Machine Learning; 961 Systems Science
Original Document Type: Article in Press
Source Database: IEEE
Citation Statistics
Times Cited: 30 [WOS]
Document Type: Journal article
Identifier: https://kms.shanghaitech.edu.cn/handle/2MSLDSTB/312341
Collection: School of Information Science and Technology
School of Information Science and Technology_Master's Students
School of Information Science and Technology_PI Research Group_Wang Jingya Group
School of Information Science and Technology_PI Research Group_Shi Ye Group
Co-first Author: Cai, Zhongyi
Corresponding Author: Shi, Ye
Author Affiliations:
1.ShanghaiTech Univ, Sch Informat Sci & Technol, Shanghai 201210, Peoples R China
2.Nantong Univ, Sch Informat Sci & Technol, Nantong 226019, Peoples R China
3.Univ Technol Sydney, Sch Comp Sci, Broadway, NSW 2007, Australia
First Author Affiliation: School of Information Science and Technology
Corresponding Author Affiliation: School of Information Science and Technology
First Author's First Affiliation: School of Information Science and Technology
Recommended Citation:
GB/T 7714
Li, Hongxia, Cai, Zhongyi, Wang, Jingya, et al. FedTP: Federated Learning by Transformer Personalization[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, PP(99): 1-15.
APA Li, Hongxia, Cai, Zhongyi, Wang, Jingya, Tang, Jiangnan, Ding, Weiping, ... & Shi, Ye. (2023). FedTP: Federated Learning by Transformer Personalization. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, PP(99), 1-15.
MLA Li, Hongxia, et al. "FedTP: Federated Learning by Transformer Personalization". IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS PP.99 (2023): 1-15.
Files in This Item:
File Name: 10.1109@TNNLS.2023.3269062.pdf
Format: Adobe PDF
