Learning Compact Neural Networks via Generalized Structured Sparsity
2024-10
Proceedings: 27TH EUROPEAN CONFERENCE ON ARTIFICIAL INTELLIGENCE
Publication Status: Published
Abstract

Deep neural networks have shown excellent performance in various domains, but their large number of parameters and computational inefficiency pose significant challenges in practice. Existing sparse learning methods, such as pruning and regularization, play a crucial role in reducing model size and improving generalization. However, they are limited to a single-level grouping structure and ignore the correlation between consecutive layers, leading to insufficient sparsity and performance degradation. To address these challenges, we propose a novel sparsity regularizer that promotes structured sparsity based on a multi-level grouping structure. It encourages inter-group cooperation and intra-group competition at the first level, and promotes inter-group competition and intra-group cooperation at the second level. The multi-level grouping nature can flexibly model the correlation between consecutive layers or convolutional kernels by carefully defining the groups based on specific neural architectures. Moreover, we introduce a more general form that unifies a family of convex and non-convex sparse regularizers and prove its equivalence to multiplicative weight decomposition, which helps us develop a simple but efficient optimization algorithm. Extensive experiments on real-world datasets show that the proposed method can generate more compact and efficient models compared to cutting-edge methods.
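For intuition only, a two-level structured-sparsity penalty of the kind described in the abstract can be assembled from standard building blocks: an exclusive-lasso-style ℓ1 term inside first-level groups (intra-group competition) wrapped in a group-lasso-style ℓ2 aggregation over second-level groups (inter-group competition, intra-group cooperation). The form below is an illustrative sketch built from these standard norms; the symbols \mathcal{G}_1, \mathcal{G}_2, w_g, and \lambda are introduced here for the sketch and it is not the paper's exact regularizer:

\[
\Omega(W) \;=\; \lambda \sum_{G \in \mathcal{G}_2} \Bigl( \sum_{\substack{g \in \mathcal{G}_1 \\ g \subseteq G}} \lVert w_g \rVert_1^{2} \Bigr)^{1/2}
\]

Under such a penalty, weights inside a first-level group g compete through the inner ℓ1 norm, while second-level groups G are kept or zeroed out as a whole by the outer ℓ2-style aggregation. The multiplicative weight decomposition mentioned in the abstract is, in its standard form, a reparameterization such as w = u \odot v that turns a non-smooth penalty of this type into a smooth objective amenable to plain gradient-based optimization; this too is stated as the generic technique, not the paper's specific algorithm.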

Language: English
Document Type: Conference Paper
Identifier: https://kms.shanghaitech.edu.cn/handle/2MSLDSTB/493622
Collection: School of Information Science and Technology_Master's Students
Author Affiliation:
ShanghaiTech University
Recommended Citation
GB/T 7714
Bian K, Sun L, Zhao DJ. Learning Compact Neural Networks via Generalized Structured Sparsity[C], 2024.