Dirichlet Process Mixture of Generalized Inverted Dirichlet Distributions for Positive Vector Data With Extended Variational Inference
2022-11-01
Journal: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (IF: 10.2 [JCR-2023], 10.4 [5-year])
ISSN: 2162-2388
Volume: 33, Issue: 11
Publication status: Published
DOI: 10.1109/TNNLS.2021.3072209
Abstract: A Bayesian nonparametric approach for estimating a Dirichlet process (DP) mixture of generalized inverted Dirichlet distributions [i.e., an infinite generalized inverted Dirichlet mixture model (InGIDMM)] is proposed. The generalized inverted Dirichlet distribution has been shown to be efficient for modeling vectors that contain only positive elements. Under the classical variational inference (VI) framework, the key challenge in the Bayesian estimation of InGIDMM is that the expectation of the joint distribution of data and variables cannot be calculated explicitly; numerical methods are therefore usually applied to simulate the optimal posterior distributions. With the recently proposed extended VI (EVI) framework, we introduce lower bound approximations to the original variational objective function in the VI framework such that an analytically tractable solution can be derived, thereby avoiding numerical simulation. By applying the DP mixture technique, an InGIDMM can automatically determine the number of mixture components from the observed data. Moreover, the DP mixture model with an infinite number of mixture components also avoids the problems of underfitting and overfitting. The performance of the proposed approach is demonstrated on both synthesized data and real-life data applications.
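The abstract's central idea, that a DP mixture with a large (notionally infinite) truncation level infers the number of active components from the data itself, can be illustrated with a small sketch. This is a hypothetical example, not the paper's InGIDMM/EVI implementation: it uses scikit-learn's `BayesianGaussianMixture` with a stick-breaking (Dirichlet-process) prior and standard variational inference on synthetic positive vectors; the data, truncation level, and weight threshold are all illustrative choices.

```python
# Sketch (assumed setup, NOT the paper's InGIDMM/EVI code): a truncated
# Dirichlet-process mixture fitted by variational inference, showing how
# the stick-breaking prior shrinks the weights of unneeded components so
# the effective number of clusters is inferred from the data.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Two well-separated clusters of strictly positive 2-D vectors
# (lognormal draws stand in for generic positive vector data).
X = np.vstack([
    rng.lognormal(mean=0.0, sigma=0.2, size=(200, 2)),
    rng.lognormal(mean=2.0, sigma=0.2, size=(200, 2)),
])

# Truncation level 10 far exceeds the true number of clusters; the
# DP prior concentrates mass on the components the data supports.
dpgmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    weight_concentration_prior=0.1,
    max_iter=500,
    random_state=0,
).fit(X)

# Components with non-negligible posterior weight are the "active" ones.
active = int(np.sum(dpgmm.weights_ > 0.05))
print(f"active components: {active}")
```

With well-separated data like this, the fitted weights typically concentrate on about two components while the remaining ones collapse toward zero, which is the automatic model-selection behavior the abstract attributes to the DP mixture.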
Indexed by: SCI; SCIE; EI
Source database: IEEE
Document type: Journal article
Record identifier: https://kms.shanghaitech.edu.cn/handle/2MSLDSTB/135745
Collections: School of Information Science and Technology; School of Information Science and Technology_PI Research Group_Jingyi Yu Group
Author affiliations:
1.Pattern Recognition and Intelligent System Laboratory, School of Artificial Intelligence, Beijing University of Posts and Telecommunications, Beijing, China
2.School of Cyberspace Security, Beijing University of Posts and Telecommunications, Beijing, China
3.School of Mathematics and Statistics, Xi’an Jiaotong University, Xi’an, China
4.Faculty of Information Technology, Macau University of Science and Technology, Macau, China
5.School of Engineering and Computer Science, Victoria University of Wellington, Wellington, New Zealand
6.School of Information Science and Technology, ShanghaiTech University, Shanghai, China
Recommended citation:
GB/T 7714
Zhanyu Ma, Yuping Lai, Jiyang Xie, et al. Dirichlet Process Mixture of Generalized Inverted Dirichlet Distributions for Positive Vector Data With Extended Variational Inference[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33(11).
APA Zhanyu Ma, Yuping Lai, Jiyang Xie, Deyu Meng, W. Bastiaan Kleijn, ... & Jingyi Yu. (2022). Dirichlet Process Mixture of Generalized Inverted Dirichlet Distributions for Positive Vector Data With Extended Variational Inference. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 33(11).
MLA Zhanyu Ma, et al. "Dirichlet Process Mixture of Generalized Inverted Dirichlet Distributions for Positive Vector Data With Extended Variational Inference". IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 33.11 (2022).
Unless otherwise noted, all content in this system is protected by copyright, with all rights reserved.