ShanghaiTech University Knowledge Management System
Combining generative and discriminative approaches to unsupervised dependency parsing via dual decomposition
2017
Conference proceedings | 2017 Conference on Empirical Methods in Natural Language Processing, EMNLP 2017 |
Pages | 1689-1694 |
Publication status | Published |
Abstract | Unsupervised dependency parsing aims to learn a dependency parser from unannotated sentences. Existing work focuses on either learning generative models using the expectation-maximization algorithm and its variants, or learning discriminative models using the discriminative clustering algorithm. In this paper, we propose a new learning strategy that learns a generative model and a discriminative model jointly based on the dual decomposition method. Our method is simple and general, yet effective in capturing the advantages of both models and improving their learning results. We tested our method on the UD treebank and achieved state-of-the-art performance on thirty languages. © 2017 Association for Computational Linguistics. |
Conference venue | Copenhagen, Denmark |
Indexed by | EI |
Funding project | National Natural Science Foundation of China [61503248] |
Publisher | Association for Computational Linguistics (ACL) |
EI accession number | 20194207538745 |
EI controlled terms | Clustering algorithms ; Image segmentation ; Maximum principle ; Natural language processing systems ; Syntactics |
EI classification codes | Data Processing and Image Processing: 723.2 ; Information Sources and Analysis: 903.1 |
Original document type | Conference article (CA) |
Document type | Conference paper |
Item identifier | https://kms.shanghaitech.edu.cn/handle/2MSLDSTB/29234 |
Collection | School of Information Science and Technology_PhD Students; School of Information Science and Technology_PI Research Groups_Kewei Tu Group |
作者单位 | School of Information Science and Technology, ShanghaiTech University, Shanghai, China |
First author's affiliation | School of Information Science and Technology |
First author's first affiliation | School of Information Science and Technology |
Recommended citation (GB/T 7714) | Jiang, Yong, Han, Wenjuan, Tu, Kewei. Combining generative and discriminative approaches to unsupervised dependency parsing via dual decomposition[C]. Association for Computational Linguistics (ACL), 2017: 1689-1694. |