ShanghaiTech University Knowledge Management System
Title | Dependency grammar induction with neural lexicalization and big training data
Year | 2017
Proceedings | 2017 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING, EMNLP 2017
Pages | 1683-1688
Publication Status | Published
Abstract | We study the impact of big models (in terms of the degree of lexicalization) and big data (in terms of the training corpus size) on dependency grammar induction. We experimented with L-DMV, a lexicalized version of the Dependency Model with Valence (Klein and Manning, 2004), and L-NDMV, our lexicalized extension of the Neural Dependency Model with Valence (Jiang et al., 2016). We find that L-DMV only benefits from very small degrees of lexicalization and moderate sizes of training corpora. L-NDMV can benefit from big training data and greater degrees of lexicalization, especially when enhanced with good model initialization, and it achieves a result that is competitive with the current state of the art. © 2017 Association for Computational Linguistics.
Conference Location | Copenhagen, Denmark
Indexed By | EI
Funding | National Natural Science Foundation of China [61503248]
Publisher | Association for Computational Linguistics (ACL)
EI Accession Number | 20194207538744
EI Keywords | Computational grammars ; Linguistics
EI Classification Codes | Computer Theory, Includes Formal Logic, Automata Theory, Switching Theory, Programming Theory: 721.1 ; Data Processing and Image Processing: 723.2
Original Document Type | Conference article (CA)
Document Type | Conference paper
Item Identifier | https://kms.shanghaitech.edu.cn/handle/2MSLDSTB/29237
Collections | School of Information Science and Technology_PhD Students ; School of Information Science and Technology_PI Research Groups_Kewei Tu Group
Author Affiliation | School of Information Science and Technology, ShanghaiTech University, Shanghai, China
First Author Affiliation | School of Information Science and Technology
First Author's Primary Affiliation | School of Information Science and Technology
Recommended Citation (GB/T 7714) | Han, Wenjuan, Jiang, Yong, Tu, Kewei. Dependency grammar induction with neural lexicalization and big training data[C]. Copenhagen, Denmark: Association for Computational Linguistics (ACL), 2017: 1683-1688.