ShanghaiTech University Knowledge Management System
Title | Generative Pretrained Structured Transformers: Unsupervised Syntactic Language Models at Scale
Year | 2024
Proceedings Title | PROCEEDINGS OF THE ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS
ISSN | 0736-587X |
Volume | 1
Pages | 2640-2657
Publication Status | Published
Abstract | A syntactic language model (SLM) incrementally generates a sentence with its syntactic tree in a left-to-right manner. We present Generative Pretrained Structured Transformers (GPST), an unsupervised SLM at scale capable of being pre-trained from scratch on raw texts with high parallelism. GPST circumvents the limitations of previous SLMs such as relying on gold trees and sequential training. It consists of two components, a usual SLM supervised by a uni-directional language modeling loss, and an additional composition model, which induces syntactic parse trees and computes constituent representations, supervised by a bi-directional language modeling loss. We propose a representation surrogate to enable joint parallel training of the two models in a hard-EM fashion. We pre-train GPST on OpenWebText, a corpus with 9 billion tokens, and demonstrate the superiority of GPST over GPT-2 with a comparable size in numerous tasks covering both language understanding and language generation. Meanwhile, GPST also significantly outperforms existing unsupervised SLMs on left-to-right grammar induction, while holding a substantial acceleration on training. © 2024 Association for Computational Linguistics.
Proceedings Editors / Conference Sponsors | Apple; et al.; Google DeepMind; LG AI Research; Meta AI; NewsBreak
Keywords | Context sensitive grammars; Distribution transformers; Syntactics; Trees (mathematics); Bi-directional; Composition modeling; Language generation; Language model; Language understanding; Parallel training; Syntactic languages; Syntactic parse tree; Syntactic trees; Two-component
Conference Name | 62nd Annual Meeting of the Association for Computational Linguistics, ACL 2024
Conference Location | Bangkok, Thailand
Conference Dates | August 11, 2024 - August 16, 2024
Indexed By | EI
Language | English
Publisher | Association for Computational Linguistics (ACL)
EI Accession Number | 20243917091251
EI Controlled Terms | Modeling languages
EI Classification Codes | 1102.1; 1106.4; 1201.8; 706.1.2 Electric Power Distribution; 706.2 Electric Power Lines and Equipment
Original Document Type | Conference article (CA)
Document Type | Conference Paper
Identifier | https://kms.shanghaitech.edu.cn/handle/2MSLDSTB/430532
Collections | School of Information Science and Technology_Undergraduates; School of Information Science and Technology_PI Research Group_Kewei Tu Group; School of Information Science and Technology_Master's Students
Corresponding Authors | Wu, Wei; Tu, Kewei
Author Affiliations | 1. Ant Group, China; 2. ShanghaiTech University, China
Corresponding Author Affiliation | ShanghaiTech University
Recommended Citation (GB/T 7714) | Hu, Xiang, Ji, Pengyu, Zhu, Qingyang, et al. Generative Pretrained Structured Transformers: Unsupervised Syntactic Language Models at Scale[C]//Apple, et al., Google DeepMind, LG AI Research, Meta AI, NewsBreak: Association for Computational Linguistics (ACL), 2024: 2640-2657.
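Note | The abstract above describes two jointly trained components: a left-to-right syntactic language model with a uni-directional language-modeling loss and a composition model with a bi-directional loss, optimized together in a hard-EM fashion. The following is a minimal, hypothetical sketch of that kind of two-loss joint update, not the authors' implementation; every model, name, and dimension is an illustrative placeholder (simple GRU stand-ins rather than structured Transformers, with no parse-tree induction or representation surrogate).

    # Hypothetical sketch of a GPST-style two-loss joint update (all names illustrative).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TinyCausalLM(nn.Module):
        # Stand-in for the generative SLM branch, trained with a uni-directional LM loss.
        def __init__(self, vocab_size=1000, dim=64):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, dim)
            self.rnn = nn.GRU(dim, dim, batch_first=True)
            self.head = nn.Linear(dim, vocab_size)

        def forward(self, tokens):
            hidden, _ = self.rnn(self.embed(tokens))
            return self.head(hidden)

    class TinyComposition(nn.Module):
        # Stand-in for the composition model, trained with a bi-directional LM loss.
        def __init__(self, vocab_size=1000, dim=64):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, dim)
            self.encoder = nn.GRU(dim, dim, batch_first=True, bidirectional=True)
            self.head = nn.Linear(2 * dim, vocab_size)

        def forward(self, tokens):
            hidden, _ = self.encoder(self.embed(tokens))
            return self.head(hidden)

    def joint_step(tokens, lm, comp, optimizer):
        # One joint update: both losses are summed and backpropagated in a single pass,
        # standing in for the parallel joint training described in the abstract.
        optimizer.zero_grad()
        lm_logits = lm(tokens[:, :-1])            # next-token prediction (uni-directional)
        lm_loss = F.cross_entropy(lm_logits.reshape(-1, lm_logits.size(-1)),
                                  tokens[:, 1:].reshape(-1))
        comp_logits = comp(tokens)                # token reconstruction (bi-directional)
        comp_loss = F.cross_entropy(comp_logits.reshape(-1, comp_logits.size(-1)),
                                    tokens.reshape(-1))
        loss = lm_loss + comp_loss
        loss.backward()
        optimizer.step()
        return loss.item()

    if __name__ == "__main__":
        lm, comp = TinyCausalLM(), TinyComposition()
        opt = torch.optim.Adam(list(lm.parameters()) + list(comp.parameters()), lr=1e-3)
        batch = torch.randint(0, 1000, (4, 16))   # toy token batch
        print(joint_step(batch, lm, comp, opt))

The sketch only illustrates that both objectives can be optimized over the same batch in one backward pass; how GPST actually couples the two models (tree induction, constituent representations, the representation surrogate) is described in the paper itself.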