ShanghaiTech University Knowledge Management System
Generative Pretrained Structured Transformers: Unsupervised Syntactic Language Models at Scale
2024
Conference Proceedings | PROCEEDINGS OF THE 62ND ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1: LONG PAPERS
Publication Status | Published
Abstract | A syntactic language model (SLM) incrementally generates a sentence with its syntactic tree in a left-to-right manner. We present Generative Pretrained Structured Transformers (GPST), an unsupervised SLM at scale capable of being pre-trained from scratch on raw texts with high parallelism. GPST circumvents the limitations of previous SLMs such as relying on gold trees and sequential training. It consists of two components: a usual SLM supervised by a uni-directional language modeling loss, and an additional composition model, which induces syntactic parse trees and computes constituent representations, supervised by a bi-directional language modeling loss. We propose a representation surrogate to enable joint parallel training of the two models in a hard-EM fashion. We pre-train GPST on OpenWebText, a corpus with 9 billion tokens, and demonstrate the superiority of GPST over GPT-2 of a comparable size on numerous tasks covering both language understanding and language generation. Meanwhile, GPST also significantly outperforms existing unsupervised SLMs on left-to-right grammar induction, while achieving a substantial acceleration in training.
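The abstract describes a two-part objective: the generative SLM is trained with a uni-directional (causal) language modeling loss, while the composition model is trained with a bi-directional language modeling loss, and the two are optimized jointly. The sketch below is only a minimal, hypothetical illustration of combining a causal loss with a masked bi-directional auxiliary loss in PyTorch; all class names and hyperparameters are assumptions, and it deliberately omits GPST's actual tree induction, constituent composition, and representation surrogate, which are specified in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoComponentSketch(nn.Module):
    """Toy stand-in for GPST's two components: a uni-directional
    generative LM and a bi-directional composition-side model."""

    def __init__(self, vocab_size=1000, d_model=128):
        super().__init__()
        self.mask_id = vocab_size                       # extra id acts as [MASK]
        self.embed = nn.Embedding(vocab_size + 1, d_model)
        self.generator = nn.TransformerEncoder(         # causal stand-in
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True), 2)
        self.composer = nn.TransformerEncoder(          # bi-directional stand-in
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True), 2)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def loss(self, tokens):
        # Uni-directional LM loss: predict token t from tokens < t.
        causal = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
        h = self.generator(self.embed(tokens), mask=causal)
        uni = F.cross_entropy(self.lm_head(h[:, :-1]).flatten(0, 1),
                              tokens[:, 1:].flatten())
        # Bi-directional loss: mask ~15% of tokens and predict them from both
        # sides (a masked-LM placeholder for GPST's composition-model loss,
        # which actually conditions on induced constituents).
        keep = torch.rand(tokens.shape) > 0.15
        masked = torch.where(keep, tokens, torch.full_like(tokens, self.mask_id))
        h2 = self.composer(self.embed(masked))
        per_tok = F.cross_entropy(self.lm_head(h2).flatten(0, 1),
                                  tokens.flatten(), reduction="none")
        bi = per_tok[~keep.flatten()].mean()
        return uni + bi

model = TwoComponentSketch()
print(model.loss(torch.randint(0, 1000, (2, 16))))  # scalar joint loss
```

In the paper the two losses are coupled through shared structure rather than merely summed: the composition model's induced trees and constituent representations feed the generative SLM, and the representation surrogate is what makes that joint training parallelizable in a hard-EM fashion.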
Conference Name | 62nd Annual Meeting of the Association for Computational Linguistics (ACL) / Student Research Workshop (SRW)
Place of Publication | 209 N Eighth Street, Stroudsburg, PA 18360, USA
Conference Venue | Bangkok, Thailand
Conference Dates | August 11-16, 2024
Indexed By | CPCI-S
Language | English
WOS Research Area | Computer Science
WOS Categories | Computer Science, Artificial Intelligence; Computer Science, Interdisciplinary Applications; Computer Science, Theory & Methods
WOS Record Number | WOS:001356729802043
Publisher | Association for Computational Linguistics (ACL)
Document Type | Conference Paper
Item Identifier | https://kms.shanghaitech.edu.cn/handle/2MSLDSTB/372963
Collections | School of Information Science and Technology_Master's Students; School of Information Science and Technology_PI Research Groups_Kewei Tu's Group; School of Information Science and Technology_Undergraduates
Corresponding Authors | Wu, Wei; Tu, Kewei
Author Affiliations | 1. Ant Group, Hangzhou, China; 2. ShanghaiTech University, Shanghai, China
Corresponding Author Affiliation | ShanghaiTech University
Recommended Citation (GB/T 7714) | Hu, Xiang, Ji, Pengyu, Zhu, Qingyang, et al. Generative Pretrained Structured Transformers: Unsupervised Syntactic Language Models at Scale[C]. Stroudsburg, PA: Association for Computational Linguistics, 2024.