Generative Pretrained Structured Transformers: Unsupervised Syntactic Language Models at Scale
2024
Proceedings title: PROCEEDINGS OF THE ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS
ISSN: 0736-587X
Volume: 1
Pages: 2640-2657
Publication status: Published
Abstract

A syntactic language model (SLM) incrementally generates a sentence together with its syntactic tree in a left-to-right manner. We present Generative Pretrained Structured Transformers (GPST), an unsupervised SLM at scale capable of being pre-trained from scratch on raw texts with high parallelism. GPST circumvents the limitations of previous SLMs, such as reliance on gold trees and sequential training. It consists of two components: a standard SLM supervised by a uni-directional language modeling loss, and an additional composition model, which induces syntactic parse trees and computes constituent representations, supervised by a bi-directional language modeling loss. We propose a representation surrogate to enable joint parallel training of the two models in a hard-EM fashion. We pre-train GPST on OpenWebText, a corpus with 9 billion tokens, and demonstrate the superiority of GPST over GPT-2 of comparable size on numerous tasks covering both language understanding and language generation. Meanwhile, GPST also significantly outperforms existing unsupervised SLMs on left-to-right grammar induction, while achieving a substantial acceleration in training. © 2024 Association for Computational Linguistics.
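
As a reading aid for the abstract's two-component description, below is a minimal Python sketch of the hard-EM idea: a composition model scores and composes adjacent spans to induce a binary tree (E-step), after which the induced constituents would supervise the uni-directional SLM loss and the bi-directional composition loss (M-step). The class name CompositionModel, the greedy merge heuristic, and all dimensions are illustrative assumptions, not the authors' implementation; the paper's representation surrogate and parallel training scheme are only summarized in comments.

import torch
import torch.nn as nn

class CompositionModel(nn.Module):
    # Hypothetical stand-in for the paper's composition model: it scores adjacent
    # spans and composes two child representations into one constituent vector.
    def __init__(self, dim):
        super().__init__()
        self.compose = nn.Sequential(nn.Linear(2 * dim, dim), nn.Tanh())
        self.score = nn.Linear(dim, 1)

    def induce_tree(self, token_vecs):
        # Greedy bottom-up induction (an assumption, not the paper's parser):
        # repeatedly merge the adjacent pair with the highest composition score.
        nodes = [v for v in token_vecs]              # leaf representations
        spans = [(i, i) for i in range(len(nodes))]  # token index spans
        merges = []
        while len(nodes) > 1:
            cands = [self.compose(torch.cat([nodes[i], nodes[i + 1]], -1))
                     for i in range(len(nodes) - 1)]
            scores = torch.stack([self.score(c).squeeze(-1) for c in cands])
            k = int(scores.argmax())
            merges.append((spans[k], spans[k + 1]))  # record the chosen merge
            nodes[k:k + 2] = [cands[k]]              # replace the pair by its composition
            spans[k:k + 2] = [(spans[k][0], spans[k + 1][1])]
        return merges, nodes[0]                      # induced tree + root representation

# Toy usage under the same assumptions: induce a tree over 5 random "token" vectors.
comp = CompositionModel(dim=16)
tree, root = comp.induce_tree(torch.randn(5, 16))

# Hard-EM loop as described in the abstract (schematically):
#   E-step: induce trees with the composition model (no gradient).
#   M-step: train the SLM with a uni-directional LM loss and the composition model
#           with a bi-directional LM loss on the induced trees; the representation
#           surrogate lets the SLM avoid waiting on sequential composition, which is
#           what enables joint parallel training of the two models.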

Proceedings editors / conference sponsors: Apple; Google DeepMind; LG AI Research; Meta AI; NewsBreak; et al.
Keywords: Context sensitive grammars; Distribution transformers; Syntactics; Trees (mathematics); Bi-directional; Composition modeling; Language generation; Language model; Language understanding; Parallel training; Syntactic languages; Syntactic parse tree; Syntactic trees; Two-component
Conference: 62nd Annual Meeting of the Association for Computational Linguistics, ACL 2024
Conference location: Bangkok, Thailand
Conference dates: August 11, 2024 - August 16, 2024
Indexed in: EI
Language: English
Publisher: Association for Computational Linguistics (ACL)
EI accession number: 20243917091251
EI controlled terms: Modeling languages
EI classification codes: 1102.1; 1106.4; 1201.8; 706.1.2 Electric Power Distribution; 706.2 Electric Power Lines and Equipment
Original document type: Conference article (CA)
Document type: Conference paper
Item identifier: https://kms.shanghaitech.edu.cn/handle/2MSLDSTB/430532
Collections: School of Information Science and Technology_Undergraduate students
School of Information Science and Technology_PI Research Groups_Kewei Tu Group
School of Information Science and Technology_Master's students
Corresponding authors: Wu, Wei; Tu, Kewei
Author affiliations:
1.Ant Group, China
2.ShanghaiTech University, China
Corresponding author affiliation: ShanghaiTech University
Recommended citation
GB/T 7714
Hu, Xiang, Ji, Pengyu, Zhu, Qingyang, et al. Generative Pretrained Structured Transformers: Unsupervised Syntactic Language Models at Scale[C]//Apple, et al., Google DeepMind, LG AI Research, Meta AI, NewsBreak: Association for Computational Linguistics (ACL), 2024: 2640-2657.