Generative Pretrained Structured Transformers: Unsupervised Syntactic Language Models at Scale
Year: 2024
Proceedings: Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics, Vol. 1: Long Papers
Publication Status: Published
Abstract: A syntactic language model (SLM) incrementally generates a sentence with its syntactic tree in a left-to-right manner. We present Generative Pretrained Structured Transformers (GPST), an unsupervised SLM at scale capable of being pre-trained from scratch on raw texts with high parallelism. GPST circumvents the limitations of previous SLMs, such as reliance on gold trees and sequential training. It consists of two components: a usual SLM supervised by a uni-directional language modeling loss, and an additional composition model, which induces syntactic parse trees and computes constituent representations, supervised by a bi-directional language modeling loss. We propose a representation surrogate to enable joint parallel training of the two models in a hard-EM fashion. We pre-train GPST on OpenWebText, a corpus with 9 billion tokens, and demonstrate the superiority of GPST over GPT-2 of a comparable size on numerous tasks covering both language understanding and language generation. Meanwhile, GPST also significantly outperforms existing unsupervised SLMs on left-to-right grammar induction, while achieving a substantial training speedup.
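To make the training scheme in the abstract concrete, below is a minimal, hypothetical sketch of the two-component hard-EM step it describes: a composition model greedily induces a hard binary tree and constituent representations, while a left-to-right language model is trained with the usual unidirectional next-token loss. Every class, function, and hyperparameter name here is an assumption for illustration; this is not the authors' implementation, and the real GPST trains both models jointly in parallel via a representation surrogate rather than looping as below.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, DIM = 1000, 64  # toy sizes, not the paper's configuration

class CompositionModel(nn.Module):
    """Toy stand-in for the composition model: greedily merges adjacent
    spans into a hard binary tree, composing constituent representations
    bottom-up."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        self.compose = nn.Linear(2 * DIM, DIM)
        self.score = nn.Linear(2 * DIM, 1)
        self.out = nn.Linear(DIM, VOCAB)

    def forward(self, tokens):
        spans = [self.embed(t) for t in tokens]   # leaf representations
        merges = []
        while len(spans) > 1:
            pairs = [torch.cat([spans[i], spans[i + 1]])
                     for i in range(len(spans) - 1)]
            scores = torch.stack([self.score(p).squeeze() for p in pairs])
            i = int(scores.argmax())               # hard (argmax) merge: the "hard-E" choice
            spans[i:i + 2] = [torch.tanh(self.compose(pairs[i]))]
            merges.append(i)
        # Crude proxy for the bidirectional LM loss: reconstruct every
        # token from the root constituent representation.
        logits = self.out(spans[0]).expand(len(tokens), -1)
        return merges, F.cross_entropy(logits, torch.stack(tokens))

class LeftToRightLM(nn.Module):
    """Toy stand-in for the generative SLM, trained with the usual
    unidirectional (next-token) loss; tree actions are omitted."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        self.rnn = nn.GRU(DIM, DIM, batch_first=True)
        self.out = nn.Linear(DIM, VOCAB)

    def forward(self, tokens):
        h, _ = self.rnn(self.embed(tokens[:-1]).unsqueeze(0))
        return F.cross_entropy(self.out(h.squeeze(0)), tokens[1:])

# One joint step: induce a hard tree, then optimize both losses together.
comp, slm = CompositionModel(), LeftToRightLM()
opt = torch.optim.Adam(list(comp.parameters()) + list(slm.parameters()), lr=1e-3)
tokens = torch.randint(0, VOCAB, (8,))
merges, bi_loss = comp(list(tokens))
(bi_loss + slm(tokens)).backward()
opt.step()
print("induced merges:", merges)
```

Note that in this simplification the merge scorer receives no gradient through the hard argmax; per the abstract, the actual composition model is supervised by a bi-directional language modeling loss, and the representation surrogate is what allows both components to be trained jointly and in parallel.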
Conference: 62nd Annual Meeting of the Association for Computational Linguistics (ACL) / Student Research Workshop (SRW)
Place of Publication: 209 N Eighth Street, Stroudsburg, PA 18360, USA
Conference Location: Bangkok, Thailand
Conference Dates: August 11-16, 2024
Indexed by: CPCI-S
Language: English
WOS Research Area: Computer Science
WOS Categories: Computer Science, Artificial Intelligence; Computer Science, Interdisciplinary Applications; Computer Science, Theory & Methods
WOS Accession Number: WOS:001356729802043
Publisher: Association for Computational Linguistics (ACL)
Document Type: Conference Paper
Identifier: https://kms.shanghaitech.edu.cn/handle/2MSLDSTB/372963
Collections: School of Information Science and Technology_Master's Students
School of Information Science and Technology_PI Research Groups_Kewei Tu's Group
School of Information Science and Technology_Undergraduates
Corresponding Authors: Wu, Wei; Tu, Kewei
Author Affiliations:
1. Ant Group, Hangzhou, People's Republic of China
2. ShanghaiTech University, Shanghai, People's Republic of China
Corresponding Author Affiliation: ShanghaiTech University
Recommended Citation (GB/T 7714):
Hu, Xiang, Ji, Pengyu, Zhu, Qingyang, et al. Generative Pretrained Structured Transformers: Unsupervised Syntactic Language Models at Scale[C]. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024.