Augmenting Transformers with Recursively Composed Multi-grained Representations
2024-03-12
Status: Published
Abstract

We present ReCAT, a recursive composition augmented Transformer that is able to explicitly model hierarchical syntactic structures of raw texts without relying on gold trees during either learning or inference. Existing research along this line restricts data to follow a hierarchical tree structure and thus lacks inter-span communication. To overcome the problem, we propose a novel contextual inside-outside (CIO) layer that learns contextualized representations of spans through bottom-up and top-down passes: a bottom-up pass forms representations of high-level spans by composing low-level spans, while a top-down pass combines information inside and outside a span. By stacking several CIO layers between the embedding layer and the attention layers of a Transformer, the ReCAT model can perform both deep intra-span and deep inter-span interactions, and thus generates multi-grained representations fully contextualized with other spans. Moreover, the CIO layers can be jointly pre-trained with Transformers, allowing ReCAT to enjoy scalability, strong performance, and interpretability at the same time. We conduct experiments on various sentence-level and span-level tasks. Evaluation results indicate that ReCAT significantly outperforms vanilla Transformer models on all span-level tasks, and outperforms baselines that combine recursive networks with Transformers on natural language inference tasks. More interestingly, the hierarchical structures induced by ReCAT exhibit strong consistency with human-annotated syntactic trees, indicating the good interpretability brought by the CIO layers.
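Since the abstract's description of the CIO layer is the technical core of the work, a minimal sketch may help make the bottom-up/top-down idea concrete. The following PyTorch sketch is illustrative only: the class and method names (CIOSketch, compose, fuse) are hypothetical, the bottom-up pass averages over all split points rather than using the learned split scoring of the paper, and the top-down (outside) pass is collapsed to a single root context. It is a sketch under these assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class CIOSketch(nn.Module):
    """Toy inside-outside pass: bottom-up span composition, then a
    (heavily simplified) top-down fusion of inside and outside context."""

    def __init__(self, d_model: int):
        super().__init__()
        # Composition function: maps (left child, right child) -> parent span.
        self.compose = nn.Sequential(nn.Linear(2 * d_model, d_model), nn.GELU())
        # Fusion of inside (within-span) and outside (surrounding) information.
        self.fuse = nn.Linear(2 * d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (seq_len, d_model) token embeddings for one sentence.
        n, _ = x.shape
        # inside[(i, j)] holds the representation of span [i, j).
        inside = {(i, i + 1): x[i] for i in range(n)}
        # Bottom-up pass: build every span from its possible split points.
        # (The real model scores splits; here we simply average them.)
        for width in range(2, n + 1):
            for i in range(n - width + 1):
                j = i + width
                candidates = [self.compose(torch.cat([inside[(i, k)], inside[(k, j)]]))
                              for k in range(i + 1, j)]
                inside[(i, j)] = torch.stack(candidates).mean(dim=0)
        # Top-down pass, collapsed to the root: use the whole-sentence inside
        # representation as a shared outside context for every token.
        root_outside = inside[(0, n)]
        return torch.stack([self.fuse(torch.cat([inside[(i, i + 1)], root_outside]))
                            for i in range(n)])

# Usage: CIOSketch(64)(torch.randn(5, 64)) returns a (5, 64) tensor of
# contextualized token states that could then feed a standard Transformer encoder.

In the full model, the multi-grained span representations produced by the stacked CIO layers are what the subsequent Transformer attention layers operate over, which is how ReCAT adds deep inter-span interaction on top of the intra-span composition sketched above.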

DOI: arXiv:2309.16319
Source: arXiv
WOS Record No.: PPRN:85321786
WOS Categories: Computer Science, Artificial Intelligence; Computer Science, Interdisciplinary Applications
Document Type: Preprint
Item Identifier: https://kms.shanghaitech.edu.cn/handle/2MSLDSTB/372970
Collections: School of Information Science and Technology_Undergraduates
School of Information Science and Technology_PI Research Groups_Kewei Tu's Group
Corresponding Authors: Tu, Kewei; Wu, Wei
Author Affiliations:
1. Ant Group, Hangzhou, People's Republic of China
2. ShanghaiTech University, Shanghai, People's Republic of China
Recommended Citation (GB/T 7714):
Hu, Xiang, Zhu, Qingyang, Tu, Kewei, et al. Augmenting Transformers with Recursively Composed Multi-grained Representations. 2024.