A Systematic Study of Cross-Layer KV Sharing for Efficient LLM Inference. Preprint, 2024. Authors: Wu, You; Wu, Haoyi; Tu, Kewei. Submitted: 2024/11/11

Learning Robust Named Entity Recognizers From Noisy Data With Retrieval Augmentation. Preprint, 2024. Authors: Ai, Chaoyi; Jiang, Yong; Huang, Shen; Xie, Pengjun; Tu, Kewei. Submitted: 2024/08/14

Dependency Transformer Grammars: Integrating Dependency Structures into Transformer Language Models. Preprint, 2024. Authors: Zhao, Yida; Lou, Chao; Tu, Kewei. Submitted: 2024/08/14

Layer-Condensed Key-Value Cache Method, System, Device, and Medium for Large Language Models. Patent, application no. CN202410423760.7, filed 2024-06-25, type: invention application, status: under substantive examination. Inventors: 屠可伟 (Tu, Kewei); 吴昊一 (Wu, Haoyi). Submitted: 2024/06/25

Sparser is Faster and Less is More: Efficient Sparse Attention for Long-Range Transformers. Preprint, 2024. Authors: Lou, Chao; Jia, Zixia; Zheng, Zilong; Tu, Kewei. Submitted: 2024/07/08

Layer-Condensed KV Cache for Efficient Inference of Large Language Models. Preprint, 2024. Authors: Wu, Haoyi; Tu, Kewei. Submitted: 2024/06/17

Unsupervised Training Method and Apparatus for Syntactic Language Models. Patent, application no. CN202410296243.8, filed 2024-05-10, type: invention application, status: under substantive examination. Inventors: 胡翔 (Hu, Xiang); 武威 (Wu, Wei); 屠可伟 (Tu, Kewei). Submitted: 2024/05/10

Potential and Limitations of LLMs in Capturing Structured Semantics: A Case Study on SRL. Preprint, 2024. Authors: Cheng, Ning; Yan, Zhaohui; Wang, Ziming; Li, Zhijie; Yu, Jiaming. Submitted: 2024/06/17

Using Interpretation Methods for Model Enhancement. Preprint, 2024. Authors: Chen, Zhuo; Jiang, Chengyue; Tu, Kewei. Submitted: 2024/05/15

Frame Semantic Role Labeling Using Arbitrary-Order Conditional Random Fields. Conference paper, Proceedings of the AAAI Conference on Artificial Intelligence, Vancouver, BC, Canada, February 20-27, 2024. Authors: Ai, Chaoyi; Tu, Kewei. Submitted: 2024/04/26