Browse Items

Browse/Search Results: 127 items in total, showing items 1-10

Layer-Condensed Key-Value Cache Method, System, Device, and Medium for Large Language Models  Patent
Application No.: CN202410423760.7, Filing Date: 2024-06-25, Type: Invention Application, Status: Under Substantive Examination
Inventors:  屠可伟;  吴昊一
Adobe PDF(731Kb)  |  Views/Downloads: 25/0  |  Submitted: 2024/06/25
Sparser is Faster and Less is More: Efficient Sparse Attention for Long-Range Transformers  Preprint
2024
Authors:  Lou, Chao;  Jia, Zixia;  Zheng, Zilong;  Tu, Kewei
Adobe PDF(2152Kb)  |  Views/Downloads: 8/1  |  Submitted: 2024/07/08
Layer-Condensed KV Cache for Efficient Inference of Large Language Models  Preprint
2024
Authors:  Wu, Haoyi;  Tu, Kewei
Adobe PDF(590Kb)  |  Views/Downloads: 22/1  |  Submitted: 2024/06/17
Unsupervised Training Method and Apparatus for Syntactic Language Models  Patent
Application No.: CN202410296243.8, Filing Date: 2024-05-10, Type: Invention Application, Status: Under Substantive Examination
Inventors:  胡翔;  武威;  屠可伟
Adobe PDF(829Kb)  |  Views/Downloads: 53/0  |  Submitted: 2024/05/10
Potential and Limitations of LLMs in Capturing Structured Semantics: A Case Study on SRL  Preprint
2024
Authors:  Cheng, Ning;  Yan, Zhaohui;  Wang, Ziming;  Li, Zhijie;  Yu, Jiaming
Adobe PDF(680Kb)  |  Views/Downloads: 23/1  |  Submitted: 2024/06/17
Improving Retrieval Augmented Open-Domain Question-Answering with Vectorized Contexts  Preprint
2024
Authors:  Chen, Zhuo;  Wang, Xinyu;  Jiang, Yong;  Xie, Pengjun;  Huang, Fei
Views/Downloads: 34/0  |  Submitted: 2024/05/15
Using Interpretation Methods for Model Enhancement  Preprint
2024
Authors:  Chen, Zhuo;  Jiang, Chengyue;  Tu, Kewei
Views/Downloads: 33/0  |  Submitted: 2024/05/15
SeqGPT: An Out-of-the-Box Large Language Model for Open Domain Sequence Understanding  Conference Paper
PROCEEDINGS OF THE AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, Vancouver, BC, Canada, February 20, 2024 - February 27, 2024
Authors:  Yu, Tianyu;  Jiang, Chengyue;  Lou, Chao;  Huang, Shen;  Wang, Xiaobin
Adobe PDF(380Kb)  |  Views/Downloads: 55/0  |  Submitted: 2024/04/26
Frame Semantic Role Labeling Using Arbitrary-Order Conditional Random Fields  Conference Paper
PROCEEDINGS OF THE AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, Vancouver, BC, Canada, February 20, 2024 - February 27, 2024
Authors:  Ai, Chaoyi;  Tu, Kewei
Adobe PDF(615Kb)  |  Views/Downloads: 68/4  |  Submitted: 2024/04/26
Generative Pretrained Structured Transformers: Unsupervised Syntactic Language Models at Scale  Preprint
2024
Authors:  Hu, Xiang;  Ji, Pengyu;  Zhu, Qingyang;  Wu, Wei;  Tu, Kewei
Adobe PDF(2863Kb)  |  Views/Downloads: 42/1  |  Submitted: 2024/05/15