20240131 Paper Report: Prefix-Tuning: Optimizing Continuous Prompts for Generation (2021)

Talk Title: Prefix-Tuning: Optimizing Continuous Prompts for Generation (2021)

Published in: ACL 2021

Authors: Xiang Lisa Li, Percy Liang

Affiliation: Stanford University

Presenter: 胡华彦

Date: January 31, 2024

Location: Conference Room 621, Boxue Building

Abstract: Fine-tuning is the de facto way to leverage large pretrained language models to perform downstream tasks. However, it modifies all the language model parameters and therefore necessitates storing a full copy for each task. In this paper, we propose prefix-tuning, a lightweight alternative to fine-tuning for natural language generation tasks, which keeps language model parameters frozen, but optimizes a small continuous task-specific vector (called the prefix). Prefix-tuning draws inspiration from prompting, allowing subsequent tokens to attend to this prefix as if it were virtual tokens. We apply prefix-tuning to GPT-2 for table-to-text generation and to BART for summarization. We find that by learning only 0.1% of the parameters, prefix-tuning obtains comparable performance in the full data setting, outperforms fine-tuning in low-data settings, and extrapolates better to examples with topics unseen during training.
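To make the mechanism concrete, below is a minimal sketch in Python (PyTorch plus the Hugging Face transformers library, not the authors' code): all GPT-2 parameters are frozen and only a small prefix of continuous "virtual token" embeddings is trained. Note that the paper injects the prefix as key-value activations at every transformer layer, with an MLP reparameterization for stable training; the embedding-level variant here, and choices such as prefix_len = 10 and the sample input, are simplifying assumptions for illustration only.

import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

model = GPT2LMHeadModel.from_pretrained("gpt2")
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")

# Freeze every language-model parameter: full fine-tuning is replaced by prefix-tuning.
for p in model.parameters():
    p.requires_grad = False

prefix_len = 10                      # number of virtual tokens (hypothetical choice)
embed_dim = model.config.n_embd
# The only trainable parameters: a small continuous, task-specific prefix.
prefix = torch.nn.Parameter(torch.randn(prefix_len, embed_dim) * 0.02)
optimizer = torch.optim.AdamW([prefix], lr=5e-5)

def training_step(text: str) -> float:
    enc = tokenizer(text, return_tensors="pt")
    tok_embeds = model.transformer.wte(enc.input_ids)          # (1, T, d)
    # Prepend the prefix so subsequent tokens attend to it as virtual tokens.
    inputs_embeds = torch.cat([prefix.unsqueeze(0), tok_embeds], dim=1)
    # Ignore the loss at prefix positions (-100 is the ignore index).
    labels = torch.cat(
        [torch.full((1, prefix_len), -100, dtype=torch.long), enc.input_ids], dim=1
    )
    out = model(inputs_embeds=inputs_embeds, labels=labels)
    out.loss.backward()              # gradients flow only into the prefix
    optimizer.step()
    optimizer.zero_grad()
    return out.loss.item()

print(training_step("name: Starbucks | type: coffee shop -> Starbucks is a coffee shop."))

Because only the prefix is updated, each downstream task needs to store just these few thousand parameters rather than a full copy of the language model, which is the storage advantage the abstract highlights.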

