Report Title: Prefix-Tuning: Optimizing Continuous Prompts for Generation (2021)
Paper Venue: ACL 2021
Authors: Xiang Lisa Li, Percy Liang
Affiliation: Stanford University
Presenter: 胡华彦
Date: January 31, 2024
Location: Conference Room 621, Boxue Building
Abstract: Fine-tuning is the de facto way to leverage large pretrained language models to perform downstream tasks. However, it modifies all the language model parameters and therefore necessitates storing a full copy for each task. In this paper, we propose prefix-tuning, a lightweight alternative to fine-tuning for natural language generation tasks, which keeps language model parameters frozen, but optimizes a small continuous task-specific vector (called the prefix). Prefix-tuning draws inspiration from prompting, allowing subsequent tokens to attend to this prefix as if it were virtual tokens. We apply prefix-tuning to GPT-2 for table-to-text generation and to BART for summarization. We find that by learning only 0.1% of the parameters, prefix-tuning obtains comparable performance in the full data setting, outperforms fine-tuning in low-data settings, and extrapolates better to examples with topics unseen during training.
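To illustrate the core idea of the abstract, the sketch below shows one minimal way to realize prefix-tuning with PyTorch and HuggingFace transformers: the GPT-2 weights are frozen, and only a small set of continuous prefix key/value vectors is trained; these are fed to the model as cached states so that real tokens attend to them as if they were virtual tokens. The class name PrefixTuningGPT2, the prefix length, and the initialization scale are illustrative assumptions, not the paper's exact implementation (which additionally reparameterizes the prefix through an MLP during training), and a transformers version that accepts the legacy past_key_values tuple format is assumed.

```python
import torch
import torch.nn as nn
from transformers import GPT2LMHeadModel


class PrefixTuningGPT2(nn.Module):
    # Minimal prefix-tuning sketch: GPT-2 stays frozen; only the continuous
    # prefix key/value vectors below are optimized for the downstream task.
    def __init__(self, model_name="gpt2", prefix_len=10):
        super().__init__()
        self.lm = GPT2LMHeadModel.from_pretrained(model_name)
        for p in self.lm.parameters():
            p.requires_grad = False  # freeze all language-model parameters

        cfg = self.lm.config
        self.prefix_len = prefix_len
        head_dim = cfg.n_embd // cfg.n_head
        # One trainable key and one value vector per layer, head, and prefix
        # position: the "small continuous task-specific vector" of the abstract.
        self.prefix = nn.Parameter(
            0.02 * torch.randn(cfg.n_layer, 2, cfg.n_head, prefix_len, head_dim)
        )

    def forward(self, input_ids, labels=None):
        bsz = input_ids.size(0)
        # Hand the prefix to GPT-2 as cached key/value states so every real
        # token attends to these positions as if they were virtual tokens.
        past = [
            (layer[0].unsqueeze(0).expand(bsz, -1, -1, -1),
             layer[1].unsqueeze(0).expand(bsz, -1, -1, -1))
            for layer in self.prefix
        ]
        attention_mask = torch.ones(
            bsz, self.prefix_len + input_ids.size(1), device=input_ids.device
        )
        return self.lm(input_ids=input_ids, past_key_values=past,
                       attention_mask=attention_mask, labels=labels)


# Usage: only the prefix parameters receive gradients, a small fraction of
# the full model, consistent with the ~0.1% figure quoted in the abstract.
model = PrefixTuningGPT2()
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable params: {trainable} / {total}")
```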