Jordan Clive, Kris Cao, Marek Rei
Prefix-tuning is a powerful lightweight technique for adapting a large pre-trained language model to a downstream application. However, it learns a single dataset-level prompt that is shared across all examples. We extend this idea and propose a dynamic method, Control Prefixes, which allows the inclusion of conditional, input-dependent information, combining the benefits of prompt tuning and controlled generation. The method incorporates attribute-level learnable representations into different layers of a pre-trained transformer, allowing the generated text to be guided in a particular direction. We provide a systematic evaluation of the technique and apply it to five datasets from the GEM benchmark for natural language generation (NLG). Although the aim is to develop a parameter-efficient model, we show that Control Prefixes can even outperform full fine-tuning. We present state-of-the-art results on several data-to-text datasets, including WebNLG.
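The core mechanism can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the attribute names, prefix lengths, and the use of plain numpy arrays are all assumptions made for clarity. As in prefix-tuning, a shared dataset-level prefix is learned; in addition, each value of a control attribute (e.g. a WebNLG category) has its own smaller attribute-level prefix. At each layer, the prefix prepended to the model's keys/values is the concatenation of the shared prefix and the control prefix selected by the input's attribute label.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not the paper's settings).
N_LAYERS = 2      # transformer layers receiving prefixes
D_MODEL = 8       # hidden size
GENERAL_LEN = 4   # length of the shared, dataset-level prefix
CONTROL_LEN = 2   # length of each attribute-level control prefix

# "Learnable" parameters, randomly initialised here for the sketch.
general_prefix = rng.normal(size=(N_LAYERS, GENERAL_LEN, D_MODEL))
control_prefixes = {
    # Hypothetical attribute labels for illustration.
    "Airport": rng.normal(size=(N_LAYERS, CONTROL_LEN, D_MODEL)),
    "Food":    rng.normal(size=(N_LAYERS, CONTROL_LEN, D_MODEL)),
}

def build_prefix(attribute: str) -> np.ndarray:
    """Per-layer prefix for one input: [shared prefix ; control prefix]."""
    control = control_prefixes[attribute]
    return np.concatenate([general_prefix, control], axis=1)

prefix = build_prefix("Airport")
print(prefix.shape)  # (2, 6, 8): each layer gets GENERAL_LEN + CONTROL_LEN vectors
```

Only the prefix parameters are trained while the pre-trained transformer stays frozen, which is what makes the approach parameter-efficient: inputs with different attribute labels share the same backbone and general prefix but receive different per-layer conditioning.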
| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Data-to-Text Generation | DART | METEOR | 0.411 | Control Prefixes (T5-large) |
| Data-to-Text Generation | WebNLG | BLEU | 67.32 | Control Prefixes (A1, T5-large) |
| Data-to-Text Generation | WebNLG | BLEU | 67.15 | Control Prefixes (A1, A2, T5-large) |
| Data-to-Text Generation | WebNLG Full | BLEU | 62.27 | Control Prefixes (A1, A2, T5-large) |
| Data-to-Text Generation | WebNLG Full | BLEU | 61.94 | Control Prefixes (A1, T5-large) |
| Data-to-Text Generation | Cleaned E2E NLG Challenge | BLEU (test set) | 44.15 | Control Prefixes (T5-large) |
| Text Simplification | TurkCorpus | SARI (EASSE>=0.2.1) | 42.32 | Control Prefixes (BART) |
| Text Simplification | TurkCorpus | FKGL | 7.74 | Control Prefixes (BART) |
| Text Simplification | TurkCorpus | QuestEval (reference-less, BERTScore) | 0.66 | Control Prefixes (BART) |
| Text Simplification | ASSET | SARI (EASSE>=0.2.1) | 43.58 | Control Prefixes (BART) |
| Text Simplification | ASSET | FKGL | 5.97 | Control Prefixes (BART) |
| Text Simplification | ASSET | QuestEval (reference-less, BERTScore) | 0.64 | Control Prefixes (BART) |