Pedro Henrique Martins, Zita Marinho, André F. T. Martins
Transformers are unable to model long-term memories effectively, since the amount of computation they need to perform grows with the context length. While variations of efficient transformers have been proposed, they all have a finite memory capacity and are forced to drop old information. In this paper, we propose the $\infty$-former, which extends the vanilla transformer with an unbounded long-term memory. By making use of a continuous-space attention mechanism to attend over the long-term memory, the $\infty$-former's attention complexity becomes independent of the context length, trading off memory length with precision. In order to control where precision is more important, the $\infty$-former maintains "sticky memories", allowing it to model arbitrarily long contexts while keeping the computation budget fixed. Experiments on a synthetic sorting task, language modeling, and document-grounded dialogue generation demonstrate the $\infty$-former's ability to retain information from long sequences.
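The abstract's two key ideas, compressing the long-term memory into a continuous signal and reading from it with a continuous attention density, can be made concrete. Below is a minimal NumPy sketch, not the paper's implementation: the function names, the Gaussian RBF width, the ridge penalty, and the numerical quadrature in `continuous_attention` are illustrative assumptions (the actual model learns $\mu$ and $\sigma$ from the queries, and the Gaussian integral over RBFs can be evaluated in closed form).

```python
import numpy as np

def rbf_basis(t, n_basis, width=0.05):
    """Evaluate n_basis Gaussian radial basis functions on [0, 1] at positions t."""
    centers = np.linspace(0.0, 1.0, n_basis)  # basis function centers
    return np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))  # (len(t), N)

def fit_continuous_memory(X, n_basis, ridge=1e-4):
    """Compress a length-L sequence of d-dim vectors into n_basis coefficient rows
    via ridge regression, so memory cost is O(n_basis) regardless of L."""
    L = X.shape[0]
    t = (np.arange(L) + 0.5) / L                  # map token positions into (0, 1)
    F = rbf_basis(t, n_basis)                     # (L, N) design matrix
    # B = (F^T F + ridge * I)^{-1} F^T X, shape (N, d)
    return np.linalg.solve(F.T @ F + ridge * np.eye(n_basis), F.T @ X)

def continuous_attention(B, mu, sigma, n_samples=1000):
    """Read from the continuous memory with a Gaussian attention density:
    c = E_{t ~ N(mu, sigma^2)}[x(t)], where x(t) = B^T psi(t).
    The expectation is approximated here on a fixed grid."""
    t = np.linspace(0.0, 1.0, n_samples)
    density = np.exp(-((t - mu) ** 2) / (2 * sigma ** 2))
    density /= density.sum()                      # normalize over the grid
    psi = rbf_basis(t, B.shape[0])                # (n_samples, N)
    return (density @ psi) @ B                    # (d,) context vector

# Usage: a 10,000-token "memory" is stored with only 64 coefficient rows.
rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 16))                 # token representations (L, d)
B = fit_continuous_memory(X, n_basis=64)
c = continuous_attention(B, mu=0.8, sigma=0.05)   # attend near the recent end
print(B.shape, c.shape)                           # (64, 16) (16,)
```

The sketch illustrates the complexity claim: `B` has a fixed number of rows (`n_basis`), so storing the memory and attending over it cost the same whether the original context had ten thousand tokens or ten million. Sticky memories then control where precision is spent, biasing which regions of $[0,1]$ keep high resolution when the signal is re-fit as new tokens arrive.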
| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Language Modelling | WikiText-103 | Test perplexity | 16.61 | ∞-former (Sticky memories + initialized GPT-2 Small) |
| Language Modelling | WikiText-103 | Test perplexity | 16.64 | ∞-former (initialized GPT-2 Small) |
| Language Modelling | WikiText-103 | Test perplexity | 24.22 | ∞-former (Sticky memories) |
| Language Modelling | PG-19 | Perplexity | 32.48 | ∞-former (Sticky memories + initialized GPT-2 Small) |
| Dialogue Generation | CMU-DoG | F1 | 9.01 | ∞-former (Sticky memories) |
| Dialogue Generation | CMU-DoG | METEOR | 7.55 | ∞-former (Sticky memories) |
| Dialogue Generation | CMU-DoG | ROUGE-1 | 15.37 | ∞-former (Sticky memories) |
| Dialogue Generation | CMU-DoG | ROUGE-L | 12.56 | ∞-former (Sticky memories) |