Metric: In-domain (higher is better)
| # | Model | In-domain | Extra Data | Paper | Date |
|---|---|---|---|---|---|
| 1 | BERT Large Augmented (single model) | 82.5 | No | BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | 2018-10-11 |
| 2 | BERT-base finetune (single model) | 79.8 | No | BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | 2018-10-11 |
| 3 | BiDAF++ (single model) | 69.4 | No | A Qualitative Comparison of CoQA, SQuAD 2.0 and QuAC | 2018-09-27 |
| 4 | DrQA + seq2seq with copy attention (single model) | 67.0 | No | CoQA: A Conversational Question Answering Challenge | 2018-08-21 |
| 5 | Vanilla DrQA (single model) | 54.5 | No | CoQA: A Conversational Question Answering Challenge | 2018-08-21 |
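CoQA's in-domain score is a word-level F1 between predicted and reference answers. A minimal sketch of the per-answer token F1 (simplified: whitespace tokenization only, omitting CoQA's normalization of case, punctuation, and articles, and its averaging over multiple references):

```python
from collections import Counter

def token_f1(prediction: str, ground_truth: str) -> float:
    """Word-overlap F1 between a predicted and a reference answer."""
    pred_tokens = prediction.split()
    gt_tokens = ground_truth.split()
    # Multiset intersection counts each shared token at most
    # as often as it appears in both strings.
    common = Counter(pred_tokens) & Counter(gt_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(gt_tokens)
    return 2 * precision * recall / (precision + recall)
```

For example, `token_f1("in the park", "the park")` yields 0.8 (precision 2/3, recall 1); the leaderboard score is this value averaged over all questions, scaled to 0-100.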