Deming Ye, Yankai Lin, Peng Li, Maosong Sun
Recent work on entity and relation extraction has focused on obtaining better span representations from the pre-trained encoder. However, a major limitation of existing approaches is that they ignore the interrelation between spans (pairs). In this work, we propose a novel span representation approach, named Packed Levitated Markers (PL-Marker), to consider the interrelation between the spans (pairs) by strategically packing the markers in the encoder. In particular, we propose a neighborhood-oriented packing strategy, which considers the neighbor spans integrally to better model the entity boundary information. Furthermore, for those more complicated span pair classification tasks, we design a subject-oriented packing strategy, which packs each subject and all its objects to model the interrelation between the same-subject span pairs. The experimental results show that, with the enhanced marker feature, our model advances baselines on six NER benchmarks, and obtains a 4.1%-4.3% strict relation F1 improvement with higher speed over previous state-of-the-art models on ACE04 and ACE05.
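To make the packing idea concrete, the sketch below (a simplified, illustrative implementation, not the authors' released code; the function name and the exact masking scheme are assumptions) shows how a batch of levitated marker pairs can be appended to one text sequence: each marker pair reuses the position ids of its span's boundary tokens, and a directional attention mask lets markers attend to the text and to their partner marker while text tokens remain blind to the markers.

```python
def pack_levitated_markers(num_tokens, spans):
    """Build position ids and an attention mask for packed levitated markers.

    num_tokens: length of the text token sequence (after tokenization).
    spans: list of (start, end) token indices for candidate spans, end inclusive.
    Returns (position_ids, attention_mask) as plain Python lists, where
    attention_mask[i][j] == 1 means token i may attend to token j.
    """
    total = num_tokens + 2 * len(spans)

    # Text tokens keep their own positions; each marker pair shares the
    # position ids of its span's start and end tokens ("levitated").
    position_ids = list(range(num_tokens))
    for start, end in spans:
        position_ids.extend([start, end])

    attention_mask = [[0] * total for _ in range(total)]
    for i in range(total):
        for j in range(num_tokens):
            attention_mask[i][j] = 1  # every token can see the raw text
    for k in range(len(spans)):
        s = num_tokens + 2 * k
        for i in (s, s + 1):
            for j in (s, s + 1):
                attention_mask[i][j] = 1  # a marker pair can see itself

    return position_ids, attention_mask
```

Because the markers are appended rather than inserted in-line, many candidate spans can share a single encoder pass, which is what allows the interrelation between packed spans (or same-subject span pairs) to be modeled jointly.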
| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Relation Extraction | ACE 2005 | NER Micro F1 | 91.1 | PL-Marker |
| Relation Extraction | ACE 2005 | RE Micro F1 | 73.0 | PL-Marker |
| Relation Extraction | ACE 2005 | RE+ Micro F1 | 71.1 | PL-Marker |
| Relation Extraction | ACE 2004 | NER Micro F1 | 90.4 | PL-Marker |
| Relation Extraction | ACE 2004 | RE Micro F1 | 69.7 | PL-Marker |
| Relation Extraction | ACE 2004 | RE+ Micro F1 | 66.5 | PL-Marker |
| Relation Extraction | SciERC | Entity F1 | 69.9 | PL-Marker |
| Relation Extraction | SciERC | RE+ Micro F1 | 41.6 | PL-Marker |
| Relation Extraction | SciERC | Relation F1 | 53.2 | PL-Marker |
| Information Extraction | SciERC | Entity F1 | 69.9 | PL-Marker |
| Information Extraction | SciERC | RE+ Micro F1 | 41.6 | PL-Marker |
| Information Extraction | SciERC | Relation F1 | 53.2 | PL-Marker |
| Named Entity Recognition (NER) | Ontonotes v5 (English) | F1 | 91.9 | PL-Marker |
| Named Entity Recognition (NER) | Ontonotes v5 (English) | Precision | 92.0 | PL-Marker |
| Named Entity Recognition (NER) | Ontonotes v5 (English) | Recall | 91.7 | PL-Marker |
| Named Entity Recognition (NER) | Few-NERD (SUP) | F1-Measure | 70.9 | PL-Marker |
| Named Entity Recognition (NER) | Few-NERD (SUP) | Precision | 71.2 | PL-Marker |
| Named Entity Recognition (NER) | Few-NERD (SUP) | Recall | 70.6 | PL-Marker |
| Named Entity Recognition (NER) | CoNLL 2003 (English) | F1 | 94.0 | PL-Marker |