Bailin Wang, Wei Lu, Yu Wang, Hongxia Jin
Entity mentions commonly contain other mentions recursively. This paper introduces a scalable transition-based method for modeling the nested structure of mentions. We first map a sentence with nested mentions to a designated forest in which each mention corresponds to a constituent of the forest. Our shift-reduce system then learns to construct the forest bottom-up through an action sequence whose maximal length is guaranteed to be three times the sentence length. A Stack-LSTM is employed to efficiently and effectively represent the system's states in a continuous space, and the system is further augmented with a character-based component to capture letter-level patterns. Our model achieves state-of-the-art results on the ACE datasets, demonstrating its effectiveness in detecting nested mentions.
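To illustrate the bottom-up construction the abstract describes, here is a minimal sketch of replaying a shift-reduce action sequence to recover nested mention spans. The action inventory (SHIFT / UNARY-X / REDUCE-X) and the replay function are illustrative assumptions for exposition, not the authors' released code.

```python
def apply_actions(tokens, actions):
    """Replay a transition sequence and collect labeled mention spans.

    Hypothetical action inventory (assumed for illustration):
      SHIFT      -- move the next buffer token onto the stack
      UNARY-X    -- label the top stack item as a mention of type X
      REDUCE-X   -- merge the top two stack items; X labels the merged
                    span as a mention (REDUCE-NONE merges structurally only)
    """
    stack = []      # each item: (start, end) token span, end exclusive
    mentions = []   # collected (start, end, label) triples
    i = 0           # buffer pointer into tokens
    for act in actions:
        if act == "SHIFT":
            stack.append((i, i + 1))
            i += 1
        elif act.startswith("UNARY-"):
            start, end = stack[-1]
            mentions.append((start, end, act[len("UNARY-"):]))
        elif act.startswith("REDUCE-"):
            s2 = stack.pop()
            s1 = stack.pop()
            merged = (s1[0], s2[1])
            stack.append(merged)
            label = act[len("REDUCE-"):]
            if label != "NONE":
                mentions.append((merged[0], merged[1], label))
    return mentions

# Nested example: "UN" (ORG) nested inside "UN officials" (PER).
tokens = ["UN", "officials"]
acts = ["SHIFT", "UNARY-ORG", "SHIFT", "REDUCE-PER"]
print(apply_actions(tokens, acts))
# -> [(0, 1, 'ORG'), (0, 2, 'PER')]
```

Since each of the n tokens is shifted exactly once and every other action either labels or shrinks the stack, the total action count stays linear in the sentence length, consistent with the 3n bound the abstract states.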
| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Named Entity Recognition (NER) | ACE 2004 | F1 | 73.3 | Neural transition-based model |
| Named Entity Recognition (NER) | ACE 2005 | F1 | 73.0 | Neural transition-based model |
| Named Entity Recognition (NER) | GENIA | F1 | 73.9 | Neural transition-based model |
| Named Entity Recognition (NER) | NNE | Micro F1 | 73.6 | Neural Transition-based Model |
| Nested Mention Recognition | ACE 2005 | F1 | 73.0 | Neural transition-based model |
| Nested Mention Recognition | ACE 2004 | F1 | 73.1 | Neural transition-based model |