Xinyu Wang, Kewei Tu
In this paper, we propose second-order graph-based neural dependency parsing using message passing and end-to-end neural networks. We empirically show that our approaches match the accuracy of very recent state-of-the-art second-order graph-based neural dependency parsers while being significantly faster in both training and testing. We also empirically demonstrate the advantage of second-order parsing over first-order parsing, and observe that the usefulness of the head-selection structured constraint vanishes when BERT embeddings are used.
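To make the message-passing idea concrete, here is a minimal sketch of mean-field variational inference (MFVI) for second-order head selection. It assumes a unary score tensor over (dependent, head) pairs and a hypothetical second-order "sibling/co-parent" tensor `sibling[i, j, h]` scoring words `i` and `j` sharing head `h`; the tensor layout and iteration count are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def mfvi_head_selection(unary, sibling, iterations=3):
    """Mean-field variational inference for second-order head selection.

    unary:   (n, n) scores, unary[i, h] = score of word i taking head h
    sibling: (n, n, n) scores, sibling[i, j, h] = score of words i and j
             both attaching to head h (hypothetical layout for illustration)
    Returns q: (n, n), approximate marginals over each word's head.
    """
    q = softmax(unary, axis=-1)  # initialize from first-order scores
    for _ in range(iterations):
        # Message from every other word j: expected sibling score under q_j.
        # (A full implementation would mask out j == i and the root token.)
        pairwise = np.einsum('jh,ijh->ih', q, sibling)
        q = softmax(unary + pairwise, axis=-1)
    return q
```

Because each iteration is a batched tensor contraction followed by a softmax, the whole inference loop runs on the GPU and can be unrolled inside the network for end-to-end training, which is the source of the speed advantage the abstract claims.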
| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Dependency Parsing | Chinese Treebank | LAS | 91.69 | MFVI |
| Dependency Parsing | Chinese Treebank | UAS | 92.78 | MFVI |
| Dependency Parsing | Penn Treebank | LAS | 95.34 | MFVI |
| Dependency Parsing | Penn Treebank | UAS | 96.91 | MFVI |