
Efficient Second-Order TreeCRF for Neural Dependency Parsing

Yu Zhang, Zhenghua Li, Min Zhang

2020-05-03 · ACL 2020 · Dependency Parsing
Paper · PDF · Code (official)

Abstract

In the deep learning (DL) era, parsing models have been greatly simplified with little loss in performance, thanks to the remarkable capability of multi-layer BiLSTMs in context representation. As the most popular graph-based dependency parser due to its high efficiency and performance, the biaffine parser directly scores single dependencies under the arc-factorization assumption and adopts a very simple local token-wise cross-entropy training loss. This paper presents, for the first time, a second-order TreeCRF extension to the biaffine parser. For a long time, the complexity and inefficiency of the inside-outside algorithm have hindered the popularity of TreeCRF. To address this issue, we propose an effective way to batchify the inside and Viterbi algorithms for direct large matrix operations on GPUs, and to avoid the complex outside algorithm via efficient back-propagation. Experiments and analysis on 27 datasets from 13 languages clearly show that techniques developed before the DL era, such as structural learning (a global TreeCRF loss) and high-order modeling, are still useful and can further boost parsing performance over the state-of-the-art biaffine parser, especially for partially annotated training data. We release our code at https://github.com/yzhangcs/crfpar.
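To make the two key ideas in the abstract concrete, the sketch below shows a first-order Eisner inside algorithm in the log semiring, batched over sentences, with the TreeCRF partition function obtained on GPU tensors and the outside pass replaced by back-propagation (the gradient of log Z with respect to the arc scores gives the arc marginals). This is an illustrative simplification, not the authors' implementation: it is first-order rather than second-order, loops over span widths and starts in Python for readability (the paper additionally batches all same-width spans into single matrix operations), and the `s_arc`, `inside_log_partition`, and `gold_score` names are assumptions. The official code is at https://github.com/yzhangcs/crfpar.

```python
import torch

def inside_log_partition(s_arc):
    """First-order Eisner inside algorithm in the log semiring.

    s_arc: [batch, n, n] tensor, s_arc[b, h, m] = score of arc h -> m,
           with position 0 as the pseudo-root. Returns log Z of shape [batch].
    Simplified sketch: multi-root variant, equal-length sentences, Python loops
    over widths/starts; the paper vectorizes far more aggressively.
    """
    batch, n, _ = s_arc.shape
    neg_inf = torch.full((batch,), float('-inf'), device=s_arc.device)
    # complete (C) and incomplete (I) span charts, each cell a [batch] tensor
    C = [[neg_inf.clone() for _ in range(n)] for _ in range(n)]
    I = [[neg_inf.clone() for _ in range(n)] for _ in range(n)]
    for i in range(n):
        C[i][i] = torch.zeros(batch, device=s_arc.device)

    for w in range(1, n):                 # span width
        for i in range(n - w):            # span start
            j = i + w
            # incomplete spans: join two complete spans meeting at split r
            both = torch.logsumexp(
                torch.stack([C[i][r] + C[j][r + 1] for r in range(i, j)], -1), -1)
            I[i][j] = both + s_arc[:, i, j]            # right arc i -> j
            I[j][i] = both + s_arc[:, j, i]            # left arc  j -> i
            # complete spans: attach a complete span to an incomplete one
            C[j][i] = torch.logsumexp(
                torch.stack([C[r][i] + I[j][r] for r in range(i, j)], -1), -1)
            C[i][j] = torch.logsumexp(
                torch.stack([I[i][r] + C[r][j] for r in range(i + 1, j + 1)], -1), -1)

    return C[0][n - 1]                                 # log Z per sentence


# Global TreeCRF loss and marginals without an explicit outside algorithm.
s_arc = torch.randn(2, 6, 6, requires_grad=True)       # toy batch: root + 5 words
log_z = inside_log_partition(s_arc)
gold_score = torch.zeros(2)                            # placeholder for gold-tree scores
loss = (log_z - gold_score).mean()                     # structural (global TreeCRF) loss
marginals, = torch.autograd.grad(log_z.sum(), s_arc)   # outside replaced by back-propagation
```

The last line is the point of the "avoid the outside algorithm" claim: because the inside pass is built from differentiable tensor operations, automatic differentiation of log Z yields exactly the quantities the outside algorithm would compute.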

Results

Task                 Dataset         Metric  Value   Model
Dependency Parsing   CoNLL-2009      LAS     86.52   CRFPar
Dependency Parsing   CoNLL-2009      UAS     89.63   CRFPar
Dependency Parsing   Penn Treebank   LAS     94.49   CRFPar
Dependency Parsing   Penn Treebank   UAS     96.14   CRFPar
Dependency Parsing   NLPCC-2019      LAS     72.33   CRFPar
Dependency Parsing   NLPCC-2019      UAS     78.02   CRFPar

Related Papers

Step-by-step Instructions and a Simple Tabular Output Format Improve the Dependency Parsing Accuracy of LLMs (2025-06-11)
UD-KSL Treebank v1.3: A semi-automated framework for aligning XPOS-extracted units with UPOS tags (2025-06-10)
LKD-KGC: Domain-Specific KG Construction via LLM-driven Knowledge Dependency Parsing (2025-05-30)
Dependency Parsing is More Parameter-Efficient with Normalization (2025-05-26)
FiLLM -- A Filipino-optimized Large Language Model based on Southeast Asia Large Language Model (SEALLM) (2025-05-25)
CrosGrpsABS: Cross-Attention over Syntactic and Semantic Graphs for Aspect-Based Sentiment Analysis in a Low-Resource Language (2025-05-25)
Semantic-based Unsupervised Framing Analysis (SUFA): A Novel Approach for Computational Framing Analysis (2025-05-21)
Hierarchical Bracketing Encodings for Dependency Parsing as Tagging (2025-05-16)