Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


On Eliciting Syntax from Language Models via Hashing

Yiran Wang, Masao Utiyama

2024-10-05 · Constituency Grammar Induction

Paper · PDF · Code (official)

Abstract

Unsupervised parsing, also known as grammar induction, aims to infer syntactic structure from raw text. Recently, binary representations have exhibited remarkable information-preserving capabilities at both the lexical and syntactic levels. In this paper, we explore the possibility of leveraging this capability to deduce parse trees from raw text, relying solely on the grammars implicitly induced within models. To achieve this, we upgrade the bit-level CKY from zero-order to first-order to encode the lexicon and syntax in a unified binary representation space, switch training from supervised to unsupervised under the contrastive hashing framework, and introduce a novel loss function to impose stronger yet balanced alignment signals. Our model shows competitive performance on various datasets; we therefore claim that our method is effective and efficient enough to acquire high-quality parse trees from pre-trained language models at low cost.
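The CKY chart at the heart of the method can be illustrated independently of the paper's hashing machinery. The sketch below, which is not the authors' code, shows how CKY recovers the highest-scoring binary tree once every span has been assigned a score; in the paper such scores would come from agreement between binary codes, but here `score` is simply a hypothetical dictionary supplied by the caller.

```python
def cky_best_tree(score, n):
    """Return (total_score, tree) over tokens 0..n-1.

    score: dict mapping a span (i, j) with 0 <= i < j <= n to a float.
    The tree is a nested tuple of token indices, e.g. ((0, 1), 2).
    This is a generic CKY sketch, not the paper's bit-level variant.
    """
    best = {}  # (i, j) -> (best score of span, best split point)
    for i in range(n):  # length-1 spans are leaves: no split
        best[(i, i + 1)] = (score[(i, i + 1)], None)
    for length in range(2, n + 1):
        for i in range(n - length + 1):
            j = i + length
            # Choose the split maximizing the two sub-spans' scores.
            k_best = max(range(i + 1, j),
                         key=lambda k: best[(i, k)][0] + best[(k, j)][0])
            total = (score[(i, j)]
                     + best[(i, k_best)][0] + best[(k_best, j)][0])
            best[(i, j)] = (total, k_best)

    def build(i, j):
        if j - i == 1:
            return i
        k = best[(i, j)][1]
        return (build(i, k), build(k, j))

    return best[(0, n)][0], build(0, n)
```

For example, with three tokens and a score table that favors grouping the first two (`score[(0, 2)] = 2.0`, all other non-trivial spans lower), the routine returns the left-branching tree `((0, 1), 2)`. The chart runs in O(n³) time, which is what makes first-order span scoring tractable at scale.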

Results

| Task | Dataset | Metric | Value | Model |
| --- | --- | --- | --- | --- |
| Constituency Parsing | Penn Treebank | Max F1 (WSJ) | 64.1 | Hashing (Parserker 2) |
| Constituency Parsing | Penn Treebank | Mean F1 (WSJ) | 62.4 | Hashing (Parserker 2) |

Related Papers

- Improving Unsupervised Constituency Parsing via Maximizing Semantic Information (2024-10-03)
- Structural Optimization Ambiguity and Simplicity Bias in Unsupervised Neural Grammar Induction (2024-07-23)
- Generative Pretrained Structured Transformers: Unsupervised Syntactic Language Models at Scale (2024-03-13)
- Simple Hardware-Efficient PCFGs with Independent Left and Right Productions (2023-10-23)
- Ensemble Distillation for Unsupervised Constituency Parsing (2023-10-03)
- Augmenting Transformers with Recursively Composed Multi-grained Representations (2023-09-28)
- Dynamic Programming in Rank Space: Scaling Structured Inference with Low-Rank HMMs and PCFGs (2022-05-01)
- Fast-R2D2: A Pretrained Recursive Neural Network based on Pruned CKY for Grammar Induction and Text Representation (2022-03-01)