Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Geometry-Aware Supertagging with Heterogeneous Dynamic Convolutions

Konstantinos Kogkalidis, Michael Moortgat

2022-03-23 · CCG Supertagging
Paper · PDF · Code (official) · Code

Abstract

The syntactic categories of categorial grammar formalisms are structured units made of smaller, indivisible primitives, bound together by the underlying grammar's category formation rules. In the trending approach of constructive supertagging, neural models are increasingly made aware of the internal category structure, which in turn enables them to more reliably predict rare and out-of-vocabulary categories, with significant implications for grammars previously deemed too complex to find practical use. In this work, we revisit constructive supertagging from a graph-theoretic perspective, and propose a framework based on heterogeneous dynamic graph convolutions aimed at exploiting the distinctive structure of a supertagger's output space. We test our approach on a number of categorial grammar datasets spanning different languages and grammar formalisms, achieving substantial improvements over previous state of the art scores. Code will be made available at https://github.com/konstantinosKokos/dynamic-graph-supertagging
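The abstract's starting point is that a categorial grammar category is not an opaque label but a tree of atomic primitives joined by the grammar's slash constructors, which is what constructive supertaggers decode piece by piece. As an illustrative sketch only (not the paper's code, and a deliberately minimal toy), the following parses a CCG category string such as `(S\NP)/NP` into its internal tree of primitives:

```python
# Illustrative sketch, not from the paper: a CCG category like "(S\NP)/NP"
# is built from atomic primitives (S, NP, ...) joined by the binary slash
# constructors "/" and "\". A constructive supertagger predicts this internal
# tree rather than treating the whole category string as a single label.

def parse_category(s):
    """Parse a CCG category string into a nested tuple (slash, left, right)."""

    def parse_operand(i):
        # Parse one operand starting at index i: either a parenthesised
        # sub-category or a run of primitive characters.
        if s[i] == '(':
            node, i = parse_expr(i + 1)
            assert s[i] == ')', "unbalanced parentheses"
            return node, i + 1
        j = i
        while j < len(s) and s[j] not in '/\\()':
            j += 1
        return s[i:j], j

    def parse_expr(i):
        # Slashes associate to the left, per the usual CCG convention,
        # so "S\NP/NP" reads as "(S\NP)/NP".
        left, i = parse_operand(i)
        while i < len(s) and s[i] in '/\\':
            slash = s[i]
            right, i = parse_operand(i + 1)
            left = (slash, left, right)  # one binary constructor node
        return left, i

    node, _ = parse_expr(0)
    return node

# A transitive-verb category decomposes into two constructor nodes
# over three primitive leaves:
print(parse_category("(S\\NP)/NP"))  # → ('/', ('\\', 'S', 'NP'), 'NP')
```

Decoding categories as such trees is what lets a constructive model emit rare or unseen categories: any well-formed combination of primitives is reachable, not just those observed in training.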

Results

Task              Dataset  Metric    Value  Model
CCG Supertagging  CCGbank  Accuracy  96.29  Heterogeneous Dynamic Convolutions

Related Papers

Something Old, Something New: Grammar-based CCG Parsing with Transformer Models (2021-09-21)
CCG Supertagging as Top-down Tree Generation (2021-02-01)
Supertagging Combinatory Categorial Grammar with Attentive Graph Convolutional Networks (2020-10-13)
Supertagging with CCG primitives (2020-07-01)
Hierarchically-Refined Label Attention Network for Sequence Labeling (2019-08-23)
Probing What Different NLP Tasks Teach Machines about Function Word Comprehension (2019-04-25)
An Empirical Investigation of Global and Local Normalization for Recurrent Neural Sequence Models Using a Continuous Relaxation to Beam Search (2019-04-15)
Language Modeling Teaches You More than Translation Does: Lessons Learned Through Auxiliary Syntactic Task Analysis (2018-11-01)