Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Modeling Label Correlations for Ultra-Fine Entity Typing with Neural Pairwise Conditional Random Field

Chengyue Jiang, Yong Jiang, Weiqi Wu, Pengjun Xie, Kewei Tu

2022-12-03 · Vocal Bursts Type Prediction · Entity Typing · Variational Inference

Paper · PDF · Code (official)

Abstract

Ultra-fine entity typing (UFET) aims to predict a wide range of type phrases that correctly describe the categories of a given entity mention in a sentence. Most recent works infer each entity type independently, ignoring the correlations between types, e.g., when an entity is inferred as a president, it should also be a politician and a leader. To this end, we use an undirected graphical model called a pairwise conditional random field (PCRF) to formulate the UFET problem, in which the type variables are not only unarily influenced by the input but also pairwise related to all the other type variables. We use various modern backbones for entity typing to compute unary potentials, and derive pairwise potentials from type phrase representations that both capture prior semantic information and facilitate accelerated inference. We use mean-field variational inference for efficient type inference on very large type sets and unfold it as a neural network module to enable end-to-end training. Experiments on UFET show that Neural-PCRF consistently outperforms its backbones at little extra cost and achieves competitive performance against the cross-encoder based SOTA while being thousands of times faster. We also find Neural-PCRF effective on a widely used fine-grained entity typing dataset with a smaller type set. We pack Neural-PCRF as a network module that can be plugged onto multi-label type classifiers with ease and release it at https://github.com/modelscope/adaseq/tree/master/examples/NPCRF.
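The mean-field inference the abstract describes can be sketched in a few lines: each type's marginal is repeatedly re-estimated from its unary potential plus pairwise messages from all other types' current marginals, and unrolling these updates for a fixed number of steps yields a differentiable module. The function name, the toy potentials, and the three example types below are illustrative assumptions, not the paper's actual parameterization:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mean_field_type_inference(unary, pairwise, n_iters=4):
    """Mean-field updates for a binary pairwise CRF over entity types.

    unary:    shape (T,), per-type unary potentials (logits) from a backbone
    pairwise: shape (T, T), symmetric potentials; pairwise[i, j] > 0
              encourages types i and j to be predicted together
    Returns approximate marginal probabilities q of shape (T,).
    """
    q = sigmoid(unary)                          # initialize from unaries alone
    W = pairwise - np.diag(np.diag(pairwise))   # drop self-interactions
    for _ in range(n_iters):
        # each type receives pairwise messages weighted by the
        # other types' current marginals, then re-normalizes
        q = sigmoid(unary + W @ q)
    return q

# hypothetical toy type set: "president" (0), "politician" (1), "leader" (2);
# only "president" gets a confident unary score, correlations lift the rest
unary = np.array([2.0, -0.5, -0.5])
W = np.zeros((3, 3))
W[0, 1] = W[1, 0] = 3.0
W[0, 2] = W[2, 0] = 3.0
q = mean_field_type_inference(unary, W)
```

With these toy potentials, "politician" and "leader" start below the 0.5 threshold on unaries alone but are pulled above it by their correlation with "president", which is exactly the behavior the abstract motivates. In the paper this fixed-point loop is unrolled as network layers so gradients flow through the updates during training.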

Results

Task           Dataset      Metric  Value  Model
Entity Typing  Open Entity  F1      50.1   Prompt + NPCRF (replicated by Adaseq)
Entity Typing  Open Entity  F1      49.3   Prompt Learning (replicated by Adaseq)
Entity Typing  Open Entity  F1      47.3   RoBERTa-Large + NPCRF (replicated by Adaseq)
Entity Typing  Open Entity  F1      43.8   RoBERTa-Large (replicated by Adaseq)

Related Papers

Interpretable Bayesian Tensor Network Kernel Machines with Automatic Rank and Feature Selection (2025-07-15)
Scalable Bayesian Low-Rank Adaptation of Large Language Models via Stochastic Variational Subspace Inference (2025-06-26)
VHU-Net: Variational Hadamard U-Net for Body MRI Bias Field Correction (2025-06-23)
Branching Stein Variational Gradient Descent for sampling multimodal distributions (2025-06-16)
Robust Recursive Fusion of Multiresolution Multispectral Images with Location-Aware Neural Networks (2025-06-16)
Variational Inference with Mixtures of Isotropic Gaussians (2025-06-16)
Robust Filtering -- Novel Statistical Learning and Inference Algorithms with Applications (2025-06-13)
Bayesian Probabilistic Matrix Factorization (2025-06-11)