Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


NFLAT: Non-Flat-Lattice Transformer for Chinese Named Entity Recognition

Shuang Wu, Xiaoning Song, ZhenHua Feng, Xiao-Jun Wu

2022-05-12 · Named Entity Recognition (NER) · Chinese Named Entity Recognition

Paper · PDF · Code (official)

Abstract

Recently, the Flat-LAttice Transformer (FLAT) has achieved great success in Chinese Named Entity Recognition (NER). FLAT performs lexical enhancement by constructing flat lattices, which mitigates the difficulties posed by blurred word boundaries and the lack of word semantics. In FLAT, the positions of the starting and ending characters are used to connect a matching word. However, this method is likely to match more words when dealing with long texts, resulting in long input sequences. It therefore significantly increases the memory and computational costs of the self-attention module. To deal with this issue, we advocate a novel lexical enhancement method, InterFormer, that effectively reduces computational and memory costs by constructing non-flat lattices. Furthermore, with InterFormer as the backbone, we implement NFLAT for Chinese NER. NFLAT decouples lexicon fusion from context feature encoding. Compared with FLAT, it avoids unnecessary "word-character" and "word-word" attention computations. This reduces memory usage by about 50% and allows more extensive lexicons or larger batch sizes during network training. The experimental results obtained on several well-known benchmarks demonstrate the superiority of the proposed method over the state-of-the-art hybrid (character-word) models.
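The efficiency argument in the abstract can be made concrete with a small sketch. Below is a minimal, illustrative numpy implementation of the decoupling idea: characters act as queries and attend only to their matched lexicon words (cross-attention for lexicon fusion), instead of running self-attention over the concatenated character+word sequence as FLAT does. The function name `inter_attention` and all shapes are assumptions for illustration, not the authors' actual implementation (which also uses relative position encodings and multi-head attention).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def inter_attention(char_h, word_h):
    """Lexicon fusion as cross-attention (illustrative sketch).

    char_h: (n_chars, d) character representations (queries)
    word_h: (n_words, d) matched lexicon-word representations (keys/values)
    Returns fused character representations of shape (n_chars, d).
    """
    d = char_h.shape[-1]
    scores = char_h @ word_h.T / np.sqrt(d)   # (n_chars, n_words) char-word scores only
    attn = softmax(scores, axis=-1)
    return attn @ word_h                      # (n_chars, d)

rng = np.random.default_rng(0)
n_chars, n_words, d = 50, 20, 64
chars = rng.standard_normal((n_chars, d))
words = rng.standard_normal((n_words, d))

fused = inter_attention(chars, words)

# Attention-pair counts: FLAT scores every pair in the concatenated
# (chars + words) sequence; the decoupled variant scores only char-word pairs.
flat_pairs = (n_chars + n_words) ** 2   # 4900
nflat_pairs = n_chars * n_words         # 1000
```

With 50 characters and 20 matched words, the cross-attention computes 1,000 score entries versus 4,900 for self-attention over the flat lattice, which is the source of the memory savings the abstract describes; context feature encoding over the fused character sequence then proceeds separately.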

Results

| Task                           | Dataset     | Metric    | Value | Model |
|--------------------------------|-------------|-----------|-------|-------|
| Named Entity Recognition (NER) | Weibo NER   | F1        | 61.94 | NFLAT |
| Named Entity Recognition (NER) | Weibo NER   | Precision | 59.1  | NFLAT |
| Named Entity Recognition (NER) | Weibo NER   | Recall    | 63.76 | NFLAT |
| Named Entity Recognition (NER) | MSRA        | F1        | 94.55 | NFLAT |
| Named Entity Recognition (NER) | MSRA        | Precision | 94.92 | NFLAT |
| Named Entity Recognition (NER) | MSRA        | Recall    | 94.19 | NFLAT |
| Named Entity Recognition (NER) | Resume NER  | F1        | 95.58 | NFLAT |
| Named Entity Recognition (NER) | Resume NER  | Precision | 95.63 | NFLAT |
| Named Entity Recognition (NER) | Resume NER  | Recall    | 95.52 | NFLAT |
| Named Entity Recognition (NER) | OntoNotes 4 | F1        | 77.21 | NFLAT |
| Named Entity Recognition (NER) | OntoNotes 4 | Precision | 75.17 | NFLAT |
| Named Entity Recognition (NER) | OntoNotes 4 | Recall    | 79.37 | NFLAT |

Related Papers

- Flippi: End To End GenAI Assistant for E-Commerce (2025-07-08)
- Selecting and Merging: Towards Adaptable and Scalable Named Entity Recognition with Large Language Models (2025-06-28)
- Improving Named Entity Transcription with Contextual LLM-based Revision (2025-06-12)
- Better Semi-supervised Learning for Multi-domain ASR Through Incremental Retraining and Data Filtering (2025-06-05)
- Dissecting Bias in LLMs: A Mechanistic Interpretability Perspective (2025-06-05)
- Efficient Data Selection for Domain Adaptation of ASR Using Pseudo-Labels and Multi-Stage Filtering (2025-06-04)
- EL4NER: Ensemble Learning for Named Entity Recognition via Multiple Small-Parameter Large Language Models (2025-05-29)
- Label-Guided In-Context Learning for Named Entity Recognition (2025-05-29)