Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Neural Relation Extraction via Inner-Sentence Noise Reduction and Transfer Learning

Tianyi Liu, Xinsong Zhang, Wanhao Zhou, Weijia Jia

2018-08-21 · EMNLP 2018
Tasks: Relation Extraction, Relationship Extraction (Distant Supervised), Knowledge Base Completion, Transfer Learning

Abstract

Extracting relations is critical for knowledge base completion and construction, where distantly supervised methods are widely used to extract relational facts automatically from existing knowledge bases. However, the automatically constructed datasets contain many low-quality sentences with noisy words, a problem neglected by current distantly supervised methods that leads to unacceptably low precision. To mitigate this problem, we propose a novel word-level distantly supervised approach for relation extraction. We first build a Sub-Tree Parse (STP) to remove noisy words that are irrelevant to relations. Then we construct a neural network that takes the sub-tree as input and applies entity-wise attention to identify the important semantic features of relational words in each instance. To make our model more robust against noisy words, we initialize our network with prior knowledge learned from the related task of entity classification via transfer learning. We conduct extensive experiments on the New York Times (NYT) and Freebase corpora. Experiments show that our approach is effective and improves the area under the Precision/Recall (PR) curve from 0.35 to 0.39 over the state-of-the-art.
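The Sub-Tree Parse step described above can be illustrated with a small sketch: given a dependency parse, keep only the tokens in the sub-tree that spans both entity mentions and discard the rest of the sentence as potential noise. This is a minimal illustration under our own assumptions (a head-index encoding of the parse, entities as single token indices); the helper names are ours, not the authors' implementation.

```python
# Sketch of STP-style noise reduction: prune tokens outside the
# dependency sub-tree covering both entities. The head-index parse
# encoding and function names are illustrative assumptions.

def path_to_root(idx, heads):
    """Token indices from `idx` up to the root (head == -1 marks root)."""
    path = [idx]
    while heads[idx] != -1:
        idx = heads[idx]
        path.append(idx)
    return path

def subtree_tokens(root, heads):
    """All token indices in the sub-tree rooted at `root`."""
    keep = {root}
    changed = True
    while changed:
        changed = False
        for i, h in enumerate(heads):
            if h in keep and i not in keep:
                keep.add(i)
                changed = True
    return sorted(keep)

def sub_tree_parse(tokens, heads, e1, e2):
    """Keep only tokens inside the sub-tree spanning entities e1 and e2."""
    p1, p2 = path_to_root(e1, heads), set(path_to_root(e2, heads))
    lca = next(i for i in p1 if i in p2)  # lowest common ancestor
    return [tokens[i] for i in subtree_tokens(lca, heads)]

# Toy parse of "Reports say Obama was born in Hawaii";
# "say" is the root, entities are "Obama" (2) and "Hawaii" (6).
tokens = ["Reports", "say", "Obama", "was", "born", "in", "Hawaii"]
heads = [1, -1, 4, 4, 1, 4, 5]
print(sub_tree_parse(tokens, heads, 2, 6))
# → ['Obama', 'was', 'born', 'in', 'Hawaii'] ("Reports say" is pruned)
```

In practice, a real dependency parser (e.g. spaCy or Stanford CoreNLP) would supply the heads, and the pruned token sequence would then feed the BiGRU encoder with entity-wise attention.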

Results

Task | Dataset | Metric | Value | Model
Relation Extraction | New York Times Corpus | AUC | 0.39 | BiGRU+WLA+EWA
Relation Extraction | New York Times Corpus | Average Precision | 0.39 | BiGRU+WLA+EWA
Relation Extraction | New York Times Corpus | AUC | 0.39 | BGRU-SET
Relation Extraction | New York Times Corpus | Average Precision | 0.39 | BGRU-SET
Relationship Extraction (Distant Supervised) | New York Times Corpus | AUC | 0.39 | BiGRU+WLA+EWA
Relationship Extraction (Distant Supervised) | New York Times Corpus | Average Precision | 0.39 | BiGRU+WLA+EWA
Relationship Extraction (Distant Supervised) | New York Times Corpus | AUC | 0.39 | BGRU-SET
Relationship Extraction (Distant Supervised) | New York Times Corpus | Average Precision | 0.39 | BGRU-SET

Related Papers

RaMen: Multi-Strategy Multi-Modal Learning for Bundle Construction (2025-07-18)
Disentangling coincident cell events using deep transfer learning and compressive sensing (2025-07-17)
Best Practices for Large-Scale, Pixel-Wise Crop Mapping and Transfer Learning Workflows (2025-07-16)
Robust-Multi-Task Gradient Boosting (2025-07-15)
Calibrated and Robust Foundation Models for Vision-Language and Medical Image Tasks Under Distribution Shift (2025-07-12)
The Bayesian Approach to Continual Learning: An Overview (2025-07-11)
DocIE@XLLM25: In-Context Learning for Information Extraction using Fully Synthetic Demonstrations (2025-07-08)
Contrastive and Transfer Learning for Effective Audio Fingerprinting through a Real-World Evaluation Protocol (2025-07-08)