Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Semantic Sentence Matching with Densely-connected Recurrent and Co-attentive Information

Seonhoon Kim, Inho Kang, Nojun Kwak

2018-05-29 · Question Answering · Paraphrase Identification · Natural Language Inference

Paper · PDF

Abstract

Sentence matching is widely used in various natural language tasks such as natural language inference, paraphrase identification, and question answering. These tasks require understanding the logical and semantic relationship between two sentences, which remains challenging. Although attention mechanisms are useful for capturing the semantic relationship and properly aligning the elements of two sentences, previous attention mechanisms simply use a summation operation, which does not sufficiently retain the original features. Inspired by DenseNet, a densely connected convolutional network, we propose a densely-connected co-attentive recurrent neural network, each layer of which uses the concatenated attentive features as well as the hidden features of all preceding recurrent layers. This preserves the original and co-attentive feature information from the bottommost word embedding layer to the uppermost recurrent layer. To alleviate the ever-increasing size of the feature vectors caused by the dense concatenation operations, we also propose to apply an autoencoder after dense concatenation. We evaluate the proposed architecture on highly competitive benchmark datasets for sentence matching. Experimental results show that our architecture, which retains both recurrent and attentive features, achieves state-of-the-art performance on most of the tasks.
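The core idea in the abstract can be sketched compactly: each layer concatenates (rather than sums) its input, its recurrent hidden states, and the co-attentive features, and an autoencoder-style projection then shrinks the growing feature vector. The following is a minimal NumPy sketch of that data flow, with simple tanh linear maps standing in for the recurrent layers and the autoencoder; all function names, shapes, and weights here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def co_attention(a, b):
    """Attend each position of `a` over all positions of `b`.

    a: (len_a, d), b: (len_b, d) -- hidden states of the two sentences.
    Returns (len_a, d) attended features of `b` aligned to `a`.
    """
    scores = a @ b.T                                      # (len_a, len_b)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)         # softmax over b
    return weights @ b

def dense_coattentive_layer(x_a, x_b, w_a, w_b):
    """One densely-connected co-attentive layer (linear maps stand in for RNNs)."""
    h_a, h_b = np.tanh(x_a @ w_a), np.tanh(x_b @ w_b)     # "recurrent" features
    att_a = co_attention(h_a, h_b)                        # b-features aligned to a
    att_b = co_attention(h_b, h_a)                        # a-features aligned to b
    # Dense connection: concatenate input, hidden, and attentive features
    # instead of summing them, so earlier features are preserved.
    out_a = np.concatenate([x_a, h_a, att_a], axis=1)
    out_b = np.concatenate([x_b, h_b, att_b], axis=1)
    return out_a, out_b

def autoencoder_compress(x, w_enc):
    """Bottleneck projection to curb the ever-growing concatenated size."""
    return np.tanh(x @ w_enc)

# Toy run: two "sentences" of 5 and 6 tokens with 8-dim embeddings.
x_a = rng.normal(size=(5, 8))
x_b = rng.normal(size=(6, 8))
w_a = rng.normal(size=(8, 8))
w_b = rng.normal(size=(8, 8))

out_a, out_b = dense_coattentive_layer(x_a, x_b, w_a, w_b)
# out_a: (5, 24) = original 8 + hidden 8 + attentive 8

w_enc = rng.normal(size=(24, 8))
z_a = autoencoder_compress(out_a, w_enc)  # back down to (5, 8)
```

Stacking such layers, with the compression step between them, keeps per-layer input width bounded while every layer still sees a concatenation of all earlier feature streams.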

Results

Task | Dataset | Metric | Value | Model
Natural Language Inference | SNLI | Test Accuracy (%) | 90.1 | Densely-Connected Recurrent and Co-Attentive Network Ensemble
Natural Language Inference | SNLI | Train Accuracy (%) | 95 | Densely-Connected Recurrent and Co-Attentive Network Ensemble
Natural Language Inference | SNLI | Test Accuracy (%) | 88.9 | Densely-Connected Recurrent and Co-Attentive Network
Natural Language Inference | SNLI | Train Accuracy (%) | 93.1 | Densely-Connected Recurrent and Co-Attentive Network
Natural Language Inference | SNLI | Test Accuracy (%) | 86.5 | Densely-Connected Recurrent and Co-Attentive Network (encoder)
Natural Language Inference | SNLI | Train Accuracy (%) | 91.4 | Densely-Connected Recurrent and Co-Attentive Network (encoder)

Related Papers

From Roots to Rewards: Dynamic Tree Reasoning with RL (2025-07-17)
Enter the Mind Palace: Reasoning and Planning for Long-term Active Embodied Question Answering (2025-07-17)
Vision-and-Language Training Helps Deploy Taxonomic Knowledge but Does Not Fundamentally Alter It (2025-07-17)
City-VLM: Towards Multidomain Perception Scene Understanding via Multimodal Incomplete Learning (2025-07-17)
Describe Anything Model for Visual Question Answering on Text-rich Images (2025-07-16)
Is This Just Fantasy? Language Model Representations Reflect Human Judgments of Event Plausibility (2025-07-16)
LRCTI: A Large Language Model-Based Framework for Multi-Step Evidence Retrieval and Reasoning in Cyber Threat Intelligence Credibility Verification (2025-07-15)
Warehouse Spatial Question Answering with LLM Agent (2025-07-14)