Question Directed Graph Attention Network for Numerical Reasoning over Text
Kunlong Chen, Weidi Xu, Xingyi Cheng, Zou Xiaochuan, Yuyu Zhang, Le Song, Taifeng Wang, Yuan Qi, Wei Chu
Abstract
Numerical reasoning over text, such as addition, subtraction, sorting, and counting, is a challenging machine reading comprehension task, since it requires both natural language understanding and arithmetic computation. To address this challenge, we propose a heterogeneous graph representation for the context of the passage and question needed for such reasoning, and design a question directed graph attention network to drive multi-step numerical reasoning over this context graph. Code is available at: https://github.com/emnlp2020qdgat/QDGAT
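To make the core idea concrete, below is a minimal sketch (not the authors' implementation; see the repository above for that) of one question-directed graph attention step in PyTorch. The layer names, the pooled question vector `q`, and the gating via a sigmoid of the projected question are illustrative assumptions: node states stand in for numbers and entities extracted from the passage, and the question vector modulates how attention flows over the context graph.

```python
# A hypothetical sketch of question-directed graph attention, assuming PyTorch.
# Nodes are numbers/entities from the passage; `adj` is the context graph;
# `q` is a pooled question representation that gates the node projections.
import torch
import torch.nn as nn
import torch.nn.functional as F

class QuestionDirectedGATLayer(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.w_node = nn.Linear(dim, dim, bias=False)  # node projection
        self.w_q = nn.Linear(dim, dim, bias=False)     # question projection
        self.attn = nn.Linear(2 * dim, 1, bias=False)  # pairwise edge score

    def forward(self, h: torch.Tensor, q: torch.Tensor, adj: torch.Tensor):
        # h: (n, dim) node states; q: (dim,); adj: (n, n) boolean adjacency.
        n = h.size(0)
        # Condition every node on the question before scoring edges,
        # so attention is "directed" by what the question asks about.
        hq = self.w_node(h) * torch.sigmoid(self.w_q(q))          # (n, dim)
        pairs = torch.cat([hq.unsqueeze(1).expand(n, n, -1),
                           hq.unsqueeze(0).expand(n, n, -1)], dim=-1)
        scores = self.attn(pairs).squeeze(-1)                     # (n, n)
        scores = scores.masked_fill(~adj, float('-inf'))          # graph mask
        alpha = torch.nan_to_num(F.softmax(scores, dim=-1))       # attention
        return F.relu(alpha @ hq)  # question-aware neighbor aggregation

# Usage: stack several layers to propagate information over multiple hops,
# mirroring the multi-step reasoning described in the abstract.
layer = QuestionDirectedGATLayer(dim=16)
h = torch.randn(5, 16)                      # 5 nodes (numbers/entities)
q = torch.randn(16)                         # pooled question vector
adj = torch.rand(5, 5) > 0.5                # toy heterogeneous context graph
h = layer(h, q, adj)
```

Stacking such layers lets each node aggregate information from progressively larger neighborhoods, which is one simple way to realize the multi-step reasoning the paper describes; the actual QDGAT model additionally distinguishes node and edge types in the heterogeneous graph.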
Results
| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Question Answering | DROP Test | F1 | 88.38 | QDGAT (ensemble) |