Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Dynamic Memory Network

General · Introduced 2015 · 19 papers
Source Paper: Ask Me Anything: Dynamic Memory Networks for Natural Language Processing (2015)

Description

A Dynamic Memory Network is a neural network architecture which processes input sequences and questions, forms episodic memories, and generates relevant answers. Questions trigger an iterative attention process which allows the model to condition its attention on the inputs and the result of previous iterations. These results are then reasoned over in a hierarchical recurrent sequence model to generate answers.
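The iterative attention process can be summarized as a recurrence over episodes and memories. In the notation of the source paper (sketched here from its general formulation, so treat the operator names as schematic): each pass $t$ condenses the encoded input facts $c_{1:T}$ into an episode $e^t$ via attention conditioned on the question $q$ and the previous memory, then folds that episode into the memory with a recurrent update, starting from the question encoding:

```latex
m^0 = q, \qquad
e^t = \mathrm{AttentionGRU}\!\left(c_{1:T},\, m^{t-1},\, q\right), \qquad
m^t = \mathrm{GRU}\!\left(e^t,\, m^{t-1}\right)
```

After a fixed number of passes (or a learned stopping criterion), the final memory $m^{T}$ is handed to the answer module.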

The DMN consists of a number of modules:

  • Input Module: The input module encodes raw text inputs from the task into distributed vector representations. The input can take forms such as a sentence, a long story, or a movie review.
  • Question Module: The question module encodes the question of the task into a distributed vector representation. For question answering, the question may be a sentence such as "Where did the author first fly?". The representation is fed into the episodic memory module, and forms the basis, or initial state, upon which the episodic memory module iterates.
  • Episodic Memory Module: Given a collection of input representations, the episodic memory module chooses which parts of the inputs to focus on through the attention mechanism. It then produces a "memory" vector representation, taking into account the question as well as the previous memory. Each iteration provides the module with newly relevant information about the input. In other words, the module can retrieve new information, in the form of input representations, that was thought to be irrelevant in previous iterations.
  • Answer Module: The answer module generates an answer from the final memory vector of the memory module.
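The interaction between the question and episodic memory modules can be sketched in a few lines of NumPy. This is a deliberately simplified illustration, not the paper's architecture: dot-product attention against the question and current memory stands in for the learned gating network, and a plain interpolation stands in for the GRU memory update.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array of scores."""
    e = np.exp(x - x.max())
    return e / e.sum()

def episodic_memory(facts, question, n_hops=3):
    """Simplified sketch of the DMN episodic memory module.

    facts:    (n_facts, d) array of encoded input sentences
    question: (d,) encoded question vector

    Assumptions (hypothetical simplifications, not the paper's method):
    similarity-based attention replaces the learned gating network,
    and an average replaces the GRU memory update.
    """
    memory = question.copy()  # initial memory is the question encoding
    for _ in range(n_hops):
        # Attention score per fact, conditioned on question AND memory,
        # so later hops can focus on facts ignored in earlier hops.
        scores = facts @ question + facts @ memory
        gates = softmax(scores)
        episode = gates @ facts            # attention-weighted summary
        memory = 0.5 * memory + 0.5 * episode  # stand-in for GRU update
    return memory
```

An answer module would then decode this final memory vector (e.g. with a softmax classifier over the vocabulary); the multiple hops are what let the model chain evidence across sentences before answering.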

Papers Using This Method

  • Pattern-Matching Dynamic Memory Network for Dual-Mode Traffic Prediction (2024-08-12)
  • LVOS: A Benchmark for Long-term Video Object Segmentation (2022-11-18)
  • Learning Quality-aware Dynamic Memory for Video Object Segmentation (2022-07-16)
  • Memory-Based Semantic Parsing (2021-09-07)
  • Video Object Segmentation With Dynamic Memory Networks and Adaptive Object Alignment (2021-01-01)
  • Dual Dynamic Memory Network for End-to-End Multi-turn Task-oriented Dialog Systems (2020-12-01)
  • Contextualize Knowledge Bases with Transformer for End-to-end Task-Oriented Dialogue Systems (2020-10-12)
  • Doctor2Vec: Dynamic Doctor Representation Learning for Clinical Trial Recruitment (2019-11-23)
  • Meta-Learning with Dynamic-Memory-Based Prototypical Network for Few-Shot Event Detection (2019-10-25)
  • Entropy-Enhanced Multimodal Attention Model for Scene-Aware Dialogue Generation (2019-08-22)
  • Visual Tracking via Dynamic Memory Networks (2019-07-12)
  • Exploiting Contextual Information via Dynamic Memory Network for Event Detection (2018-10-03)
  • Relational dynamic memory networks (2018-08-10)
  • Motion-Appearance Co-Memory Networks for Video Question Answering (2018-03-29)
  • Learning Dynamic Memory Networks for Object Tracking (2018-03-20)
  • Integrating Order Information and Event Relation for Script Event Prediction (2017-09-01)
  • Ask Me Even More: Dynamic Memory Tensor Networks (Extended Model) (2017-03-11)
  • Dynamic Memory Networks for Visual and Textual Question Answering (2016-03-04)
  • Ask Me Anything: Dynamic Memory Networks for Natural Language Processing (2015-06-24)