Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.

Handwritten Mathematical Expression Recognition via Attention Aggregation based Bi-directional Mutual Learning

Xiaohang Bian, Bo Qin, Xiaozhe Xin, Jianwu Li, Xuefeng Su, Yanfeng Wang

2021-12-07 · Handwritten Mathematical Expression Recognition · Data Augmentation · Transfer Learning
Paper · PDF · Code (official)

Abstract

Handwritten mathematical expression recognition aims to automatically generate LaTeX sequences from given images. Currently, attention-based encoder-decoder models are widely used in this task. They typically generate target sequences in a left-to-right (L2R) manner, leaving the right-to-left (R2L) contexts unexploited. In this paper, we propose an Attention aggregation based Bi-directional Mutual learning Network (ABM), which consists of one shared encoder and two parallel inverse decoders (L2R and R2L). The two decoders are enhanced via mutual distillation, which involves one-to-one knowledge transfer at each training step, making full use of the complementary information from the two inverse directions. Moreover, to deal with mathematical symbols at diverse scales, an Attention Aggregation Module (AAM) is proposed to effectively integrate multi-scale coverage attentions. Notably, in the inference phase, given that the model has already learned knowledge from both directions, we use only the L2R branch for inference, keeping the original parameter size and inference speed. Extensive experiments demonstrate that our approach achieves recognition accuracies of 56.85% on CROHME 2014, 52.92% on CROHME 2016, and 53.96% on CROHME 2019 without data augmentation or model ensembling, substantially outperforming state-of-the-art methods. The source code is available at https://github.com/XH-B/ABM.
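The mutual distillation idea in the abstract can be illustrated with a small sketch: each decoder produces a softmax distribution per decoding step, the R2L decoder's sequence is reversed so step t of one decoder aligns with step T-1-t of the other, and a symmetric KL term transfers knowledge in both directions. This is a minimal illustration under stated assumptions (NumPy, a symmetric step-wise KL, uniform step averaging), not the authors' exact loss; the function names are hypothetical.

```python
import numpy as np

def kl_div(p, q, eps=1e-12):
    """KL(p || q) between two discrete probability distributions."""
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

def mutual_distillation_loss(probs_l2r, probs_r2l):
    """Symmetric step-wise KL between the two decoders' predictions.

    probs_l2r, probs_r2l: (T, V) arrays, one softmax distribution
    over the V-symbol vocabulary per decoding step. The R2L sequence
    is reversed so that step t of L2R aligns with step T-1-t of R2L
    (hypothetical alignment; the paper pairs tokens one-to-one).
    """
    aligned_r2l = probs_r2l[::-1]
    T = probs_l2r.shape[0]
    loss = 0.0
    for t in range(T):
        loss += kl_div(aligned_r2l[t], probs_l2r[t])  # R2L teaches L2R
        loss += kl_div(probs_l2r[t], aligned_r2l[t])  # L2R teaches R2L
    return loss / T
```

When the two decoders agree at every aligned step the loss is zero, and it grows as their distributions diverge; at inference time this term is simply dropped and only the L2R branch is run, which is why the parameter count and speed at test time are unchanged.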

Results

Task                                             Dataset      Metric   Value  Model
Handwritten Mathematical Expression Recognition  CROHME 2014  ExpRate  56.85  ABM
Handwritten Mathematical Expression Recognition  CROHME 2016  ExpRate  52.92  ABM
Handwritten Mathematical Expression Recognition  CROHME 2019  ExpRate  53.96  ABM
Handwritten Mathematical Expression Recognition  HME100K      ExpRate  65.93  ABM

Related Papers

RaMen: Multi-Strategy Multi-Modal Learning for Bundle Construction (2025-07-18)
Overview of the TalentCLEF 2025: Skill and Job Title Intelligence for Human Capital Management (2025-07-17)
Pixel Perfect MegaMed: A Megapixel-Scale Vision-Language Foundation Model for Generating High Resolution Medical Images (2025-07-17)
Disentangling coincident cell events using deep transfer learning and compressive sensing (2025-07-17)
Similarity-Guided Diffusion for Contrastive Sequential Recommendation (2025-07-16)
Best Practices for Large-Scale, Pixel-Wise Crop Mapping and Transfer Learning Workflows (2025-07-16)
Data Augmentation in Time Series Forecasting through Inverted Framework (2025-07-15)
Robust-Multi-Task Gradient Boosting (2025-07-15)