Fine-tuning Handwriting Recognition systems with Temporal Dropout
Edgard Chammas, Chafic Mokbel
2021-01-31 · Handwriting Recognition
Abstract
This paper introduces a novel method for fine-tuning handwriting recognition systems based on Recurrent Neural Networks (RNN). Long Short-Term Memory (LSTM) networks model long sequences well, but they tend to overfit during training. To improve the system's ability to model sequences, we propose dropping information at random positions in the sequence. We call our approach Temporal Dropout (TD). We apply TD both at the image level and to internal network representations. We show that TD improves results on two different datasets, and our method outperforms the previous state of the art on the Rodrigo dataset.
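The abstract does not spell out the exact masking procedure, but the core idea of dropping information at random positions in a sequence can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes TD zeroes out entire time steps independently with some probability, with inverted-dropout rescaling of the surviving steps (the function name `temporal_dropout` and its parameters are assumptions for illustration).

```python
import numpy as np

def temporal_dropout(x, drop_prob=0.2, rng=None):
    """Zero out entire time steps of a sequence at random (illustrative sketch).

    x: array of shape (T, F) -- T time steps, F features per step.
    drop_prob: probability that any given time step is dropped.
    Surviving steps are rescaled by 1 / (1 - drop_prob), mirroring
    standard inverted dropout so activations keep the same expected value.
    """
    rng = rng or np.random.default_rng()
    # One keep/drop decision per time step, applied across all features.
    keep = rng.random(x.shape[0]) >= drop_prob
    scale = 1.0 / (1.0 - drop_prob)
    return x * keep[:, None] * scale

# Example: a 10-step sequence with 4 features per step.
seq = np.ones((10, 4))
out = temporal_dropout(seq, drop_prob=0.3, rng=np.random.default_rng(0))
```

The same masking could be applied to a text-line image (dropping pixel columns) or to the sequence of internal feature vectors fed to the LSTM, matching the two levels at which the paper applies TD.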
Related Papers
- A Transformer Based Handwriting Recognition System Jointly Using Online and Offline Features (2025-06-25)
- Creating a Historical Migration Dataset from Finnish Church Records, 1800-1920 (2025-06-09)
- DeepFRC: An End-to-End Deep Learning Model for Functional Registration and Classification (2025-01-30)
- Making History Readable (2024-11-26)
- Learning based Ge'ez character handwritten recognition (2024-11-20)
- Handwriting Recognition in Historical Documents with Multimodal LLM (2024-10-31)
- MIRAGE: Multimodal Identification and Recognition of Annotations in Indian General Prescriptions (2024-10-13)
- Boosting CNN-based Handwriting Recognition Systems with Learnable Relaxation Labeling (2024-09-09)