Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.

LERT: A Linguistically-motivated Pre-trained Language Model

Yiming Cui, Wanxiang Che, Shijin Wang, Ting Liu

2022-11-10 · Stock Market Prediction · Language Modelling

Paper · PDF · Code (official)

Abstract

Pre-trained language models (PLMs) have become representative foundation models in the natural language processing field. Most PLMs are trained with linguistics-agnostic pre-training tasks on the surface form of the text, such as masked language modeling (MLM). To further empower PLMs with richer linguistic features, in this paper we propose a simple but effective way to learn linguistic features for pre-trained language models. We propose LERT, a pre-trained language model trained on three types of linguistic features alongside the original MLM pre-training task, using a linguistically-informed pre-training (LIP) strategy. We carried out extensive experiments on ten Chinese NLU tasks, and the experimental results show that LERT brings significant improvements over various comparable baselines. Furthermore, we conduct analytical experiments on various linguistic aspects, and the results confirm that the design of LERT is valid and effective. Resources are available at https://github.com/ymcui/LERT
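The abstract describes a multi-task objective: the standard MLM loss combined with losses for three linguistic-feature prediction tasks under the LIP strategy. A minimal sketch of such a combined objective is below; the linear-decay weighting schedule and function names here are illustrative assumptions, not the paper's exact recipe.

```python
def lip_combined_loss(mlm_loss, linguistic_losses, step, total_steps):
    """Toy sketch of a LIP-style multi-task objective.

    Combines the primary MLM loss with auxiliary linguistic-task losses
    (e.g. one per linguistic feature type) whose shared weight decays
    linearly over pre-training, so the model focuses increasingly on MLM.
    The decay schedule is a hypothetical choice for illustration.
    """
    # Weight decays linearly from 1.0 at step 0 to 0.0 at total_steps.
    w = max(0.0, 1.0 - step / total_steps)
    return mlm_loss + w * sum(linguistic_losses)


# Early in training the three auxiliary losses contribute fully;
# late in training only the MLM loss remains.
early = lip_combined_loss(2.0, [0.3, 0.2, 0.1], step=0, total_steps=100)
late = lip_combined_loss(2.0, [0.3, 0.2, 0.1], step=100, total_steps=100)
```

In a real pre-training loop these scalars would be per-batch losses from separate prediction heads over a shared encoder; the point of the sketch is only the scheduled weighting of auxiliary tasks relative to MLM.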

Results

Task | Dataset | Metric | Value | Model
Stock Market Prediction | Astock | Accuracy | 66.36 | Chinese Lert Large (News+Factors)
Stock Market Prediction | Astock | F1-score | 66.16 | Chinese Lert Large (News+Factors)
Stock Market Prediction | Astock | Precision | 66.4 | Chinese Lert Large (News+Factors)
Stock Market Prediction | Astock | Recall | 66.69 | Chinese Lert Large (News+Factors)
Stock Market Prediction | Astock | Accuracy | 64.37 | Chinese Lert Large (News)
Stock Market Prediction | Astock | F1-score | 64.3 | Chinese Lert Large (News)
Stock Market Prediction | Astock | Precision | 64.34 | Chinese Lert Large (News)
Stock Market Prediction | Astock | Recall | 64.31 | Chinese Lert Large (News)
Stock Trend Prediction | Astock | Accuracy | 66.36 | Chinese Lert Large (News+Factors)
Stock Trend Prediction | Astock | F1-score | 66.16 | Chinese Lert Large (News+Factors)
Stock Trend Prediction | Astock | Precision | 66.4 | Chinese Lert Large (News+Factors)
Stock Trend Prediction | Astock | Recall | 66.69 | Chinese Lert Large (News+Factors)
Stock Trend Prediction | Astock | Accuracy | 64.37 | Chinese Lert Large (News)
Stock Trend Prediction | Astock | F1-score | 64.3 | Chinese Lert Large (News)
Stock Trend Prediction | Astock | Precision | 64.34 | Chinese Lert Large (News)
Stock Trend Prediction | Astock | Recall | 64.31 | Chinese Lert Large (News)

Related Papers

Visual-Language Model Knowledge Distillation Method for Image Quality Assessment (2025-07-21)
Making Language Model a Hierarchical Classifier and Generator (2025-07-17)
VisionThink: Smart and Efficient Vision Language Model via Reinforcement Learning (2025-07-17)
The Generative Energy Arena (GEA): Incorporating Energy Awareness in Large Language Model (LLM) Human Evaluations (2025-07-17)
Inverse Reinforcement Learning Meets Large Language Model Post-Training: Basics, Advances, and Opportunities (2025-07-17)
Assay2Mol: large language model-based drug design using BioAssay context (2025-07-16)
Describe Anything Model for Visual Question Answering on Text-rich Images (2025-07-16)
InstructFLIP: Exploring Unified Vision-Language Model for Face Anti-spoofing (2025-07-16)