Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.

Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Progressive Self-Supervised Attention Learning for Aspect-Level Sentiment Analysis

Jialong Tang, Ziyao Lu, Jinsong Su, Yubin Ge, Linfeng Song, Le Sun, Jiebo Luo

2019-06-04 · ACL 2019 · Tasks: Sentiment Analysis · Aspect-Based Sentiment Analysis (ABSA) · Sentiment Classification

Paper · PDF · Code (official)

Abstract

In aspect-level sentiment classification (ASC), neural models are commonly equipped with attention mechanisms to capture the importance of each context word with respect to the given aspect. However, such mechanisms tend to focus excessively on a few frequent words with clear sentiment polarities while ignoring infrequent ones. In this paper, we propose a progressive self-supervised attention learning approach for neural ASC models, which automatically mines useful attention supervision information from a training corpus to refine attention mechanisms. Specifically, we iteratively conduct sentiment predictions on all training instances. At each iteration, the context word with the maximum attention weight is extracted as having an active influence (if the prediction is correct) or a misleading influence (if it is incorrect) on that instance, and the word is then masked for subsequent iterations. Finally, we augment the conventional training objective with a regularization term, which encourages ASC models to attend equally to the extracted active context words while decreasing the weights of the misleading ones. Experimental results on multiple datasets show that our proposed approach yields better attention mechanisms, leading to substantial improvements over two state-of-the-art neural ASC models. Source code and trained models are available at https://github.com/DeepLearnXMU/PSSAttention.
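The mining procedure described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: `attention_fn`, `predict_fn`, the toy model, and the exact form of `attention_regularizer` are hypothetical stand-ins (the authors' actual code is at the linked repository).

```python
import numpy as np

def mine_attention_supervision(attention_fn, predict_fn, words, label, n_iters=3):
    """Iteratively mine attention supervision for one training instance.

    Per the abstract: at each iteration, the context word with the maximum
    attention weight is recorded as "active" (model predicts correctly) or
    "misleading" (model predicts incorrectly), then masked for later iterations.
    attention_fn/predict_fn are hypothetical hooks into an ASC model.
    """
    mask = np.zeros(len(words), dtype=bool)   # True = word already masked
    active, misleading = set(), set()
    for _ in range(n_iters):
        attn = np.array(attention_fn(words, mask), dtype=float)  # copy
        attn[mask] = -np.inf                  # never re-select a masked word
        j = int(np.argmax(attn))
        if predict_fn(words, mask) == label:
            active.add(j)                     # drove a correct prediction
        else:
            misleading.add(j)                 # drove an incorrect prediction
        mask[j] = True
    return active, misleading

def attention_regularizer(attn, active, misleading, gamma=0.1):
    """Regularization term (sketch, not the paper's exact formula):
    reward attention mass on active words, penalize it on misleading ones."""
    loss = 0.0
    for j in active:
        loss -= np.log(attn[j] + 1e-9)        # push weight up
    for j in misleading:
        loss -= np.log(1.0 - attn[j] + 1e-9)  # push weight down
    return gamma * loss

# Toy demo with a hand-built "model" (not the paper's TNet/MN models):
words = ["the", "food", "was", "awful", "but", "cheap"]
base = np.array([0.05, 0.10, 0.05, 0.50, 0.10, 0.20])

def toy_attn(ws, mask):
    return base                               # fixed attention distribution

def toy_pred(ws, mask):
    return "neg" if not mask[3] else "pos"    # "awful" carries the sentiment

active, misleading = mine_attention_supervision(toy_attn, toy_pred, words, "neg")
# "awful" is mined as active; words selected after it is masked flip the
# prediction and are therefore recorded as misleading.
```

In this toy run the sentiment-bearing word is mined as active on the first iteration, and once it is masked the model's prediction flips, so the later max-attention words land in the misleading set; the regularizer then raises their penalty during fine-tuning.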

Results

Task | Dataset | Metric | Value | Model
Sentiment Analysis | SemEval-2014 Task-4 | Laptop (Acc) | 77.62 | TNet-ATT(+AS)
Sentiment Analysis | SemEval-2014 Task-4 | Mean Acc (Restaurant + Laptop) | 79.58 | TNet-ATT(+AS)
Sentiment Analysis | SemEval-2014 Task-4 | Restaurant (Acc) | 81.53 | TNet-ATT(+AS)
Aspect-Based Sentiment Analysis (ABSA) | SemEval-2014 Task-4 | Laptop (Acc) | 77.62 | TNet-ATT(+AS)
Aspect-Based Sentiment Analysis (ABSA) | SemEval-2014 Task-4 | Mean Acc (Restaurant + Laptop) | 79.58 | TNet-ATT(+AS)
Aspect-Based Sentiment Analysis (ABSA) | SemEval-2014 Task-4 | Restaurant (Acc) | 81.53 | TNet-ATT(+AS)

Related Papers

AdaptiSent: Context-Aware Adaptive Attention for Multimodal Aspect-Based Sentiment Analysis (2025-07-17)
AI Wizards at CheckThat! 2025: Enhancing Transformer-Based Embeddings with Sentiment for Subjectivity Detection in News Articles (2025-07-15)
DCR: Quantifying Data Contamination in LLMs Evaluation (2025-07-15)
SentiDrop: A Multi Modal Machine Learning model for Predicting Dropout in Distance Learning (2025-07-14)
GNN-CNN: An Efficient Hybrid Model of Convolutional and Graph Neural Networks for Text Representation (2025-07-10)
FINN-GL: Generalized Mixed-Precision Extensions for FPGA-Accelerated LSTMs (2025-06-25)
Unpacking Generative AI in Education: Computational Modeling of Teacher and Student Perspectives in Social Media Discourse (2025-06-19)
Characterizing Linguistic Shifts in Croatian News via Diachronic Word Embeddings (2025-06-16)