Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


You Only Train Once: A Unified Framework for Both Full-Reference and No-Reference Image Quality Assessment

Yi Ke Yun, Weisi Lin

2023-10-14 · Image Quality Assessment · No-Reference Image Quality Assessment

Paper · PDF · Code (official)

Abstract

Although recent efforts in image quality assessment (IQA) have achieved promising performance, there still exists a considerable gap compared to the human visual system (HVS). One significant disparity lies in humans' seamless transition between full reference (FR) and no reference (NR) tasks, whereas existing models are constrained to either FR or NR tasks. This disparity implies the necessity of designing two distinct systems, thereby greatly diminishing the model's versatility. Therefore, our focus lies in unifying FR and NR IQA under a single framework. Specifically, we first employ an encoder to extract multi-level features from input images. Then a Hierarchical Attention (HA) module is proposed as a universal adapter for both FR and NR inputs to model the spatial distortion at each encoder stage. Furthermore, considering that different distortions contaminate encoder stages and damage image semantic meaning differently, a Semantic Distortion Aware (SDA) module is proposed to examine feature correlations between shallow and deep layers of the encoder. By adopting HA and SDA, the proposed network can effectively perform both FR and NR IQA. When our proposed model is independently trained on NR or FR IQA tasks, it outperforms existing models and achieves state-of-the-art performance. Moreover, when trained jointly on NR and FR IQA tasks, it further enhances the performance of NR IQA while achieving on-par performance in the state-of-the-art FR IQA. You only train once to perform both IQA tasks. Code will be released at: https://github.com/BarCodeReader/YOTO.
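The abstract describes a single network whose input adapter handles both cases: given a distorted/reference pair it behaves as an FR model, and given only a distorted image it behaves as an NR model, with a semantic-distortion cue computed from shallow-vs-deep feature correlations. The paper's actual HA and SDA modules are attention-based; the toy NumPy sketch below only illustrates the control flow of such a unified FR/NR interface, with all function names and internals hypothetical stand-ins, not the authors' implementation.

```python
import numpy as np

def encoder(img):
    """Stand-in for a multi-stage backbone: three "levels" of features."""
    return [img, np.tanh(img), np.maximum(img, 0.0)]

def hierarchical_attention(feats_dist, feats_ref=None):
    """Toy universal adapter: the FR path attends to the distorted/reference
    difference; the NR path passes the distorted features through unchanged.
    (The paper's HA module is a learned attention block, not a subtraction.)"""
    if feats_ref is None:                                      # NR path
        return feats_dist
    return [fd - fr for fd, fr in zip(feats_dist, feats_ref)]  # FR path

def semantic_distortion_aware(feats):
    """Crude stand-in for SDA: correlate shallowest and deepest stages as a
    proxy for how much the distortion damaged semantic content."""
    shallow, deep = feats[0].ravel(), feats[-1].ravel()
    return float(np.corrcoef(shallow, deep)[0, 1])

def predict_quality(dist, ref=None):
    """One network, two tasks: pass a reference for FR IQA, omit it for NR."""
    ref_feats = encoder(ref) if ref is not None else None
    feats = hierarchical_attention(encoder(dist), ref_feats)
    return semantic_distortion_aware(feats)

rng = np.random.default_rng(0)
dist_img = rng.normal(size=(16, 16))
ref_img = dist_img + 0.1 * rng.normal(size=(16, 16))

score_nr = predict_quality(dist_img)           # no-reference mode
score_fr = predict_quality(dist_img, ref_img)  # full-reference mode
```

The point of the sketch is the dispatch: one forward path serves both tasks depending on whether a reference is supplied, which is what lets the real model be trained once for both.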

Results

| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Image Quality Assessment | KonIQ-10k | PLCC | 0.938 | UNIQA |
| Image Quality Assessment | KonIQ-10k | SRCC | 0.926 | UNIQA |
| Image Quality Assessment | LIVE | PLCC | 0.986 | UNIQA |
| Image Quality Assessment | LIVE | SRCC | 0.986 | UNIQA |
| Image Quality Assessment | KADID-10k | PLCC | 0.946 | UNIQA |
| Image Quality Assessment | KADID-10k | SRCC | 0.944 | UNIQA |
| Image Quality Assessment | TID2013 | PLCC | 0.956 | UNIQA |
| Image Quality Assessment | TID2013 | SRCC | 0.953 | UNIQA |
| Image Quality Assessment | CSIQ | PLCC | 0.970 | UNIQA |
| Image Quality Assessment | CSIQ | SRCC | 0.964 | UNIQA |
| No-Reference Image Quality Assessment | LIVE | PLCC | 0.986 | UNIQA |
| No-Reference Image Quality Assessment | LIVE | SRCC | 0.986 | UNIQA |
| No-Reference Image Quality Assessment | KADID-10k | PLCC | 0.946 | UNIQA |
| No-Reference Image Quality Assessment | KADID-10k | SRCC | 0.944 | UNIQA |
| No-Reference Image Quality Assessment | TID2013 | PLCC | 0.956 | UNIQA |
| No-Reference Image Quality Assessment | TID2013 | SRCC | 0.953 | UNIQA |
| No-Reference Image Quality Assessment | CSIQ | PLCC | 0.970 | UNIQA |
| No-Reference Image Quality Assessment | CSIQ | SRCC | 0.964 | UNIQA |
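The metrics above are the two standard IQA correlations between a model's predicted scores and human mean opinion scores (MOS): PLCC (Pearson linear correlation coefficient) and SRCC (Spearman rank-order correlation coefficient). A minimal NumPy sketch of both, with no tie handling in the ranking:

```python
import numpy as np

def plcc(pred, mos):
    """Pearson linear correlation between predicted scores and MOS."""
    pred, mos = np.asarray(pred, float), np.asarray(mos, float)
    return float(np.corrcoef(pred, mos)[0, 1])

def srcc(pred, mos):
    """Spearman rank-order correlation: Pearson correlation of the ranks.
    (Simplified: ties would need averaged ranks, as in scipy.stats.spearmanr.)"""
    rank = lambda a: np.argsort(np.argsort(np.asarray(a, float)))
    return plcc(rank(pred), rank(mos))
```

SRCC rewards getting the *ordering* of image quality right even when the predicted scores are a nonlinear function of MOS, which is why both metrics are reported side by side.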

Related Papers

Visual-Language Model Knowledge Distillation Method for Image Quality Assessment (2025-07-21)
Language Integration in Fine-Tuning Multimodal Large Language Models for Image-Based Regression (2025-07-20)
DeQA-Doc: Adapting DeQA-Score to Document Image Quality Assessment (2025-07-17)
Text-Visual Semantic Constrained AI-Generated Image Quality Assessment (2025-07-14)
4KAgent: Agentic Any Image to 4K Super-Resolution (2025-07-09)
FundaQ-8: A Clinically-Inspired Scoring Framework for Automated Fundus Image Quality Assessment (2025-06-25)
MS-IQA: A Multi-Scale Feature Fusion Network for PET/CT Image Quality Assessment (2025-06-25)
Enhanced Dermatology Image Quality Assessment via Cross-Domain Training (2025-06-19)