Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Dual Student: Breaking the Limits of the Teacher in Semi-supervised Learning

Zhanghan Ke, Daoye Wang, Qiong Yan, Jimmy Ren, Rynson W. H. Lau

2019-09-03 · ICCV 2019
Tasks: Unsupervised Domain Adaptation, Semi-Supervised Image Classification

Abstract

Recently, consistency-based methods have achieved state-of-the-art results in semi-supervised learning (SSL). These methods typically involve two roles, an explicit or implicit teacher model and a student model, and penalize prediction differences under different perturbations via a consistency constraint. However, the weights of the two roles are tightly coupled, since the teacher is essentially an exponential moving average (EMA) of the student. In this work, we show that the coupled EMA teacher causes a performance bottleneck. To address this problem, we introduce Dual Student, which replaces the teacher with another student. We also define a novel concept, the stable sample, based on which a stabilization constraint is designed to make our structure trainable. We further discuss two variants of our method that yield even higher performance. Extensive experiments show that our method significantly improves classification performance on several mainstream SSL benchmarks. Specifically, it reduces the error rate of the 13-layer CNN from 16.84% to 12.39% on CIFAR-10 with 1k labels, and from 34.10% to 31.56% on CIFAR-100 with 10k labels. In addition, our method also achieves a clear improvement in domain adaptation.
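The coupling the abstract describes can be illustrated with a minimal sketch (not the authors' code): a Mean-Teacher-style EMA update, with plain floats standing in for network weights and all names hypothetical. The point is that an EMA teacher cannot move independently of its student, which is the bottleneck Dual Student removes by training two independently initialized students instead.

```python
def ema_update(teacher_w, student_w, alpha=0.99):
    """Mean-Teacher-style update: the teacher weight is an exponential
    moving average of the student weight, so the two stay tightly coupled."""
    return alpha * teacher_w + (1 - alpha) * student_w

# With an EMA teacher, the teacher inevitably tracks the student:
teacher = 0.0
for _ in range(1000):
    student = 1.0                    # student weight held fixed for illustration
    teacher = ema_update(teacher, student)
# after many steps the teacher has drifted to ~1.0, i.e. it converges toward
# the student rather than providing an independent learning signal

# Dual Student instead trains two independently initialized students; each is
# updated by its own gradient step, and a stabilization constraint (applied
# only on "stable samples") exchanges knowledge between them.
```

This is only a one-dimensional caricature of the EMA coupling argument; in the paper the same observation applies to the full weight vectors of the teacher and student networks.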

Results

Task                                   Dataset                   Metric             Value   Model
Image Classification                   CIFAR-10, 4000 Labels     Percentage error   8.89    Dual Student (600)
Image Classification                   CIFAR-10, 2000 Labels     Accuracy           89.28   Dual Student (600)
Image Classification                   SVHN, 500 Labels          Accuracy           96.04   Dual Student
Image Classification                   CIFAR-10, 1000 Labels     Accuracy           85.83   Dual Student (600)
Image Classification                   CIFAR-100, 10000 Labels   Percentage error   32.77   Dual Student (480)
Image Classification                   SVHN, 250 Labels          Accuracy           95.76   Dual Student
Semi-Supervised Image Classification   CIFAR-10, 4000 Labels     Percentage error   8.89    Dual Student (600)
Semi-Supervised Image Classification   CIFAR-10, 2000 Labels     Accuracy           89.28   Dual Student (600)
Semi-Supervised Image Classification   SVHN, 500 Labels          Accuracy           96.04   Dual Student
Semi-Supervised Image Classification   CIFAR-10, 1000 Labels     Accuracy           85.83   Dual Student (600)
Semi-Supervised Image Classification   CIFAR-100, 10000 Labels   Percentage error   32.77   Dual Student (480)
Semi-Supervised Image Classification   SVHN, 250 Labels          Accuracy           95.76   Dual Student

Related Papers

CORE-ReID V2: Advancing the Domain Adaptation for Object Re-Identification with Optimized Training and Ensemble Fusion (2025-07-04)
Unlocking Constraints: Source-Free Occlusion-Aware Seamless Segmentation (2025-06-26)
Topology-Aware Modeling for Unsupervised Simulation-to-Reality Point Cloud Recognition (2025-06-26)
Prmpt2Adpt: Prompt-Based Zero-Shot Domain Adaptation for Resource-Constrained Environments (2025-06-20)
MUDAS: Mote-scale Unsupervised Domain Adaptation in Multi-label Sound Classification (2025-06-12)
Customizing Speech Recognition Model with Large Language Model Feedback (2025-06-05)
Diffusion Domain Teacher: Diffusion Guided Domain Adaptive Object Detector (2025-06-04)
ViTSGMM: A Robust Semi-Supervised Image Recognition Network Using Sparse Labels (2025-06-04)