Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.

Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Do We Really Need to Access the Source Data? Source Hypothesis Transfer for Unsupervised Domain Adaptation

Jian Liang, Dapeng Hu, Jiashi Feng

Published: 2020-02-20 · ICML 2020
Tasks: Universal Domain Adaptation, Representation Learning, Source-Free Domain Adaptation, Partial Domain Adaptation, Unsupervised Domain Adaptation, Domain Adaptation

Abstract

Unsupervised domain adaptation (UDA) aims to leverage the knowledge learned from a labeled source dataset to solve similar tasks in a new unlabeled domain. Prior UDA methods typically require access to the source data when learning to adapt the model, making them risky and inefficient for decentralized private data. This work tackles a practical setting where only a trained source model is available and investigates how we can effectively utilize such a model without source data to solve UDA problems. We propose a simple yet generic representation learning framework, named Source HypOthesis Transfer (SHOT). SHOT freezes the classifier module (hypothesis) of the source model and learns the target-specific feature extraction module by exploiting both information maximization and self-supervised pseudo-labeling to implicitly align representations from the target domains to the source hypothesis. To verify its versatility, we evaluate SHOT in a variety of adaptation cases including closed-set, partial-set, and open-set domain adaptation. Experiments indicate that SHOT yields state-of-the-art results on multiple domain adaptation benchmarks.
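The two target-side objectives named in the abstract can be sketched numerically. Below is a minimal NumPy illustration (not the authors' released code) of (a) the information-maximization loss, i.e. low per-sample prediction entropy combined with high entropy of the batch-mean prediction, and (b) centroid-based pseudo-labeling, where class centroids are built from the current soft predictions and each sample is assigned to its nearest centroid by cosine similarity. Function names and the exact weighting are illustrative assumptions.

```python
import numpy as np

def softmax(logits):
    # numerically stable softmax over the class axis
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def information_maximization_loss(logits, eps=1e-8):
    """IM objective sketch: make each prediction confident (low entropy)
    while keeping the mean prediction diverse across classes."""
    p = softmax(logits)
    # mean per-sample entropy (to be minimized)
    ent = -np.mean(np.sum(p * np.log(p + eps), axis=1))
    # entropy of the batch-mean prediction (to be maximized)
    p_mean = p.mean(axis=0)
    div = -np.sum(p_mean * np.log(p_mean + eps))
    return ent - div

def pseudo_labels(features, probs, eps=1e-8):
    """Centroid-based pseudo-labeling sketch: soft-prediction-weighted
    class centroids, then nearest-centroid assignment by cosine similarity."""
    centroids = (probs.T @ features) / (probs.sum(axis=0)[:, None] + eps)
    f = features / (np.linalg.norm(features, axis=1, keepdims=True) + eps)
    c = centroids / (np.linalg.norm(centroids, axis=1, keepdims=True) + eps)
    return np.argmax(f @ c.T, axis=1)
```

A confident, class-diverse batch of predictions yields a lower IM loss than a uniform batch, which is what drives the target feature extractor while the source classifier stays frozen.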

Results

Task | Dataset | Metric | Value | Model
Domain Adaptation | USPS-to-MNIST | Accuracy | 98.4 | SHOT
Domain Adaptation | MNIST-to-USPS | Accuracy | 98.0 | SHOT
Domain Adaptation | SVHN-to-MNIST | Accuracy | 98.9 | SHOT
Domain Adaptation | Office-31 | Average Accuracy | 88.6 | SHOT
Domain Adaptation | Office-Home | Accuracy (%) | 78.3 | SHOT
Domain Adaptation | VisDA-2017 | Accuracy | 82.9 | SHOT
Domain Adaptation | Office-Home | H-Score | 40.7 | SHOT-O
Domain Adaptation | VisDA-2017 | H-Score | 44.0 | SHOT-O
Domain Adaptation | DomainNet | H-Score | 32.6 | SHOT-O
Universal Domain Adaptation | Office-Home | H-Score | 40.7 | SHOT-O
Universal Domain Adaptation | VisDA-2017 | H-Score | 44.0 | SHOT-O
Universal Domain Adaptation | DomainNet | H-Score | 32.6 | SHOT-O
Source-Free Domain Adaptation | VisDA-2017 | Accuracy | 82.9 | SHOT

Related Papers

- Touch in the Wild: Learning Fine-Grained Manipulation with a Portable Visuo-Tactile Gripper (2025-07-20)
- Spectral Bellman Method: Unifying Representation and Exploration in RL (2025-07-17)
- Boosting Team Modeling through Tempo-Relational Representation Learning (2025-07-17)
- A Privacy-Preserving Semantic-Segmentation Method Using Domain-Adaptation Technique (2025-07-17)
- Similarity-Guided Diffusion for Contrastive Sequential Recommendation (2025-07-16)
- Are encoders able to learn landmarkers for warm-starting of Hyperparameter Optimization? (2025-07-16)
- Language-Guided Contrastive Audio-Visual Masked Autoencoder with Automatically Generated Audio-Visual-Text Triplets from Videos (2025-07-16)
- A Mixed-Primitive-based Gaussian Splatting Method for Surface Reconstruction (2025-07-15)