Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Upcycling Models under Domain and Category Shift

Sanqing Qu, Tianpei Zou, Florian Roehrbein, Cewu Lu, Guang Chen, Dacheng Tao, Changjun Jiang

2023-03-13 · CVPR 2023
Tasks: Universal Domain Adaptation · Source-Free Domain Adaptation · Clustering · Unsupervised Domain Adaptation · Domain Adaptation
Links: Paper · PDF · Code (official)

Abstract

Deep neural networks (DNNs) often perform poorly in the presence of domain shift and category shift. How to upcycle DNNs and adapt them to the target task remains an important open problem. Unsupervised Domain Adaptation (UDA), especially the recently proposed Source-free Domain Adaptation (SFDA), has become a promising technology to address this issue. Nevertheless, existing SFDA methods require that the source domain and target domain share the same label space, consequently being only applicable to the vanilla closed-set setting. In this paper, we take one step further and explore the Source-free Universal Domain Adaptation (SF-UniDA). The goal is to identify "known" data samples under both domain and category shift, and reject those "unknown" data samples (not present in source classes), with only the knowledge from a standard pre-trained source model. To this end, we introduce an innovative global and local clustering learning technique (GLC). Specifically, we design a novel, adaptive one-vs-all global clustering algorithm to achieve the distinction across different target classes and introduce a local k-NN clustering strategy to alleviate negative transfer. We examine the superiority of our GLC on multiple benchmarks with different category shift scenarios, including partial-set, open-set, and open-partial-set DA. Remarkably, in the most challenging open-partial-set DA scenario, GLC outperforms UMAD by 14.8% on the VisDA benchmark. The code is available at https://github.com/ispc-lab/GLC.
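The local k-NN clustering idea described in the abstract can be illustrated with a small sketch: refine each target sample's pseudo-label by a majority vote over its nearest neighbours in feature space, so that isolated mislabeled points are pulled toward their local cluster. This is a simplified, illustrative stand-in, not the paper's actual implementation; the function name, cosine-similarity choice, and voting rule are assumptions (see the official repo for the real method).

```python
import numpy as np

def knn_label_refinement(features, pseudo_labels, k=3):
    """Illustrative local k-NN label refinement (not the paper's exact rule):
    each sample adopts the majority pseudo-label among its k nearest
    neighbours under cosine similarity."""
    # L2-normalise rows so dot products equal cosine similarity
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = f @ f.T
    np.fill_diagonal(sim, -np.inf)          # a sample never votes for itself
    refined = np.empty_like(pseudo_labels)
    for i in range(len(f)):
        nbrs = np.argsort(sim[i])[-k:]      # indices of the k nearest neighbours
        refined[i] = np.bincount(pseudo_labels[nbrs]).argmax()
    return refined
```

For instance, in two tight clusters of four points each, a single flipped pseudo-label is outvoted by its three same-cluster neighbours and corrected, which is the sense in which local voting alleviates negative transfer from noisy global assignments.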

Results

Task                          | Dataset     | Metric  | Value | Model
Domain Adaptation             | Office-31   | H-score | 87.8  | GLC
Domain Adaptation             | Office-Home | H-score | 75.6  | GLC
Domain Adaptation             | VisDA2017   | H-score | 73.1  | GLC
Domain Adaptation             | DomainNet   | H-score | 55.1  | GLC
Universal Domain Adaptation   | Office-31   | H-score | 87.8  | GLC
Universal Domain Adaptation   | Office-Home | H-score | 75.6  | GLC
Universal Domain Adaptation   | VisDA2017   | H-score | 73.1  | GLC
Universal Domain Adaptation   | DomainNet   | H-score | 55.1  | GLC

Related Papers

Tri-Learn Graph Fusion Network for Attributed Graph Clustering (2025-07-18)
A Privacy-Preserving Semantic-Segmentation Method Using Domain-Adaptation Technique (2025-07-17)
Ranking Vectors Clustering: Theory and Applications (2025-07-16)
Domain Borders Are There to Be Crossed With Federated Few-Shot Adaptation (2025-07-14)
Car Object Counting and Position Estimation via Extension of the CLIP-EBC Framework (2025-07-11)
An Offline Mobile Conversational Agent for Mental Health Support: Learning from Emotional Dialogues and Psychological Texts with Student-Centered Evaluation (2025-07-11)
The Bayesian Approach to Continual Learning: An Overview (2025-07-11)
Doodle Your Keypoints: Sketch-Based Few-Shot Keypoint Detection (2025-07-10)