
Learning Smooth Representation for Unsupervised Domain Adaptation

Guanyu Cai, Lianghua He, Mengchu Zhou, Hesham Alhumade, Die Hu

2019-05-26 · Unsupervised Domain Adaptation · Domain Adaptation

Paper · PDF · Code (official)

Abstract

Typical adversarial-training-based unsupervised domain adaptation methods are vulnerable when the source and target datasets are highly complex or exhibit a large discrepancy between their data distributions. Recently, several Lipschitz-constraint-based methods have been explored; satisfying Lipschitz continuity yields remarkable performance on a target domain. However, these methods lack a mathematical analysis of why a Lipschitz constraint benefits unsupervised domain adaptation, and they usually perform poorly on large-scale datasets. In this paper, we take the principle of utilizing a Lipschitz constraint further by discussing how it affects the error bound of unsupervised domain adaptation. A connection between the two is established, and we illustrate how Lipschitzness reduces the error bound. A local smooth discrepancy is defined to measure the Lipschitzness of a target distribution in a pointwise way. When constructing a deep end-to-end model, to ensure the effectiveness and stability of unsupervised domain adaptation, our proposed optimization strategy accounts for three critical factors: the number of target-domain samples, and the dimension and batch size of samples. Experimental results demonstrate that our model performs well on several standard benchmarks. Our ablation study shows that these three factors indeed greatly impact the ability of Lipschitz-constraint-based methods to handle large-scale datasets. Code is available at https://github.com/CuthbertCai/SRDA.
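The abstract's central quantity lends itself to a short illustration. Read loosely, a pointwise Lipschitz measure compares a classifier's output at a target sample with its output at a nearby perturbed sample; if the outputs barely move, the model is locally smooth there. The sketch below follows that reading. It is our own illustration, not the paper's implementation (see the linked SRDA repository for that); the function name, the random-perturbation scheme, and the KL-based distance are all assumptions.

```python
import torch
import torch.nn.functional as F

def local_smooth_discrepancy(model, x, radius=1e-2, n_directions=1):
    """Mean KL divergence between the model's predictions on a target batch x
    and on random perturbations of x inside an L2 ball of the given radius.
    Hypothetical sketch; not the SRDA repository's actual API."""
    with torch.no_grad():
        p = F.softmax(model(x), dim=1)              # reference predictions at x
    total = 0.0
    for _ in range(n_directions):
        d = torch.randn_like(x)                     # random direction per sample
        # rescale each sample's perturbation to L2 norm == radius
        d = radius * d / d.flatten(1).norm(dim=1).view(-1, *([1] * (x.dim() - 1)))
        q_log = F.log_softmax(model(x + d), dim=1)  # predictions at perturbed points
        total = total + F.kl_div(q_log, p, reduction="batchmean")
    return total / n_directions                     # small value => locally smooth
```

Minimizing such a quantity over target samples would encourage pointwise Lipschitzness. Note that the perturbation radius, the number of sampled directions, and the batch size over which the divergence is averaged correspond loosely to the three factors the abstract flags as critical.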

Results

Task              | Dataset         | Metric           | Value | Model
Domain Adaptation | USPS-to-MNIST   | Accuracy         | 95.03 | SRDA (RAN)
Domain Adaptation | Office-31       | Average Accuracy | 73.5  | SRDA (RAN)
Domain Adaptation | SYNSIG-to-GTSRB | Accuracy         | 93.61 | SRDA (RAN)
Domain Adaptation | SVHN-to-MNIST   | Accuracy         | 98.91 | SRDA (RAN)
Domain Adaptation | MNIST-to-USPS   | Accuracy         | 94.76 | SRDA (RAN)

Related Papers

A Privacy-Preserving Semantic-Segmentation Method Using Domain-Adaptation Technique (2025-07-17)
Domain Borders Are There to Be Crossed With Federated Few-Shot Adaptation (2025-07-14)
An Offline Mobile Conversational Agent for Mental Health Support: Learning from Emotional Dialogues and Psychological Texts with Student-Centered Evaluation (2025-07-11)
The Bayesian Approach to Continual Learning: An Overview (2025-07-11)
Doodle Your Keypoints: Sketch-Based Few-Shot Keypoint Detection (2025-07-10)
YOLO-APD: Enhancing YOLOv8 for Robust Pedestrian Detection on Complex Road Geometries (2025-07-07)
CORE-ReID V2: Advancing the Domain Adaptation for Object Re-Identification with Optimized Training and Ensemble Fusion (2025-07-04)
Underwater Monocular Metric Depth Estimation: Real-World Benchmarks and Synthetic Fine-Tuning (2025-07-02)