Empowering Source-Free Domain Adaptation with MLLM-driven Curriculum Learning

Dongjie Chen, Kartik Patwari, Zhengfeng Lai, Sen-ching Cheung, Chen-Nee Chuah

2024-05-28 · Instruction Following · Source-Free Domain Adaptation · Transfer Learning · Unsupervised Domain Adaptation · Domain Adaptation

Abstract

Source-Free Domain Adaptation (SFDA) aims to adapt a pre-trained source model to a target domain using only unlabeled target data. Current SFDA methods face challenges in effectively leveraging pre-trained knowledge and exploiting target domain data. Multimodal Large Language Models (MLLMs) offer remarkable capabilities in understanding visual and textual information, but their applicability to SFDA poses challenges such as instruction-following failures, intensive computational demands, and difficulties in performance measurement prior to adaptation. To alleviate these issues, we propose Reliability-based Curriculum Learning (RCL), a novel framework that integrates multiple MLLMs for knowledge exploitation via pseudo-labeling in SFDA. Our framework incorporates proposed Reliable Knowledge Transfer, Self-correcting and MLLM-guided Knowledge Expansion, and Multi-hot Masking Refinement to progressively exploit unlabeled data in the target domain. RCL achieves state-of-the-art (SOTA) performance on multiple SFDA benchmarks, e.g., $\textbf{+9.4\%}$ on DomainNet, demonstrating its effectiveness in enhancing adaptability and robustness without requiring access to source data. Code: https://github.com/Dong-Jie-Chen/RCL.
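The curriculum described in the abstract is driven by how reliably the MLLMs agree on a pseudo-label for each unlabeled target image: samples with consistent predictions are exploited first, and less reliable samples are incorporated in later stages. Below is a minimal sketch of that reliability scoring, not the authors' implementation (see the linked repository for that); the `mllm_annotators` callables, which map an image to a class name, are hypothetical placeholders for however the MLLMs are actually queried.

```python
# Minimal sketch of reliability-based pseudo-labeling with multiple MLLMs.
# Not the authors' code; the MLLM annotator callables are hypothetical stand-ins.

from collections import Counter
from typing import Callable, List, Sequence, Tuple

def consensus_pseudo_label(predictions: Sequence[str]) -> Tuple[str, float]:
    """Majority vote across MLLM predictions; reliability = agreement ratio."""
    votes = Counter(predictions)
    label, count = votes.most_common(1)[0]
    return label, count / len(predictions)

def build_curriculum(
    images: Sequence[object],
    mllm_annotators: Sequence[Callable[[object], str]],  # hypothetical: image -> class name
    reliability_threshold: float = 1.0,
) -> Tuple[List[Tuple[object, str]], List[Tuple[object, str, float]]]:
    """Split unlabeled target data into a reliable set (all MLLMs agree)
    and a less reliable remainder for later curriculum stages."""
    reliable, remainder = [], []
    for img in images:
        preds = [annotate(img) for annotate in mllm_annotators]
        label, reliability = consensus_pseudo_label(preds)
        if reliability >= reliability_threshold:
            reliable.append((img, label))                 # exploited first
        else:
            remainder.append((img, label, reliability))   # exploited progressively later
    return reliable, remainder
```

In the paper's framing, the reliable split would seed adaptation (Reliable Knowledge Transfer), while the remainder is progressively brought in through the knowledge expansion and masking refinement stages.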

Results

Task | Dataset | Metric | Value | Model
Domain Adaptation | VisDA-2017 | Accuracy | 93.2 | RCL
Domain Adaptation | Office-Home | Accuracy | 90 | RCL
Unsupervised Domain Adaptation | VisDA-2017 | Accuracy | 93.2 | RCL
Unsupervised Domain Adaptation | Office-Home | Accuracy | 90 | RCL
Source-Free Domain Adaptation | VisDA-2017 | Accuracy | 93.2 | RCL

Related Papers

RaMen: Multi-Strategy Multi-Modal Learning for Bundle Construction (2025-07-18)
AnyCap Project: A Unified Framework, Dataset, and Benchmark for Controllable Omni-modal Captioning (2025-07-17)
Disentangling coincident cell events using deep transfer learning and compressive sensing (2025-07-17)
A Privacy-Preserving Semantic-Segmentation Method Using Domain-Adaptation Technique (2025-07-17)
Best Practices for Large-Scale, Pixel-Wise Crop Mapping and Transfer Learning Workflows (2025-07-16)
How Many Instructions Can LLMs Follow at Once? (2025-07-15)
DrafterBench: Benchmarking Large Language Models for Tasks Automation in Civil Engineering (2025-07-15)
Robust-Multi-Task Gradient Boosting (2025-07-15)