Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


RaSa: Relation and Sensitivity Aware Representation Learning for Text-based Person Search

Yang Bai, Min Cao, Daming Gao, Ziqiang Cao, Chen Chen, Zhenfeng Fan, Liqiang Nie, Min Zhang

2023-05-23 · Person Search · Text-based Person Search · Text-based Person Retrieval

Paper · PDF · Code (official)

Abstract

Text-based person search aims to retrieve images of a specified person given a textual description. The key to tackling this challenging task is learning powerful multi-modal representations. To this end, we propose a Relation and Sensitivity aware representation learning method (RaSa), comprising two novel tasks: Relation-Aware learning (RA) and Sensitivity-Aware learning (SA). On the one hand, existing methods cluster the representations of all positive pairs without distinction and overlook the noise introduced by weak positive pairs, where the text and the paired image have only a noisy correspondence, leading to overfitting. RA offsets this overfitting risk by introducing a novel positive relation detection task (i.e., learning to distinguish strong and weak positive pairs). On the other hand, learning representations invariant under data augmentation (i.e., insensitive to certain transformations) is a common practice for improving robustness in existing methods. Beyond that, we encourage the representation to perceive sensitive transformations via SA (i.e., learning to detect replaced words), further promoting the representation's robustness. Experiments demonstrate that RaSa outperforms existing state-of-the-art methods by 6.94%, 4.45% and 15.35% in terms of Rank@1 on the CUHK-PEDES, ICFG-PEDES and RSTPReid datasets, respectively. Code is available at: https://github.com/Flame-Chasers/RaSa.
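To make the two auxiliary objectives concrete, here is a minimal numpy sketch of how RA (classifying a joint image-text embedding as a strong vs. weak positive pair) and SA (classifying each text token as original vs. replaced) can be posed as cross-entropy classification heads. The linear heads, dimensions, and the 0.5 loss weight are illustrative assumptions, not the paper's actual architecture or hyperparameters.

```python
import numpy as np

def softmax(z, axis=-1):
    # numerically stable softmax
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def cross_entropy(logits, labels):
    # mean negative log-likelihood of the true class
    p = softmax(logits)
    return -np.log(p[np.arange(len(labels)), labels]).mean()

rng = np.random.default_rng(0)

# RA head: binary prediction, strong (0) vs weak (1) positive pair,
# applied to a joint image-text embedding (dimensions are illustrative)
pair_emb = rng.normal(size=(4, 8))        # 4 image-text pair embeddings
W_ra = rng.normal(size=(8, 2))            # hypothetical linear head
ra_labels = np.array([0, 1, 0, 1])        # strong / weak annotations
loss_ra = cross_entropy(pair_emb @ W_ra, ra_labels)

# SA head: per-token prediction, original (0) vs replaced (1) word,
# after some words in the caption have been swapped out
tok_emb = rng.normal(size=(4 * 6, 8))     # 4 captions x 6 token embeddings
W_sa = rng.normal(size=(8, 2))            # hypothetical linear head
sa_labels = rng.integers(0, 2, size=4 * 6)
loss_sa = cross_entropy(tok_emb @ W_sa, sa_labels)

# combined auxiliary loss (the 0.5 weighting is an assumption)
total = loss_ra + 0.5 * loss_sa
```

Both heads reduce to standard classification losses; the point is that RA supervises pair-level relation labels while SA supervises token-level replacement labels on top of the shared multi-modal encoder.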

Results

Task                        | Dataset    | Metric | Value | Model
Text-based Person Retrieval | CUHK-PEDES | R@1    | 76.51 | RaSa
Text-based Person Retrieval | CUHK-PEDES | R@5    | 90.29 | RaSa
Text-based Person Retrieval | CUHK-PEDES | R@10   | 94.25 | RaSa
Text-based Person Retrieval | CUHK-PEDES | mAP    | 69.38 | RaSa
Text-based Person Retrieval | ICFG-PEDES | R@1    | 65.28 | RaSa
Text-based Person Retrieval | ICFG-PEDES | R@5    | 80.40 | RaSa
Text-based Person Retrieval | ICFG-PEDES | R@10   | 85.12 | RaSa
Text-based Person Retrieval | ICFG-PEDES | mAP    | 41.29 | RaSa
Text-based Person Retrieval | RSTPReid   | R@1    | 66.90 | RaSa
Text-based Person Retrieval | RSTPReid   | R@5    | 86.50 | RaSa
Text-based Person Retrieval | RSTPReid   | R@10   | 91.35 | RaSa
Text-based Person Retrieval | RSTPReid   | mAP    | 52.31 | RaSa

Related Papers

SA-Person: Text-Based Person Retrieval with Scene-aware Re-ranking (2025-05-30)
Dynamic Uncertainty Learning with Noisy Correspondence for Text-Based Person Search (2025-05-10)
Uncertainty-Aware Prototype Semantic Decoupling for Text-Based Person Search in Full Images (2025-05-06)
CAMeL: Cross-modality Adaptive Meta-Learning for Text-based Person Retrieval (2025-04-26)
UP-Person: Unified Parameter-Efficient Transfer Learning for Text-based Person Retrieval (2025-04-14)
An Empirical Study of Validating Synthetic Data for Text-Based Person Retrieval (2025-03-28)
SeCap: Self-Calibrating and Adaptive Prompts for Cross-view Person Re-Identification in Aerial-Ground Networks (2025-03-10)
Boosting Weak Positives for Text Based Person Search (2025-01-29)