Mingkun Li, Peng Xu, Chun-Guang Li, Jun Guo
In this paper, we address a highly challenging yet critical task: unsupervised long-term person re-identification (re-id) with clothes change. Existing unsupervised person re-id methods are mainly designed for short-term scenarios and usually rely on RGB cues, and thus fail to perceive feature patterns that are independent of clothing. To crack this bottleneck, we propose a silhouette-driven contrastive learning (SiCL) method, designed to learn cross-clothes invariance by integrating both RGB cues and silhouette information within a contrastive learning framework. To our knowledge, this is the first tailor-made framework for unsupervised long-term clothes-change re-id, and it achieves superior performance on six benchmark datasets. We conduct extensive experiments evaluating the proposed SiCL against state-of-the-art unsupervised person re-id methods across all the representative datasets. Experimental results demonstrate that SiCL significantly outperforms other unsupervised re-id methods.
| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Person Re-Identification | LTCC | Rank-1 | 20.7 | MaskCL |
| Person Re-Identification | LTCC | mAP | 10.1 | MaskCL |
| Person Re-Identification | VC-Clothes | Rank-1 | 71.7 | SiCL |
| Person Re-Identification | VC-Clothes | mAP | 63.9 | SiCL |
| Person Re-Identification | PRCC | Rank-1 | 43.2 | SiCL |
| Person Re-Identification | PRCC | mAP | 55.4 | SiCL |
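To make the core idea concrete, the sketch below shows one plausible way to combine RGB and silhouette embeddings under an InfoNCE-style contrastive objective. This is a minimal illustration of the general technique, not the authors' implementation: the concatenation-based fusion, the feature dimensions, and the temperature value are all illustrative assumptions.

```python
import numpy as np

def l2_normalize(x, axis=-1):
    """Scale each row to unit Euclidean norm."""
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

def fused_features(rgb_feat, sil_feat):
    # Fuse RGB and silhouette embeddings by concatenation, then normalize.
    # (Illustrative fusion choice; the paper's actual scheme may differ.)
    return l2_normalize(np.concatenate([rgb_feat, sil_feat], axis=-1))

def info_nce_loss(anchors, positives, temperature=0.1):
    # Standard InfoNCE: for anchor i, row i of `positives` is the positive
    # pair and every other row serves as a negative.
    logits = anchors @ positives.T / temperature        # (N, N) similarities
    logits -= logits.max(axis=1, keepdims=True)         # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))

# Toy usage: two augmented "views" sharing RGB content and silhouettes.
rng = np.random.default_rng(0)
rgb = rng.normal(size=(8, 16))
sil = rng.normal(size=(8, 8))
view_a = fused_features(rgb, sil)
view_b = fused_features(rgb + 0.01 * rng.normal(size=rgb.shape), sil)
loss = info_nce_loss(view_a, view_b)
```

With aligned views the diagonal similarities dominate and the loss is near zero; shuffling the positive rows (breaking identity correspondence) drives it up, which is exactly the invariance pressure a contrastive objective exerts.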