Papers With Code

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.

Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.

Stable Cluster Discrimination for Deep Clustering

Qi Qian

2023-11-24 · ICCV 2023
Tasks: Deep Clustering, Representation Learning, Image Clustering, Clustering, Unsupervised Image Classification
Paper · PDF · Code (official)

Abstract

Deep clustering can optimize representations of instances (i.e., representation learning) and explore the inherent data distribution (i.e., clustering) simultaneously, which demonstrates a superior performance over conventional clustering methods with given features. However, the coupled objective implies a trivial solution that all instances collapse to the uniform features. To tackle the challenge, a two-stage training strategy is developed for decoupling, where it introduces an additional pre-training stage for representation learning and then fine-tunes the obtained model for clustering. Meanwhile, one-stage methods are developed mainly for representation learning rather than clustering, where various constraints for cluster assignments are designed to avoid collapsing explicitly. Despite the success of these methods, an appropriate learning objective tailored for deep clustering has not been investigated sufficiently. In this work, we first show that the prevalent discrimination task in supervised learning is unstable for one-stage clustering due to the lack of ground-truth labels and positive instances for certain clusters in each mini-batch. To mitigate the issue, a novel stable cluster discrimination (SeCu) task is proposed and a new hardness-aware clustering criterion can be obtained accordingly. Moreover, a global entropy constraint for cluster assignments is studied with efficient optimization. Extensive experiments are conducted on benchmark data sets and ImageNet. SeCu achieves state-of-the-art performance on all of them, which demonstrates the effectiveness of one-stage deep clustering. Code is available at https://github.com/idstcv/SeCu.
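To make the setup concrete, below is a minimal sketch of the vanilla one-stage cluster-discrimination objective that the abstract contrasts against: L2-normalized features are classified against learnable cluster centers with a temperature-scaled softmax cross-entropy, plus a global entropy term on the mean cluster assignment that discourages collapse to a few clusters. This is a generic illustration under our own assumptions (function name, temperature `tau`, weight `lam` are all illustrative), not the paper's SeCu loss — for that, see the official repository.

```python
import numpy as np

def cluster_discrimination_loss(z, centers, labels, tau=0.1, lam=1.0):
    """Generic one-stage cluster-discrimination objective (not SeCu).

    z       : (n, d) instance features
    centers : (k, d) learnable cluster centers
    labels  : (n,)   current cluster assignments
    Returns cross-entropy against the centers minus lam times the
    entropy of the mean assignment (higher entropy = less collapse).
    """
    z = z / np.linalg.norm(z, axis=1, keepdims=True)              # (n, d)
    c = centers / np.linalg.norm(centers, axis=1, keepdims=True)  # (k, d)
    logits = z @ c.T / tau                                        # (n, k)
    logits -= logits.max(axis=1, keepdims=True)                   # numerical stability
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)
    ce = -np.log(p[np.arange(len(z)), labels] + 1e-12).mean()
    mean_p = p.mean(axis=0)                                       # global assignment distribution
    entropy = -(mean_p * np.log(mean_p + 1e-12)).sum()
    return ce - lam * entropy  # subtracting entropy rewards balanced clusters
```

The instability the paper targets arises exactly here: in a mini-batch, some clusters may receive no positive instance, so gradients on their centers are driven only by negatives. SeCu's contribution is a stabilized variant of this task together with a hardness-aware criterion and an efficiently optimized global entropy constraint.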

Results

Task                 | Dataset  | Metric   | Value | Model
---------------------|----------|----------|-------|------
Image Clustering     | CIFAR-10 | ARI      | 0.857 | SeCu
Image Clustering     | CIFAR-10 | Accuracy | 0.930 | SeCu
Image Clustering     | CIFAR-10 | NMI      | 0.861 | SeCu
Image Clustering     | STL-10   | ARI      | 0.693 | SeCu
Image Clustering     | STL-10   | Accuracy | 0.836 | SeCu
Image Clustering     | STL-10   | NMI      | 0.733 | SeCu
Image Clustering     | ImageNet | ARI      | 41.9  | SeCu
Image Clustering     | ImageNet | Accuracy | 53.5  | SeCu
Image Clustering     | ImageNet | NMI      | 79.4  | SeCu
Image Clustering     | ImageNet | ARI      | 35.6  | CoKe
Image Clustering     | ImageNet | Accuracy | 47.6  | CoKe
Image Clustering     | ImageNet | NMI      | 76.2  | CoKe
Image Classification | CIFAR-10 | Accuracy | 93.0  | SeCu
Image Classification | CIFAR-20 | Accuracy | 55.2  | SeCu

(CIFAR-10/STL-10 clustering values are fractions in [0, 1]; ImageNet and classification values are percentages.)
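A note on the Accuracy metric reported above: unlike supervised accuracy, clustering accuracy must first match arbitrary predicted cluster ids to ground-truth classes, conventionally via the optimal one-to-one assignment (Hungarian algorithm). The sketch below shows the standard computation; it assumes SciPy is available and equal numbers of clusters and classes, and is not taken from the paper's code.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def clustering_accuracy(y_true, y_pred):
    """Clustering accuracy: fraction of points correctly labeled under the
    best one-to-one matching of cluster ids to classes (Hungarian method)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    k = int(max(y_true.max(), y_pred.max())) + 1
    # Confusion-style count matrix: rows = predicted cluster, cols = true class.
    count = np.zeros((k, k), dtype=np.int64)
    for p, t in zip(y_pred, y_true):
        count[p, t] += 1
    # Negate to turn the maximization into a min-cost assignment.
    rows, cols = linear_sum_assignment(-count)
    return count[rows, cols].sum() / len(y_true)
```

For example, predictions `[1, 1, 0, 0]` against truth `[0, 0, 1, 1]` score 1.0, since relabeling cluster 1 as class 0 (and vice versa) matches perfectly. ARI and NMI, the other two metrics in the table, are permutation-invariant by construction and need no such matching.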

Related Papers

Touch in the Wild: Learning Fine-Grained Manipulation with a Portable Visuo-Tactile Gripper (2025-07-20)
Tri-Learn Graph Fusion Network for Attributed Graph Clustering (2025-07-18)
Spectral Bellman Method: Unifying Representation and Exploration in RL (2025-07-17)
Boosting Team Modeling through Tempo-Relational Representation Learning (2025-07-17)
Similarity-Guided Diffusion for Contrastive Sequential Recommendation (2025-07-16)
Are encoders able to learn landmarkers for warm-starting of Hyperparameter Optimization? (2025-07-16)
Language-Guided Contrastive Audio-Visual Masked Autoencoder with Automatically Generated Audio-Visual-Text Triplets from Videos (2025-07-16)
Ranking Vectors Clustering: Theory and Applications (2025-07-16)