Description
SCCL (Supporting Clustering with Contrastive Learning) is a framework that leverages instance-wise contrastive learning to promote better separation in unsupervised clustering. It combines top-down clustering with bottom-up instance-wise contrastive learning to increase inter-cluster distances while decreasing intra-cluster distances. During training, a clustering loss over the original data instances and an instance-wise contrastive loss over their augmented pairs are optimized jointly.
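The joint objective described above can be sketched with two components: an NT-Xent-style instance-wise contrastive loss over augmented pairs, and a DEC-style clustering loss (KL divergence between a sharpened target distribution and Student's-t soft cluster assignments). This is a minimal NumPy illustration of the two loss terms, not the authors' implementation; the function names, the specific clustering loss variant, and the hyperparameters are assumptions for the sake of the example.

```python
import numpy as np

def contrastive_loss(z1, z2, temperature=0.5):
    """Instance-wise contrastive (NT-Xent-style) loss over a batch of
    augmented pairs (z1[i], z2[i]); embeddings are L2-normalized."""
    z = np.concatenate([z1, z2], axis=0)                 # (2N, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = z @ z.T / temperature                          # scaled cosine sims
    n = z1.shape[0]
    np.fill_diagonal(sim, -np.inf)                       # exclude self-pairs
    # the positive for index i is its augmented counterpart
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()

def clustering_loss(z, centers, alpha=1.0):
    """DEC-style clustering loss: KL(P || Q), where Q holds Student's-t
    soft assignments to cluster centers and P is a sharpened target."""
    d2 = ((z[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    q = (1.0 + d2 / alpha) ** (-(alpha + 1) / 2)
    q = q / q.sum(axis=1, keepdims=True)                 # soft assignments
    p = q ** 2 / q.sum(axis=0)                           # sharpened target
    p = p / p.sum(axis=1, keepdims=True)
    return (p * np.log(p / q)).sum(axis=1).mean()
```

In a training loop, the two terms would be combined as `clustering_loss(z, centers) + eta * contrastive_loss(z1, z2)`, where `eta` is an assumed balancing weight and `z`, `z1`, `z2` are encoder outputs for the original instances and the two augmented views.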
Papers Using This Method
Mitigating Catastrophic Forgetting in Task-Incremental Continual Learning with Adaptive Classification Criterion (2023-05-20)
Cluster-Level Contrastive Learning for Emotion Recognition in Conversations (2023-02-07)
Sentiment Analysis of Online Travel Reviews Based on Capsule Network and Sentiment Lexicon (2022-06-05)
Supporting Clustering with Contrastive Learning (2021-03-24)