Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.

Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.

Neighborhood-Enhanced Supervised Contrastive Learning for Collaborative Filtering

Peijie Sun, Le Wu, Kun Zhang, Xiangzhi Chen, Meng Wang

2024-02-18 · Data Augmentation · Collaborative Filtering · Contrastive Learning · Recommendation Systems

Paper · PDF · Code

Abstract

While effective in recommendation tasks, collaborative filtering (CF) techniques face the challenge of data sparsity. Researchers have begun leveraging contrastive learning to introduce additional self-supervised signals to address this. However, this approach often unintentionally distances the target user/item from their collaborative neighbors, limiting its efficacy. In response, we propose a solution that treats the collaborative neighbors of the anchor node as positive samples within the final objective loss function. This paper focuses on developing two unique supervised contrastive loss functions that effectively combine supervision signals with contrastive loss. We analyze our proposed loss functions through the gradient lens, demonstrating that different positive samples simultaneously influence updating the anchor node's embeddings. These samples' impact depends on their similarities to the anchor node and the negative samples. Using the graph-based collaborative filtering model as our backbone and following the same data augmentation methods as the existing contrastive learning model SGL, we effectively enhance the performance of the recommendation model. Our proposed Neighborhood-Enhanced Supervised Contrastive Loss (NESCL) model substitutes the contrastive loss function in SGL with our novel loss function, showing marked performance improvement. On three real-world datasets, Yelp2018, Gowalla, and Amazon-Book, our model surpasses the original SGL by 10.09%, 7.09%, and 35.36% on NDCG@20, respectively.
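The central idea of the abstract — treating the anchor's collaborative neighbors as additional positive samples inside a contrastive objective, so that every positive simultaneously pulls on the anchor's embedding — can be illustrated with a minimal supervised contrastive loss in which each positive contributes its own log-ratio term. This is an assumed sketch (names, cosine similarity, and the averaging over positives are illustrative choices), not the paper's NESCL implementation:

```python
import numpy as np

def supervised_contrastive_loss(anchor, positives, negatives, tau=0.2):
    """Illustrative supervised contrastive loss with multiple positives.

    `positives` can hold both an augmented view of the anchor and its
    collaborative neighbors; each positive contributes a separate
    -log(exp(sim/tau) / sum exp(sim/tau)) term, so all positives
    influence the anchor's gradient at once, weighted by how similar
    they are to the anchor relative to the other samples.
    """
    def sim(a, b):
        # cosine similarity between two embedding vectors
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    # denominator sums over every contrasted sample (positives + negatives)
    all_samples = positives + negatives
    denom = sum(np.exp(sim(anchor, s) / tau) for s in all_samples)

    # average the per-positive log-ratio terms
    return -np.mean([np.log(np.exp(sim(anchor, p) / tau) / denom)
                     for p in positives])
```

A positive that is well aligned with the anchor yields a small loss, while a misaligned one yields a large loss, which is the behavior the gradient analysis in the paper reasons about.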

Results

| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Recommendation Systems | Gowalla | Recall@20 | 0.1917 | NESCL |
| Recommendation Systems | Gowalla | NDCG@20 | 0.1617 | NESCL |
| Recommendation Systems | Yelp2018 | NDCG@20 | 0.0611 | NESCL |
| Recommendation Systems | Yelp2018 | Recall@20 | 0.0743 | NESCL |
| Recommendation Systems | Amazon-Book | Recall@20 | 0.0624 | NESCL |
| Recommendation Systems | Amazon-Book | NDCG@20 | 0.0513 | NESCL |
| Collaborative Filtering | Gowalla | NDCG@20 | 0.1617 | NESCL |
| Collaborative Filtering | Gowalla | Recall@20 | 0.1917 | NESCL |
| Collaborative Filtering | Yelp2018 | NDCG@20 | 0.0611 | NESCL |
| Collaborative Filtering | Yelp2018 | Recall@20 | 0.0743 | NESCL |

Related Papers

- IP2: Entity-Guided Interest Probing for Personalized News Recommendation (2025-07-18)
- A Reproducibility Study of Product-side Fairness in Bundle Recommendation (2025-07-18)
- Overview of the TalentCLEF 2025: Skill and Job Title Intelligence for Human Capital Management (2025-07-17)
- Pixel Perfect MegaMed: A Megapixel-Scale Vision-Language Foundation Model for Generating High Resolution Medical Images (2025-07-17)
- SGCL: Unifying Self-Supervised and Supervised Learning for Graph Recommendation (2025-07-17)
- SemCSE: Semantic Contrastive Sentence Embeddings Using LLM-Generated Summaries For Scientific Abstracts (2025-07-17)
- HapticCap: A Multimodal Dataset and Task for Understanding User Experience of Vibration Haptic Signals (2025-07-17)
- Similarity-Guided Diffusion for Contrastive Sequential Recommendation (2025-07-16)