Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.

Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Improved Universal Sentence Embeddings with Prompt-based Contrastive Learning and Energy-based Learning

Yuxin Jiang, Linhan Zhang, Wei Wang

2022-03-14 · Sentence Embeddings · Semantic Textual Similarity · Contrastive Learning · STS
Paper · PDF · Code (official)

Abstract

Contrastive learning has been demonstrated to be effective in enhancing pre-trained language models (PLMs) to derive superior universal sentence embeddings. However, existing contrastive methods still have two limitations. Firstly, previous works may acquire poor performance under domain shift settings, thus hindering the application of sentence representations in practice. We attribute this low performance to the over-parameterization of PLMs with millions of parameters. To alleviate it, we propose PromCSE (Prompt-based Contrastive Learning for Sentence Embeddings), which only trains small-scale \emph{Soft Prompt} (i.e., a set of trainable vectors) while keeping PLMs fixed. Secondly, the commonly used NT-Xent loss function of contrastive learning does not fully exploit hard negatives in supervised learning settings. To this end, we propose to integrate an Energy-based Hinge loss to enhance the pairwise discriminative power, inspired by the connection between the NT-Xent loss and the Energy-based Learning paradigm. Empirical results on seven standard semantic textual similarity (STS) tasks and a domain-shifted STS task both show the effectiveness of our method compared with the current state-of-the-art sentence embedding models. Our code is publicly available at https://github.com/YJiangcm/PromCSE
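The two loss terms described in the abstract can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation (see the linked repository for that): the supervised NT-Xent term contrasts each anchor with its own positive against all in-batch positives and hard negatives, and the energy-based hinge term is shown here as a simple margin hinge over cosine similarities, where the exact hinge form and the margin value 0.1 are our illustrative assumptions.

```python
import numpy as np

def _unit(x):
    # row-normalize so dot products become cosine similarities
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def nt_xent_loss(anchors, positives, hard_negatives, tau=0.05):
    """Supervised NT-Xent: anchor i's target is its own positive (column i);
    all other positives and all hard negatives act as in-batch negatives."""
    a, p, n = _unit(anchors), _unit(positives), _unit(hard_negatives)
    sims = np.concatenate([a @ p.T, a @ n.T], axis=1) / tau
    # numerically stable log-softmax over each row
    logits = sims - sims.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    idx = np.arange(a.shape[0])
    return -log_prob[idx, idx].mean()

def energy_hinge_loss(anchors, positives, hard_negatives, margin=0.1):
    """Illustrative margin hinge: penalize a hard negative whose cosine
    similarity comes within `margin` of the paired positive's similarity."""
    a, p, n = _unit(anchors), _unit(positives), _unit(hard_negatives)
    pos = np.sum(a * p, axis=1)
    neg = np.sum(a * n, axis=1)
    return np.maximum(0.0, margin + neg - pos).mean()
```

With orthogonal hard negatives the hinge term is zero and the NT-Xent term is near zero; as a hard negative approaches its anchor's positive, both terms grow, which is the "pairwise discriminative power" the hinge is meant to add.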

Results

Task                        | Dataset       | Metric               | Value  | Model
Semantic Textual Similarity | STS12         | Spearman Correlation | 0.7956 | PromCSE-RoBERTa-large (0.355B)
Semantic Textual Similarity | STS13         | Spearman Correlation | 0.8897 | PromCSE-RoBERTa-large (0.355B)
Semantic Textual Similarity | STS14         | Spearman Correlation | 0.8381 | PromCSE-RoBERTa-large (0.355B)
Semantic Textual Similarity | STS15         | Spearman Correlation | 0.8808 | PromCSE-RoBERTa-large (0.355B)
Semantic Textual Similarity | STS16         | Spearman Correlation | 0.8496 | PromCSE-RoBERTa-large (0.355B)
Semantic Textual Similarity | STS Benchmark | Spearman Correlation | 0.8787 | PromCSE-RoBERTa-large (0.355B)
Semantic Textual Similarity | SICK          | Spearman Correlation | 0.8243 | PromCSE-RoBERTa-large (0.355B)
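The metric in the table, Spearman correlation, is the Pearson correlation of rank-transformed scores: the model's cosine similarities for sentence pairs are ranked and correlated against ranked human similarity judgments. A minimal sketch follows; it ignores tied ranks (`scipy.stats.spearmanr` handles ties properly):

```python
import numpy as np

def spearman(x, y):
    """Spearman correlation without tie handling: rank both sequences,
    then compute the Pearson correlation of the ranks."""
    def ranks(v):
        order = np.argsort(v)
        r = np.empty(len(v), dtype=float)
        r[order] = np.arange(len(v), dtype=float)
        return r
    rx, ry = ranks(np.asarray(x)), ranks(np.asarray(y))
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))
```

A perfectly monotone relationship between model scores and human judgments yields 1.0 regardless of scale, which is why STS leaderboards prefer Spearman over Pearson correlation on raw scores.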

Related Papers

From Neurons to Semantics: Evaluating Cross-Linguistic Alignment Capabilities of Large Language Models via Neurons Alignment (2025-07-20)
SemCSE: Semantic Contrastive Sentence Embeddings Using LLM-Generated Summaries For Scientific Abstracts (2025-07-17)
HapticCap: A Multimodal Dataset and Task for Understanding User Experience of Vibration Haptic Signals (2025-07-17)
Overview of the TalentCLEF 2025: Skill and Job Title Intelligence for Human Capital Management (2025-07-17)
SGCL: Unifying Self-Supervised and Supervised Learning for Graph Recommendation (2025-07-17)
MGFFD-VLM: Multi-Granularity Prompt Learning for Face Forgery Detection with VLM (2025-07-16)
Non-Adaptive Adversarial Face Generation (2025-07-16)
Similarity-Guided Diffusion for Contrastive Sequential Recommendation (2025-07-16)