Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


ChemBERTa-2: Towards Chemical Foundation Models

Walid Ahmad, Elana Simon, Seyone Chithrananda, Gabriel Grand, Bharath Ramsundar

2022-09-05 · Molecular Property Prediction · Self-Supervised Learning

Abstract

Large pretrained models such as GPT-3 have had tremendous impact on modern natural language processing by leveraging self-supervised learning to learn salient representations that can be used to readily finetune on a wide variety of downstream tasks. We investigate the possibility of transferring such advances to molecular machine learning by building a chemical foundation model, ChemBERTa-2, using the language of SMILES. While labeled data for molecular prediction tasks is typically scarce, libraries of SMILES strings are readily available. In this work, we build upon ChemBERTa by optimizing the pretraining process. We compare multi-task and self-supervised pretraining by varying hyperparameters and pretraining dataset size, up to 77M compounds from PubChem. To our knowledge, the 77M set constitutes one of the largest datasets used for molecular pretraining to date. We find that with these pretraining improvements, we are competitive with existing state-of-the-art architectures on the MoleculeNet benchmark suite. We analyze the degree to which improvements in pretraining translate to improvement on downstream tasks.
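As a concrete illustration of the self-supervised objective the abstract refers to, the sketch below masks random tokens of a SMILES string so a model can be trained to recover them. This is a minimal, hypothetical sketch: it uses character-level tokens and a fixed mask rate, whereas ChemBERTa-2 uses a learned subword tokenizer over SMILES and also explores a multi-task regression (MTR) objective.

```python
import random

MASK = "[MASK]"  # placeholder token; the real vocabulary is model-specific

def mask_smiles(smiles, mask_prob=0.15, rng=None):
    """Randomly replace characters of a SMILES string with [MASK],
    mimicking a masked-language-modeling pretraining objective.
    Returns the corrupted token list and the per-position targets
    (None where the token was left intact)."""
    rng = rng or random.Random(0)
    tokens = list(smiles)           # simplification: one token per character
    labels = [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok          # target the model must recover
            tokens[i] = MASK
    return tokens, labels

# Aspirin, written as SMILES
tokens, labels = mask_smiles("CC(=O)Oc1ccccc1C(=O)O")
```

Because libraries of unlabeled SMILES strings are abundant (up to 77M compounds from PubChem in this work), this corruption-and-recovery setup supplies effectively unlimited training signal without any property labels.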

Results

| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Molecular Property Prediction | ClinTox | Molecules (M) | 77 | ChemBERTa-2 (MTR-77M) |
| Molecular Property Prediction | ClinTox | ROC-AUC | 56.3 | ChemBERTa-2 (MTR-77M) |
| Molecular Property Prediction | Clearance | RMSE | 48.515 | ChemBERTa-2 (MTR-77M) |
| Molecular Property Prediction | Lipophilicity | RMSE | 0.798 | ChemBERTa-2 (MTR-77M) |
| Molecular Property Prediction | BBBP | ROC-AUC | 72.8 | ChemBERTa-2 (MTR-77M) |
| Molecular Property Prediction | BACE | RMSE | 1.363 | ChemBERTa-2 (MTR-77M) |
| Molecular Property Prediction | BACE | ROC-AUC | 79.9 | ChemBERTa-2 (MTR-77M) |
| Molecular Property Prediction | ESOL | RMSE | 0.889 | ChemBERTa-2 (MTR-77M) |
| Atomistic Description | ClinTox | Molecules (M) | 77 | ChemBERTa-2 (MTR-77M) |
| Atomistic Description | ClinTox | ROC-AUC | 56.3 | ChemBERTa-2 (MTR-77M) |
| Atomistic Description | Clearance | RMSE | 48.515 | ChemBERTa-2 (MTR-77M) |
| Atomistic Description | Lipophilicity | RMSE | 0.798 | ChemBERTa-2 (MTR-77M) |
| Atomistic Description | BBBP | ROC-AUC | 72.8 | ChemBERTa-2 (MTR-77M) |
| Atomistic Description | BACE | RMSE | 1.363 | ChemBERTa-2 (MTR-77M) |
| Atomistic Description | BACE | ROC-AUC | 79.9 | ChemBERTa-2 (MTR-77M) |
| Atomistic Description | ESOL | RMSE | 0.889 | ChemBERTa-2 (MTR-77M) |
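The table mixes two metric families: ROC-AUC for the classification benchmarks (ClinTox, BBBP, BACE) and RMSE for the regression benchmarks (Clearance, Lipophilicity, ESOL). For reference, both can be computed from scratch; this is a small illustrative sketch, not the evaluation code used in the paper.

```python
import math

def rmse(y_true, y_pred):
    """Root-mean-square error: lower is better for regression tasks."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

def roc_auc(y_true, scores):
    """ROC-AUC via its rank (Mann-Whitney) formulation: the probability
    that a randomly chosen positive is scored above a randomly chosen
    negative, counting ties as half. Higher is better for classification."""
    pos = [s for y, s in zip(y_true, scores) if y == 1]
    neg = [s for y, s in zip(y_true, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

Note that PWC-style tables often report ROC-AUC scaled to 0-100 (as above), while implementations like the one here return values in [0, 1].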

Related Papers

A Semi-Supervised Learning Method for the Identification of Bad Exposures in Large Imaging Surveys (2025-07-17)
Self-supervised Learning on Camera Trap Footage Yields a Strong Universal Face Embedder (2025-07-14)
Speech Quality Assessment Model Based on Mixture of Experts: System-Level Performance Enhancement and Utterance-Level Challenge Analysis (2025-07-08)
Acquiring and Adapting Priors for Novel Tasks via Neural Meta-Architectures (2025-07-07)
Combining Graph Neural Networks and Mixed Integer Linear Programming for Molecular Inference under the Two-Layered Model (2025-07-05)
World4Drive: End-to-End Autonomous Driving via Intention-aware Physical Latent World Model (2025-07-01)
ShapeEmbed: a self-supervised learning framework for 2D contour quantification (2025-07-01)
RetFiner: A Vision-Language Refinement Scheme for Retinal Foundation Models (2025-06-27)