
Mol-LLM: Multimodal Generalist Molecular LLM with Improved Graph Utilization

Chanhui Lee, Hanbum Ko, Yuheon Song, Yongjun Jeong, Rodrigo Hormazabal, Sehui Han, Kyunghoon Bae, Sungbin Lim, Sungwoong Kim

2025-02-05 · Molecular Property Prediction · Prediction · Chemical Reaction Prediction · Molecule Captioning

Abstract

Recent advances in large language models (LLMs) have led to models that tackle diverse molecular tasks, such as chemical reaction prediction and molecular property prediction. Large-scale molecular instruction-tuning datasets have enabled sequence-only (e.g., SMILES or SELFIES) generalist molecular LLMs, and researchers are now exploring multimodal approaches that incorporate molecular structural information for further gains. However, a genuinely multimodal, generalist LLM that covers a broad spectrum of molecular tasks has yet to be fully investigated. We observe that naive next-token prediction training ignores graph-structural information, limiting an LLM's ability to exploit molecular graphs. To address this, we propose (i) Molecular structure Preference Optimization (MolPO), which facilitates graph usage by optimizing preferences between pairs of correct and perturbed molecular structures, and (ii) an advanced graph encoder with a tailored pre-training strategy to improve the effect of graph utilization by MolPO. Building on these contributions, we introduce Mol-LLM, the first multimodal generalist model that (a) handles a broad spectrum of molecular tasks among molecular LLMs, (b) explicitly leverages molecular-structure information, and (c) takes advantage of extensive instruction tuning. Mol-LLM attains state-of-the-art or comparable results across the most comprehensive molecular-LLM benchmark, even on out-of-distribution datasets for reaction and property prediction, where it surpasses prior generalist molecular LLMs by a large margin.
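
The abstract describes MolPO only at a high level. As a rough sketch of what a pairwise preference objective over correct vs. perturbed structures could look like, the snippet below writes it as a DPO-style loss in which the "chosen" sample conditions on the correct molecular graph and the "rejected" sample on a perturbed one. This is an illustration under assumptions, not the paper's implementation: the function name `molpo_style_preference_loss`, its signature, and the `beta` value are all hypothetical, and it presumes per-sequence log-probabilities have already been computed by a policy model and a frozen reference model.

```python
import torch
import torch.nn.functional as F

def molpo_style_preference_loss(
    logp_correct: torch.Tensor,      # policy log-prob of the answer given the correct graph
    logp_perturbed: torch.Tensor,    # policy log-prob of the answer given the perturbed graph
    ref_logp_correct: torch.Tensor,  # same quantity under a frozen reference model
    ref_logp_perturbed: torch.Tensor,
    beta: float = 0.1,               # preference temperature (illustrative default)
) -> torch.Tensor:
    """Hypothetical DPO-style pairwise loss rewarding reliance on the true graph.

    Minimizing the loss widens the log-likelihood margin between the answer
    conditioned on the correct structure and the same answer conditioned on a
    perturbed structure.
    """
    chosen_logratio = logp_correct - ref_logp_correct
    rejected_logratio = logp_perturbed - ref_logp_perturbed
    margin = beta * (chosen_logratio - rejected_logratio)
    # -log(sigmoid(margin)) == softplus(-margin): numerically stable form
    return F.softplus(-margin).mean()

if __name__ == "__main__":
    # Toy per-sequence log-probabilities, purely for demonstration
    lp_c = torch.tensor([-12.3, -9.8])
    lp_p = torch.tensor([-12.9, -10.1])
    ref_c = torch.tensor([-12.5, -10.0])
    ref_p = torch.tensor([-12.6, -10.0])
    print(molpo_style_preference_loss(lp_c, lp_p, ref_c, ref_p).item())
```

In a setup like this, the perturbed structure would be generated by corrupting the input graph (e.g., editing bonds or atoms) while keeping the target answer fixed, so the only way to earn the margin is to actually use the graph input.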

Results

| Task | Dataset | Metric | Value | Model |
| --- | --- | --- | --- | --- |
| Molecule Captioning | ChEBI-20 | BLEU-2 | 58.7 | Mol-LLM (SELFIES) |
| Molecule Captioning | ChEBI-20 | BLEU-4 | 51.5 | Mol-LLM (SELFIES) |
| Molecule Captioning | ChEBI-20 | METEOR | 61.7 | Mol-LLM (SELFIES) |
| Molecule Captioning | ChEBI-20 | ROUGE-1 | 62.7 | Mol-LLM (SELFIES) |
| Molecule Captioning | ChEBI-20 | ROUGE-2 | 48.7 | Mol-LLM (SELFIES) |
| Molecule Captioning | ChEBI-20 | ROUGE-L | 57.1 | Mol-LLM (SELFIES) |
| Molecule Captioning | ChEBI-20 | BLEU-2 | 56 | Mol-LLM |
| Molecule Captioning | ChEBI-20 | BLEU-4 | 49 | Mol-LLM |
| Molecule Captioning | ChEBI-20 | METEOR | 59.3 | Mol-LLM |
| Molecule Captioning | ChEBI-20 | ROUGE-1 | 52.4 | Mol-LLM |
| Molecule Captioning | ChEBI-20 | ROUGE-2 | 37 | Mol-LLM |
| Molecule Captioning | ChEBI-20 | ROUGE-L | 46.7 | Mol-LLM |
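
The captioning scores above are standard n-gram overlap metrics. For reference, here is a minimal sketch of how BLEU-2 and ROUGE-L are typically computed for a predicted caption against a reference, using the `nltk` and `rouge_score` packages; the example sentences and resulting values are illustrative and unrelated to the numbers reported in the table.

```python
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction
from rouge_score import rouge_scorer

reference = "The molecule is a monocarboxylic acid found in citrus fruits."
candidate = "The molecule is a carboxylic acid present in citrus fruits."

# BLEU-2: geometric mean of unigram and bigram precision (weights 0.5/0.5)
bleu2 = sentence_bleu(
    [reference.split()],
    candidate.split(),
    weights=(0.5, 0.5),
    smoothing_function=SmoothingFunction().method1,
)

# ROUGE-L: longest-common-subsequence F-measure
scorer = rouge_scorer.RougeScorer(["rougeL"], use_stemmer=True)
rouge_l = scorer.score(reference, candidate)["rougeL"].fmeasure

print(f"BLEU-2: {bleu2:.3f}, ROUGE-L: {rouge_l:.3f}")
```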

Related Papers

- Multi-Strategy Improved Snake Optimizer Accelerated CNN-LSTM-Attention-Adaboost for Trajectory Prediction (2025-07-21)
- Generative Click-through Rate Prediction with Applications to Search Advertising (2025-07-15)
- Conformation-Aware Structure Prediction of Antigen-Recognizing Immune Proteins (2025-07-11)
- Foundation models for time series forecasting: Application in conformal prediction (2025-07-09)
- Predicting Graph Structure via Adapted Flux Balance Analysis (2025-07-08)
- Speech Quality Assessment Model Based on Mixture of Experts: System-Level Performance Enhancement and Utterance-Level Challenge Analysis (2025-07-08)
- A Wireless Foundation Model for Multi-Task Prediction (2025-07-08)
- Acquiring and Adapting Priors for Novel Tasks via Neural Meta-Architectures (2025-07-07)