{"title":"融合领域知识与微调大语言模型增强分子性质预测。","authors":"Liangxu Xie,Yingdi Jin,Lei Xu,Shan Chang,Xiaojun Xu","doi":"10.1021/acs.jctc.5c00605","DOIUrl":null,"url":null,"abstract":"Although large language models (LLMs) have flourished in various scientific applications, their applications in the specific task of molecular property prediction have not reached a satisfactory level, even for the specific chemistry LLMs. This work addresses a highly crucial and significant challenge existing in the field of drug discovery: accurately predicting the molecular properties by effectively leveraging LLMs enhanced with profound domain knowledge. We propose a Knowledge-Fused Large Language Model for dual-Modality (KFLM2) learning for molecular property prediction. The aim is to utilize the capabilities of advanced LLMs, strengthened with specialized knowledge in the field of drug discovery. We identified DeepSeek-R1-Distill-Qwen-1.5B as the optimal base model from three DeepSeek-R1 distilled LLMs and one chemistry LLM named ChemDFM, by fine-tuning with the ZINC and ChEMBL datasets. We obtained the SMILES embeddings from the fine-tuned model and subsequently integrated the embeddings with the molecular graph to leverage complementary information for predicting molecular properties. Finally, we trained the hybrid neural network on the combined dual modality inputs and predicted the molecular properties. Through benchmarking on regression and classification tasks, our proposed method can obtain higher prediction performance for nine out of ten datasets in the downstream regression and classification tasks. Visualization of the output of hidden layers indicates that the combination of the embedding with the molecular graph can offer complementary information to further improve the prediction accuracy compared with either the LLM embedding or the molecular graph inputs. Larger models do not inherently guarantee superior performance; instead, their effectiveness hinges on our ability to leverage relevant knowledge from both pretraining and fine-tuning. Implementing LLMs with domain knowledge would be a rational approach to making precise predictions that could potentially revolutionize the process of drug development and discovery.","PeriodicalId":45,"journal":{"name":"Journal of Chemical Theory and Computation","volume":"4 1","pages":""},"PeriodicalIF":5.7000,"publicationDate":"2025-07-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Fusing Domain Knowledge with a Fine-Tuned Large Language Model for Enhanced Molecular Property Prediction.\",\"authors\":\"Liangxu Xie,Yingdi Jin,Lei Xu,Shan Chang,Xiaojun Xu\",\"doi\":\"10.1021/acs.jctc.5c00605\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Although large language models (LLMs) have flourished in various scientific applications, their applications in the specific task of molecular property prediction have not reached a satisfactory level, even for the specific chemistry LLMs. This work addresses a highly crucial and significant challenge existing in the field of drug discovery: accurately predicting the molecular properties by effectively leveraging LLMs enhanced with profound domain knowledge. We propose a Knowledge-Fused Large Language Model for dual-Modality (KFLM2) learning for molecular property prediction. The aim is to utilize the capabilities of advanced LLMs, strengthened with specialized knowledge in the field of drug discovery. 
We identified DeepSeek-R1-Distill-Qwen-1.5B as the optimal base model from three DeepSeek-R1 distilled LLMs and one chemistry LLM named ChemDFM, by fine-tuning with the ZINC and ChEMBL datasets. We obtained the SMILES embeddings from the fine-tuned model and subsequently integrated the embeddings with the molecular graph to leverage complementary information for predicting molecular properties. Finally, we trained the hybrid neural network on the combined dual modality inputs and predicted the molecular properties. Through benchmarking on regression and classification tasks, our proposed method can obtain higher prediction performance for nine out of ten datasets in the downstream regression and classification tasks. Visualization of the output of hidden layers indicates that the combination of the embedding with the molecular graph can offer complementary information to further improve the prediction accuracy compared with either the LLM embedding or the molecular graph inputs. Larger models do not inherently guarantee superior performance; instead, their effectiveness hinges on our ability to leverage relevant knowledge from both pretraining and fine-tuning. Implementing LLMs with domain knowledge would be a rational approach to making precise predictions that could potentially revolutionize the process of drug development and discovery.\",\"PeriodicalId\":45,\"journal\":{\"name\":\"Journal of Chemical Theory and Computation\",\"volume\":\"4 1\",\"pages\":\"\"},\"PeriodicalIF\":5.7000,\"publicationDate\":\"2025-07-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Chemical Theory and Computation\",\"FirstCategoryId\":\"92\",\"ListUrlMain\":\"https://doi.org/10.1021/acs.jctc.5c00605\",\"RegionNum\":1,\"RegionCategory\":\"化学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"CHEMISTRY, PHYSICAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Chemical Theory and Computation","FirstCategoryId":"92","ListUrlMain":"https://doi.org/10.1021/acs.jctc.5c00605","RegionNum":1,"RegionCategory":"化学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"CHEMISTRY, PHYSICAL","Score":null,"Total":0}
Fusing Domain Knowledge with a Fine-Tuned Large Language Model for Enhanced Molecular Property Prediction.
Although large language models (LLMs) have flourished in various scientific applications, their applications in the specific task of molecular property prediction have not reached a satisfactory level, even for the specific chemistry LLMs. This work addresses a highly crucial and significant challenge existing in the field of drug discovery: accurately predicting the molecular properties by effectively leveraging LLMs enhanced with profound domain knowledge. We propose a Knowledge-Fused Large Language Model for dual-Modality (KFLM2) learning for molecular property prediction. The aim is to utilize the capabilities of advanced LLMs, strengthened with specialized knowledge in the field of drug discovery. We identified DeepSeek-R1-Distill-Qwen-1.5B as the optimal base model from three DeepSeek-R1 distilled LLMs and one chemistry LLM named ChemDFM, by fine-tuning with the ZINC and ChEMBL datasets. We obtained the SMILES embeddings from the fine-tuned model and subsequently integrated the embeddings with the molecular graph to leverage complementary information for predicting molecular properties. Finally, we trained the hybrid neural network on the combined dual modality inputs and predicted the molecular properties. Through benchmarking on regression and classification tasks, our proposed method can obtain higher prediction performance for nine out of ten datasets in the downstream regression and classification tasks. Visualization of the output of hidden layers indicates that the combination of the embedding with the molecular graph can offer complementary information to further improve the prediction accuracy compared with either the LLM embedding or the molecular graph inputs. Larger models do not inherently guarantee superior performance; instead, their effectiveness hinges on our ability to leverage relevant knowledge from both pretraining and fine-tuning. Implementing LLMs with domain knowledge would be a rational approach to making precise predictions that could potentially revolutionize the process of drug development and discovery.
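For readers who want a concrete starting point, the embedding-extraction step described in the abstract (fine-tune a causal LLM on SMILES corpora, then read out fixed-size SMILES embeddings) could be sketched with the HuggingFace transformers API as below. This is a minimal sketch under stated assumptions: the checkpoint path is a placeholder rather than a released model, the function name smiles_embedding is hypothetical, and mask-aware mean pooling over the last hidden layer is one common readout, not necessarily the one used in the paper.

```python
# Minimal sketch: extract a fixed-size embedding for one SMILES string
# from a (fine-tuned) causal LLM. The checkpoint path is a placeholder,
# not the authors' released model; the mean-pooling readout is an assumption.
import torch
from transformers import AutoModel, AutoTokenizer

CHECKPOINT = "path/to/finetuned-deepseek-r1-distill-qwen-1.5b"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
model = AutoModel.from_pretrained(CHECKPOINT)
model.eval()

def smiles_embedding(smiles: str) -> torch.Tensor:
    """Return a fixed-size embedding for one SMILES string."""
    inputs = tokenizer(smiles, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, d_model)
    # Mask-aware mean pooling over token positions -> (d_model,)
    mask = inputs["attention_mask"].unsqueeze(-1)   # (1, seq_len, 1)
    return (hidden * mask).sum(dim=1).squeeze(0) / mask.sum()

emb = smiles_embedding("CCO")  # ethanol
print(emb.shape)
```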
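The dual-modality fusion step, combining the LLM-derived SMILES embedding with a molecular-graph representation before a shared prediction head, might look like the following toy sketch. The RDKit atom featurization, the single neighbor-averaging step, the names graph_embedding and DualModalityHead, and all layer sizes are illustrative assumptions; only the concatenate-then-predict pattern follows the abstract. The hidden size 1536 matches the reported width of the 1.5B Qwen distill, but verify it against the actual checkpoint.

```python
# Toy sketch of the dual-modality idea: concatenate an LLM-derived SMILES
# embedding with a graph-level vector and feed both to a small MLP head.
# Featurization and dimensions are illustrative, not the paper's architecture.
import torch
import torch.nn as nn
from rdkit import Chem

def graph_embedding(smiles: str, dim: int = 16) -> torch.Tensor:
    """One round of neighbor averaging over simple atom features."""
    mol = Chem.MolFromSmiles(smiles)
    adj = torch.tensor(Chem.GetAdjacencyMatrix(mol), dtype=torch.float)
    # One-hot atomic number (capped at `dim`) as a toy node feature.
    feats = torch.zeros(mol.GetNumAtoms(), dim)
    for i, atom in enumerate(mol.GetAtoms()):
        feats[i, min(atom.GetAtomicNum(), dim - 1)] = 1.0
    # Single message-passing step: add degree-normalized neighbor features.
    deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
    feats = feats + adj @ feats / deg
    return feats.mean(dim=0)  # mean readout over atoms -> (dim,)

class DualModalityHead(nn.Module):
    """Concatenate the two modality vectors, then predict with an MLP."""
    def __init__(self, llm_dim: int, graph_dim: int, hidden: int = 64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(llm_dim + graph_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),  # one output: regression value or logit
        )

    def forward(self, llm_emb: torch.Tensor, graph_emb: torch.Tensor):
        return self.mlp(torch.cat([llm_emb, graph_emb], dim=-1))

head = DualModalityHead(llm_dim=1536, graph_dim=16)
pred = head(torch.randn(1536), graph_embedding("CCO"))
```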
Journal Introduction:
The Journal of Chemical Theory and Computation invites new and original contributions with the understanding that, if accepted, they will not be published elsewhere. Papers reporting new theories, methodology, and/or important applications in quantum electronic structure, molecular dynamics, and statistical mechanics are appropriate for submission to this Journal. Specific topics include advances in or applications of ab initio quantum mechanics, density functional theory, design and properties of new materials, surface science, Monte Carlo simulations, solvation models, QM/MM calculations, biomolecular structure prediction, and molecular dynamics in the broadest sense including gas-phase dynamics, ab initio dynamics, biomolecular dynamics, and protein folding. The Journal does not consider papers that are straightforward applications of known methods including DFT and molecular dynamics. The Journal favors submissions that include advances in theory or methodology with applications to compelling problems.