CTsynther: Contrastive Transformer model for end-to-end retrosynthesis prediction.

IF 3.6 | CAS Biology, Tier 3 | JCR Q2, Biochemical Research Methods
Hao Lu, Zhiqiang Wei, Kun Zhang, Xuze Wang, Liaqat Ali, Hao Liu
{"title":"CTsynther: Contrastive Transformer model for end-to-end retrosynthesis prediction.","authors":"Hao Lu, Zhiqiang Wei, Kun Zhang, Xuze Wang, Liaqat Ali, Hao Liu","doi":"10.1109/TCBB.2024.3455381","DOIUrl":null,"url":null,"abstract":"<p><p>Retrosynthesis prediction is a fundamental problem in organic chemistry and drug synthesis. We proposed an end-to-end deep learning model called CTsynther (Contrastive Transformer for single-step retrosynthesis prediction model) that could provide single-step retrosynthesis prediction without external reaction templates or specialized knowledge. The model introduced the concept of contrastive learning in Transformer architecture and employed a contrastive learning language representation model at the SMILES sentence level to enhance model inference by learning similarities and differences between various samples. Mixed global and local attention mechanisms allow the model to capture features and dependencies between different atoms to improve generalization. We further investigated the embedding representations of SMILES learned automatically from the model. Visualization results show that the model could effectively acquire information about identical molecules and improve prediction performance. Experiments showed that the accuracy of retrosynthesis reached 53.5% and 64.4% for with and without reaction types, respectively. The validity of the predicted reactants is improved, showing competitiveness compared with semi-template methods.</p>","PeriodicalId":13344,"journal":{"name":"IEEE/ACM Transactions on Computational Biology and Bioinformatics","volume":"PP ","pages":""},"PeriodicalIF":3.6000,"publicationDate":"2024-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE/ACM Transactions on Computational Biology and Bioinformatics","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1109/TCBB.2024.3455381","RegionNum":3,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"BIOCHEMICAL RESEARCH METHODS","Score":null,"Total":0}
Citations: 0

Abstract

Retrosynthesis prediction is a fundamental problem in organic chemistry and drug synthesis. We propose an end-to-end deep learning model, CTsynther (Contrastive Transformer for single-step retrosynthesis prediction), that performs single-step retrosynthesis prediction without external reaction templates or specialized chemical knowledge. The model introduces contrastive learning into the Transformer architecture, applying a contrastive language-representation objective at the SMILES sentence level to strengthen inference by learning similarities and differences between samples. Mixed global and local attention mechanisms allow the model to capture features and dependencies between atoms, improving generalization. We further investigate the SMILES embedding representations learned automatically by the model; visualization results show that it effectively captures information about identical molecules and improves prediction performance. In experiments, retrosynthesis accuracy reaches 53.5% and 64.4% with and without reaction types, respectively, and the validity of the predicted reactants improves, making the model competitive with semi-template methods.
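
The sentence-level contrastive objective described in the abstract can be illustrated with a small sketch. The snippet below is a minimal, hypothetical example of an NT-Xent-style contrastive loss over pooled SMILES embeddings produced by a Transformer encoder; the function names (mean_pool, ntxent_loss), the mean-pooling choice, and the temperature value are illustrative assumptions and do not reproduce the paper's exact formulation.

# Illustrative sketch only: NT-Xent-style contrastive loss over SMILES
# sentence embeddings (an assumption, not the authors' exact implementation).
import torch
import torch.nn.functional as F


def mean_pool(token_states: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    """Average Transformer token states over non-padding positions.

    token_states: (batch, seq_len, d_model); mask: (batch, seq_len), 1 for real tokens.
    """
    mask = mask.unsqueeze(-1).float()
    return (token_states * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1.0)


def ntxent_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """Contrastive loss between two views of the same SMILES batch.

    z1, z2: (batch, d) sentence-level embeddings; row i of z1 and row i of z2
    form a positive pair, and all other rows in the concatenated batch act as negatives.
    """
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    z = torch.cat([z1, z2], dim=0)            # (2B, d)
    sim = z @ z.t() / temperature             # scaled cosine similarities
    sim.fill_diagonal_(float("-inf"))         # exclude self-similarity
    batch = z1.size(0)
    # Positive index for row i is i + batch (and vice versa).
    targets = torch.cat([torch.arange(batch) + batch, torch.arange(batch)]).to(z.device)
    return F.cross_entropy(sim, targets)

In a training loop, z1 and z2 would come from two encodings of the same SMILES batch (for example, two augmented or dropout-perturbed views passed through mean_pool), so matching rows pull together while all other rows in the batch push apart.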

Source journal metrics
CiteScore: 7.50
Self-citation rate: 6.70%
Articles per year: 479
Review time: 3 months
Journal description: IEEE/ACM Transactions on Computational Biology and Bioinformatics emphasizes the algorithmic, mathematical, statistical, and computational methods that are central in bioinformatics and computational biology; the development and testing of effective computer programs in bioinformatics; the development of biological databases; important biological results obtained from the use of these methods, programs, and databases; and the emerging field of Systems Biology, where many forms of data are used to create a computer-based model of a complex biological system.