Cyclic peptide membrane permeability prediction using deep learning model based on molecular attention transformer.

IF 2.8 · Q2 (Mathematical & Computational Biology)
Frontiers in Bioinformatics · Pub Date: 2025-03-11 · eCollection Date: 2025-01-01 · DOI: 10.3389/fbinf.2025.1566174
Dawei Jiang, Zixi Chen, Hongli Du
{"title":"Cyclic peptide membrane permeability prediction using deep learning model based on molecular attention transformer.","authors":"Dawei Jiang, Zixi Chen, Hongli Du","doi":"10.3389/fbinf.2025.1566174","DOIUrl":null,"url":null,"abstract":"<p><p>Membrane permeability is a critical bottleneck in the development of cyclic peptide drugs. Experimental membrane permeability testing is costly, and precise <i>in silico</i> prediction tools are scarce. In this study, we developed CPMP (https://github.com/panda1103/CPMP), a cyclic peptide membrane permeability prediction model based on the Molecular Attention Transformer (MAT) frame. The model demonstrated robust predictive performance, achieving determination coefficients (<i>R</i> <sup><i>2</i></sup> ) of 0.67 for PAMPA permeability prediction, and <i>R</i> <sup><i>2</i></sup> values of 0.75, 0.62, and 0.73 for Caco-2, RRCK, and MDCK cell permeability predictions, respectively. Its performance outperforms traditional machine learning methods and graph-based neural network models. In ablation experiments, we validated the effectiveness of each component in the MAT architecture. Additionally, we analyzed the impact of data pre-training and cyclic peptide conformation optimization on model performance.</p>","PeriodicalId":73066,"journal":{"name":"Frontiers in bioinformatics","volume":"5 ","pages":"1566174"},"PeriodicalIF":2.8000,"publicationDate":"2025-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11933047/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Frontiers in bioinformatics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3389/fbinf.2025.1566174","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2025/1/1 0:00:00","PubModel":"eCollection","JCR":"Q2","JCRName":"MATHEMATICAL & COMPUTATIONAL BIOLOGY","Score":null,"Total":0}
Citations: 0

Abstract

Membrane permeability is a critical bottleneck in the development of cyclic peptide drugs. Experimental membrane permeability testing is costly, and precise in silico prediction tools are scarce. In this study, we developed CPMP (https://github.com/panda1103/CPMP), a cyclic peptide membrane permeability prediction model based on the Molecular Attention Transformer (MAT) framework. The model demonstrated robust predictive performance, achieving a coefficient of determination (R²) of 0.67 for PAMPA permeability prediction, and R² values of 0.75, 0.62, and 0.73 for Caco-2, RRCK, and MDCK cell permeability predictions, respectively. It outperforms traditional machine learning methods and graph-based neural network models. In ablation experiments, we validated the effectiveness of each component of the MAT architecture. Additionally, we analyzed the impact of data pre-training and cyclic peptide conformation optimization on model performance.
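For orientation, the MAT framework referenced in the abstract augments standard transformer self-attention with molecular structure: the attention weights are a mix of scaled dot-product attention, a term derived from the inter-atomic distance matrix, and the bond adjacency matrix. Below is a minimal PyTorch sketch of that molecule-attention idea as described in the original MAT paper; the function name, the mixing weights (lambda_att, lambda_dist, lambda_adj), and the distance kernel are illustrative assumptions, not CPMP's actual implementation.

import torch
import torch.nn.functional as F

def molecule_attention(q, k, v, dist, adj,
                       lambda_att=0.34, lambda_dist=0.33, lambda_adj=0.33):
    # q, k, v: (n_atoms, d) projections; dist, adj: (n_atoms, n_atoms) matrices.
    d_k = q.size(-1)
    att = F.softmax(q @ k.transpose(-2, -1) / d_k ** 0.5, dim=-1)  # standard self-attention
    dist_term = F.softmax(-dist, dim=-1)  # one possible kernel: closer atom pairs get larger weight
    mixed = lambda_att * att + lambda_dist * dist_term + lambda_adj * adj
    return mixed @ v

# Toy usage: 5 "atoms" with 8-dimensional features and placeholder geometry.
n_atoms, d_model = 5, 8
x = torch.randn(n_atoms, d_model)
coords = torch.randn(n_atoms, 3)                       # fake 3D conformer coordinates
dist = torch.cdist(coords, coords)                     # pairwise inter-atomic distances
adj = (torch.rand(n_atoms, n_atoms) > 0.5).float()     # fake bond adjacency
out = molecule_attention(x, x, x, dist, adj)
print(out.shape)  # torch.Size([5, 8])

Ablating the distance or adjacency terms (setting their lambdas to zero) is one plausible reading of the "each component of the MAT architecture" experiments the abstract mentions, though the paper itself should be consulted for the exact setup.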
