Multiscale topology-enabled structure-to-sequence transformer for protein–ligand interaction predictions

Impact factor 18.8 · CAS Region 1 (Computer Science) · JCR Q1, Computer Science, Artificial Intelligence
Dong Chen, Jian Liu, Guo-Wei Wei
Journal: Nature Machine Intelligence, vol. 6, no. 7, pp. 799–810
DOI: 10.1038/s42256-024-00855-1
Published: 21 June 2024 (Journal Article)
URL: https://www.nature.com/articles/s42256-024-00855-1
Citations: 0

Abstract


Despite the success of pretrained natural language processing (NLP) models in various fields, their application in computational biology has been hindered by their reliance on biological sequences, which ignores vital three-dimensional (3D) structural information incompatible with the sequential architecture of NLP models. Here we present a topological transformer (TopoFormer), which is built by integrating NLP models and a multiscale topology technique, the persistent topological hyperdigraph Laplacian (PTHL), which systematically converts intricate 3D protein–ligand complexes at various spatial scales into an NLP-admissible sequence of topological invariants and homotopic shapes. PTHL systematically transforms intricate 3D protein–ligand complexes into NLP-compatible sequences of topological invariants and shapes, capturing essential interactions across spatial scales. TopoFormer gives rise to exemplary scoring accuracy and excellent performance in ranking, docking and screening tasks in several benchmark datasets. This approach can be utilized to convert general high-dimensional structured data into NLP-compatible sequences, paving the way for broader NLP based research. Transformers show much promise for applications in computational biology, but they rely on sequences, and a challenge is to incorporate 3D structural information. TopoFormer, proposed by Dong Chen et al., combines transformers with a mathematical multiscale topology technique to model 3D protein–ligand complexes, substantially enhancing performance in a range of prediction tasks of interest to drug discovery.
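The paper's persistent topological hyperdigraph Laplacian is far richer than anything reproduced here, but its core structure-to-sequence idea can be illustrated simply: sweep a geometric scale parameter over a 3D point cloud and record a topological invariant at each scale, producing a one-dimensional sequence that a transformer could consume. The sketch below (illustrative only; `betti0_sequence` and the toy coordinates are our own, not the authors' code) computes Betti-0, the number of connected components, of the distance graph across a filtration of radii using a union-find structure:

```python
from itertools import combinations
from math import dist

def betti0_sequence(points, radii):
    """Betti-0 (number of connected components) of the distance graph at
    each filtration radius: a crude multiscale topological descriptor."""
    n = len(points)
    parent = list(range(n))

    def find(i):                      # union-find with path compression
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # One edge per point pair, sorted by distance: growing the radius then
    # amounts to sweeping this list once (a Vietoris-Rips-style filtration).
    edges = sorted((dist(points[i], points[j]), i, j)
                   for i, j in combinations(range(n), 2))
    seq, k, components = [], 0, n
    for r in sorted(radii):
        while k < len(edges) and edges[k][0] <= r:
            _, i, j = edges[k]
            ri, rj = find(i), find(j)
            if ri != rj:
                parent[ri] = rj        # merge the two components
                components -= 1
            k += 1
        seq.append(components)
    return seq

# Toy "complex": two tight clusters of atoms about 3 Å apart. At small radii
# each cluster is its own component; at a large radius everything merges.
atoms = [(0.0, 0.0, 0.0), (0.3, 0.0, 0.0), (3.0, 0.0, 0.0), (3.3, 0.0, 0.0)]
print(betti0_sequence(atoms, [0.5, 1.5, 5.0]))  # [2, 2, 1]
```

The resulting integer sequence is scale-ordered, so it can be embedded and fed to a sequence model the way tokens are; the published method additionally tracks higher-order invariants, directed hypergraph spectra and element-specific interactions, which this sketch omits.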
Source journal: Nature Machine Intelligence
CiteScore: 36.90
Self-citation rate: 2.10%
Articles per year: 127
Journal description: Nature Machine Intelligence is a distinguished publication that presents original research and reviews on various topics in machine learning, robotics, and AI. Our focus extends beyond these fields, exploring their profound impact on other scientific disciplines, as well as societal and industrial aspects. We recognize limitless possibilities wherein machine intelligence can augment human capabilities and knowledge in domains like scientific exploration, healthcare, medical diagnostics, and the creation of safe and sustainable cities, transportation, and agriculture. Simultaneously, we acknowledge the emergence of ethical, social, and legal concerns due to the rapid pace of advancements. To foster interdisciplinary discussions on these far-reaching implications, Nature Machine Intelligence serves as a platform for dialogue facilitated through Comments, News Features, News & Views articles, and Correspondence. Our goal is to encourage a comprehensive examination of these subjects. Similar to all Nature-branded journals, Nature Machine Intelligence operates under the guidance of a team of skilled editors. We adhere to a fair and rigorous peer-review process, ensuring high standards of copy-editing and production, swift publication, and editorial independence.