VeGA: A Versatile Generative Architecture for Bioactive Molecules across Multiple Therapeutic Targets.

Pietro Delre, Antonio Lavecchia
Journal of Chemical Information and Modeling · DOI: 10.1021/acs.jcim.5c01606
Published: 2025-10-02 · Impact Factor: 5.3 · JCR Q1 (Chemistry, Medicinal) · CAS Region 2 (Chemistry)
Citations: 0

Abstract

In this paper, we present VeGA, a lightweight, decoder-only Transformer model for de novo molecular design. VeGA balances a streamlined architecture with robust generative performance, making it highly efficient and well-suited for resource-limited environments. Pretrained on ChEMBL, the model demonstrates strong performance against cutting-edge approaches, achieving high validity (96.6%) and novelty (93.6%), ranking among the top performers in the MOSES benchmark. The model's main strength lies in target-specific fine-tuning under challenging, data-scarce conditions. In a rigorous, leakage-safe evaluation across five pharmacological targets against state-of-the-art models (S4, R4), VeGA proved to be a powerful "explorer" that consistently generated the most novel molecules while maintaining a strong balance between discovery performance and chemical realism. This capability is particularly evident in the extremely low-data scenario of mTORC1, where VeGA achieved top-tier results. As a case study, VeGA was applied to the Farnesoid X receptor (FXR), generating novel compounds with validated binding potential through molecular docking. The model is available as an open-access platform to support medicinal chemists in designing novel, target-specific chemotypes (https://github.com/piedelre93/VeGA-for-de-novo-design). Future developments will focus on incorporating conditioning strategies for multiobjective optimization and integrating experimental in vitro validation workflows.
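The abstract reports MOSES-benchmark validity (96.6%) and novelty (93.6%). As context for those figures, the sketch below shows how these two metrics are conventionally computed with RDKit under the standard MOSES definitions (validity = fraction of generated SMILES that parse; novelty = fraction of valid, canonicalized molecules absent from the training set). This is a minimal illustration, not code from the VeGA repository, and the example molecule lists are hypothetical placeholders.

```python
# Minimal sketch of MOSES-style validity and novelty, assuming the standard
# definitions; not taken from the VeGA codebase.
from rdkit import Chem

def canonical(smiles: str):
    """Return the canonical SMILES string, or None if the input does not parse."""
    mol = Chem.MolFromSmiles(smiles)
    return Chem.MolToSmiles(mol) if mol is not None else None

def validity_and_novelty(generated, training_set):
    """Compute (validity, novelty) for a list of generated SMILES strings."""
    valid = [c for c in (canonical(s) for s in generated) if c is not None]
    validity = len(valid) / len(generated) if generated else 0.0
    train_canon = {c for c in (canonical(s) for s in training_set) if c is not None}
    novel = [c for c in valid if c not in train_canon]
    novelty = len(novel) / len(valid) if valid else 0.0
    return validity, novelty

# Hypothetical usage with placeholder molecules.
generated = ["CCO", "c1ccccc1O", "not_a_smiles"]
training = ["CCO", "CCN"]
print(validity_and_novelty(generated, training))  # -> (0.667, 0.5)
```

Canonicalizing both sets before comparison avoids counting a generated molecule as novel merely because its SMILES string is written differently from the training entry.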
Source journal metrics
CiteScore: 9.80
Self-citation rate: 10.70%
Articles published: 529
Average review time: 1.4 months
Journal description: The Journal of Chemical Information and Modeling publishes papers reporting new methodology and/or important applications in the fields of chemical informatics and molecular modeling. Specific topics include the representation and computer-based searching of chemical databases, molecular modeling, computer-aided molecular design of new materials, catalysts, or ligands, development of new computational methods or efficient algorithms for chemical software, and biopharmaceutical chemistry including analyses of biological activity and other issues related to drug discovery. Astute chemists, computer scientists, and information specialists look to this monthly's insightful research studies, programming innovations, and software reviews to keep current with advances in this integral, multidisciplinary field. As a subscriber you'll stay abreast of database search systems, use of graph theory in chemical problems, substructure search systems, pattern recognition and clustering, analysis of chemical and physical data, molecular modeling, graphics and natural language interfaces, bibliometric and citation analysis, and synthesis design and reactions databases.