DLKcat cannot predict meaningful kcat values for mutants and unfamiliar enzymes.

Biology Methods and Protocols | IF 2.5 | Q3 (Biochemical Research Methods)
Pub Date: 2024-08-24 | eCollection Date: 2024-01-01 | DOI: 10.1093/biomethods/bpae061
Alexander Kroll, Martin J Lercher
{"title":"DLKcat cannot predict meaningful <i>k</i> <sub>cat</sub> values for mutants and unfamiliar enzymes.","authors":"Alexander Kroll, Martin J Lercher","doi":"10.1093/biomethods/bpae061","DOIUrl":null,"url":null,"abstract":"<p><p>The recently published DLKcat model, a deep learning approach for predicting enzyme turnover numbers (<i>k</i> <sub>cat</sub>), claims to enable high-throughput <i>k</i> <sub>cat</sub> predictions for metabolic enzymes from any organism and to capture <i>k</i> <sub>cat</sub> changes for mutated enzymes. Here, we critically evaluate these claims. We show that for enzymes with <60% sequence identity to the training data DLKcat predictions become worse than simply assuming a constant average <i>k</i> <sub>cat</sub> value for all reactions. Furthermore, DLKcat's ability to predict mutation effects is much weaker than implied, capturing none of the experimentally observed variation across mutants not included in the training data. These findings highlight significant limitations in DLKcat's generalizability and its practical utility for predicting <i>k</i> <sub>cat</sub> values for novel enzyme families or mutants, which are crucial applications in fields such as metabolic modeling.</p>","PeriodicalId":36528,"journal":{"name":"Biology Methods and Protocols","volume":null,"pages":null},"PeriodicalIF":2.5000,"publicationDate":"2024-08-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11427335/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Biology Methods and Protocols","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1093/biomethods/bpae061","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/1/1 0:00:00","PubModel":"eCollection","JCR":"Q3","JCRName":"BIOCHEMICAL RESEARCH METHODS","Score":null,"Total":0}
Citations: 0

Abstract

The recently published DLKcat model, a deep learning approach for predicting enzyme turnover numbers (kcat), claims to enable high-throughput kcat predictions for metabolic enzymes from any organism and to capture kcat changes for mutated enzymes. Here, we critically evaluate these claims. We show that for enzymes with <60% sequence identity to the training data, DLKcat predictions become worse than simply assuming a constant average kcat value for all reactions. Furthermore, DLKcat's ability to predict mutation effects is much weaker than implied, capturing none of the experimentally observed variation across mutants not included in the training data. These findings highlight significant limitations in DLKcat's generalizability and its practical utility for predicting kcat values for novel enzyme families or mutants, which are crucial applications in fields such as metabolic modeling.
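The central comparison described in the abstract, whether DLKcat outperforms a naive baseline that assigns the same constant average kcat to every reaction once test enzymes are stratified by sequence identity to the training data, can be illustrated with a short evaluation sketch. This is not the authors' code: the file names, column names, and identity bins below are hypothetical placeholders, and kcat values are assumed to be compared on a log10 scale.

```python
# Illustrative sketch (hypothetical data layout, not the authors' pipeline):
# compare DLKcat-style predictions against a constant-mean baseline,
# with test enzymes binned by maximum sequence identity to the training set.
import numpy as np
import pandas as pd
from sklearn.metrics import mean_squared_error

# Hypothetical files: training kcat values and per-enzyme test predictions.
train = pd.read_csv("train_kcat.csv")        # column: log10_kcat
test = pd.read_csv("test_predictions.csv")   # columns: log10_kcat_true,
                                             #          log10_kcat_pred,
                                             #          max_identity (in %)

# Baseline: predict the same value (training-set mean log10 kcat) for every enzyme.
baseline = train["log10_kcat"].mean()

# Stratify test enzymes by maximum sequence identity to any training enzyme.
bins = [(0, 40), (40, 60), (60, 80), (80, 101)]
for lo, hi in bins:
    sub = test[(test["max_identity"] >= lo) & (test["max_identity"] < hi)]
    if sub.empty:
        continue
    mse_model = mean_squared_error(sub["log10_kcat_true"], sub["log10_kcat_pred"])
    mse_base = mean_squared_error(sub["log10_kcat_true"],
                                  np.full(len(sub), baseline))
    print(f"identity {lo}-{hi}%: model MSE = {mse_model:.2f}, "
          f"baseline MSE = {mse_base:.2f} "
          f"({'model worse' if mse_model > mse_base else 'model better'})")
```

Under this kind of comparison, the pattern reported in the abstract would appear as baseline MSE at or below model MSE in the bins with less than 60% identity to the training data.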

Source journal
Biology Methods and Protocols
Agricultural and Biological Sciences - Agricultural and Biological Sciences (all)
CiteScore: 3.80
Self-citation rate: 2.80%
Articles published: 28
Review time: 19 weeks