Simulating 500 million years of evolution with a language model

Impact Factor: 44.7 · CAS Tier 1, Multidisciplinary · JCR Q1, Multidisciplinary Sciences
Science · Publication date: 2025-01-16 · DOI: 10.1126/science.ads0018
Thomas Hayes, Roshan Rao, Halil Akin, Nicholas J. Sofroniew, Deniz Oktay, Zeming Lin, Robert Verkuil, Vincent Q. Tran, Jonathan Deaton, Marius Wiggert, Rohil Badkundri, Irhum Shafkat, Jun Gong, Alexander Derry, Raul S. Molina, Neil Thomas, Yousuf A. Khan, Chetan Mishra, Carolyn Kim, Liam J. Bartie, Matthew Nemeth, Patrick D. Hsu, Tom Sercu, Salvatore Candido, Alexander Rives
{"title":"Simulating 500 million years of evolution with a language model","authors":"Thomas Hayes,&nbsp;Roshan Rao,&nbsp;Halil Akin,&nbsp;Nicholas J. Sofroniew,&nbsp;Deniz Oktay,&nbsp;Zeming Lin,&nbsp;Robert Verkuil,&nbsp;Vincent Q. Tran,&nbsp;Jonathan Deaton,&nbsp;Marius Wiggert,&nbsp;Rohil Badkundri,&nbsp;Irhum Shafkat,&nbsp;Jun Gong,&nbsp;Alexander Derry,&nbsp;Raul S. Molina,&nbsp;Neil Thomas,&nbsp;Yousuf A. Khan,&nbsp;Chetan Mishra,&nbsp;Carolyn Kim,&nbsp;Liam J. Bartie,&nbsp;Matthew Nemeth,&nbsp;Patrick D. Hsu,&nbsp;Tom Sercu,&nbsp;Salvatore Candido,&nbsp;Alexander Rives","doi":"10.1126/science.ads0018","DOIUrl":null,"url":null,"abstract":"<div >More than 3 billion years of evolution have produced an image of biology encoded into the space of natural proteins. Here, we show that language models trained at scale on evolutionary data can generate functional proteins that are far away from known proteins. We present ESM3, a frontier multimodal generative language model that reasons over the sequence, structure, and function of proteins. ESM3 can follow complex prompts combining its modalities and is highly responsive to alignment to improve its fidelity. We have prompted ESM3 to generate fluorescent proteins. Among the generations that we synthesized, we found a bright fluorescent protein at a far distance (58% sequence identity) from known fluorescent proteins, which we estimate is equivalent to simulating 500 million years of evolution.</div>","PeriodicalId":21678,"journal":{"name":"Science","volume":"387 6736","pages":""},"PeriodicalIF":44.7000,"publicationDate":"2025-01-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Science","FirstCategoryId":"103","ListUrlMain":"https://www.science.org/doi/10.1126/science.ads0018","RegionNum":1,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MULTIDISCIPLINARY SCIENCES","Score":null,"Total":0}
Citations: 0

Abstract

More than 3 billion years of evolution have produced an image of biology encoded into the space of natural proteins. Here, we show that language models trained at scale on evolutionary data can generate functional proteins that are far away from known proteins. We present ESM3, a frontier multimodal generative language model that reasons over the sequence, structure, and function of proteins. ESM3 can follow complex prompts combining its modalities and is highly responsive to alignment to improve its fidelity. We have prompted ESM3 to generate fluorescent proteins. Among the generations that we synthesized, we found a bright fluorescent protein at a far distance (58% sequence identity) from known fluorescent proteins, which we estimate is equivalent to simulating 500 million years of evolution.
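The reported 58% sequence identity is a pairwise alignment metric: the fraction of aligned positions at which the generated protein and the nearest known fluorescent protein carry the same amino acid. The sketch below is purely illustrative and is not code from the paper; it computes percent identity over a pre-aligned pair of sequences, and the gap-handling convention it uses is an assumption that varies between alignment tools.

```python
# Minimal illustrative sketch (not from the paper): percent sequence identity
# between two pre-aligned protein sequences of equal (gapped) length.
# Convention assumed here: count a match when both residues are identical and
# not gaps; divide by the number of columns containing at least one residue.

def percent_identity(aligned_a: str, aligned_b: str) -> float:
    """Return sequence identity as a percentage over aligned columns."""
    if len(aligned_a) != len(aligned_b):
        raise ValueError("aligned sequences must have equal length")
    matches = sum(
        1 for a, b in zip(aligned_a, aligned_b)
        if a == b and a != "-"
    )
    aligned_columns = sum(
        1 for a, b in zip(aligned_a, aligned_b)
        if a != "-" or b != "-"
    )
    return 100.0 * matches / aligned_columns

# Example with two short, hypothetical aligned fragments (prints 81.8).
print(round(percent_identity("MKT-AYIAKQR", "MKTLAYVAKQR"), 1))
```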
Source Journal

Science (Multidisciplinary Sciences)
CiteScore: 61.10
Self-citation rate: 0.90%
Average review time: 2.1 months
Journal Description

Science is a leading outlet for scientific news, commentary, and cutting-edge research. Through its print and online incarnations, Science reaches an estimated worldwide readership of more than one million. Science's authorship is global, and its articles consistently rank among the world's most cited research.

Science serves as a forum for discussion of important issues related to the advancement of science by publishing material on which a consensus has been reached as well as the presentation of minority or conflicting points of view. Accordingly, all articles published in Science, including editorials, news and comment, and book reviews, are signed and reflect the individual views of the authors and not official points of view adopted by AAAS or the institutions with which the authors are affiliated.

Science seeks to publish those papers that are most influential in their fields or across fields and that will significantly advance scientific understanding. Selected papers should present novel and broadly important data, syntheses, or concepts. They should merit the recognition by the wider scientific community and general public provided by publication in Science, beyond that provided by specialty journals. Science welcomes submissions from all fields of science and from any source. The editors are committed to the prompt evaluation and publication of submitted papers while upholding high standards that support reproducibility of published research. Science is published weekly; selected papers are published online ahead of print.