Convolutions are competitive with transformers for protein sequence pretraining.

Cell Systems 286-294.e2. Published 2024-03-20 (Epub 2024-02-29). DOI: 10.1016/j.cels.2024.01.008
Kevin K Yang, Nicolo Fusi, Alex X Lu

Abstract

Pretrained protein sequence language models have been shown to improve the performance of many prediction tasks and are now routinely integrated into bioinformatics tools. However, these models largely rely on the transformer architecture, which scales quadratically with sequence length in both run-time and memory. Therefore, state-of-the-art models have limitations on sequence length. To address this limitation, we investigated whether convolutional neural network (CNN) architectures, which scale linearly with sequence length, could be as effective as transformers in protein language models. With masked language model pretraining, CNNs are competitive with, and occasionally superior to, transformers across downstream applications while maintaining strong performance on sequences longer than those allowed in the current state-of-the-art transformer models. Our work suggests that computational efficiency can be improved without sacrificing performance, simply by using a CNN architecture instead of a transformer, and emphasizes the importance of disentangling pretraining task and model architecture. A record of this paper's transparent peer review process is included in the supplemental information.
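
The abstract names two technical ingredients: a convolutional encoder, whose cost grows linearly with sequence length, and masked-language-model pretraining. The paper's actual architecture and hyperparameters are not given in this abstract, so the following is a minimal PyTorch sketch of that combination only; the dilated-convolution encoder, vocabulary handling, layer sizes, and 15% masking rate are illustrative assumptions, not the authors' published configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative vocabulary: 20 canonical amino acids plus PAD and MASK tokens.
AA_VOCAB = "ACDEFGHIKLMNPQRSTVWY"
PAD, MASK = 20, 21
VOCAB_SIZE = 22

class DilatedConvEncoder(nn.Module):
    """Character-level CNN encoder. Each layer is a dilated 1-D convolution,
    so the receptive field grows exponentially with depth while per-layer
    cost stays linear in sequence length."""
    def __init__(self, d_model=128, n_layers=6, kernel_size=5):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, d_model, padding_idx=PAD)
        self.layers = nn.ModuleList()
        for i in range(n_layers):
            dilation = 2 ** i
            padding = (kernel_size - 1) // 2 * dilation  # keep length fixed
            self.layers.append(
                nn.Conv1d(d_model, d_model, kernel_size,
                          padding=padding, dilation=dilation)
            )
        self.lm_head = nn.Linear(d_model, VOCAB_SIZE)

    def forward(self, tokens):                   # tokens: (batch, length)
        x = self.embed(tokens).transpose(1, 2)   # -> (batch, d_model, length)
        for conv in self.layers:
            x = x + F.gelu(conv(x))              # residual dilated-conv block
        return self.lm_head(x.transpose(1, 2))   # (batch, length, vocab)

def masked_lm_step(model, tokens, mask_rate=0.15):
    """One simplified masked-LM step: corrupt ~15% of positions with MASK
    and score the model only on reconstructing those positions."""
    corrupted = tokens.clone()
    is_masked = (torch.rand_like(tokens, dtype=torch.float) < mask_rate) \
                & (tokens != PAD)
    corrupted[is_masked] = MASK
    logits = model(corrupted)
    labels = tokens.masked_fill(~is_masked, -100)  # ignore unmasked positions
    return F.cross_entropy(logits.reshape(-1, VOCAB_SIZE),
                           labels.reshape(-1), ignore_index=-100)

# Toy usage: a batch of 8 random "protein" sequences of length 1024.
model = DilatedConvEncoder()
batch = torch.randint(0, 20, (8, 1024))
loss = masked_lm_step(model, batch)
loss.backward()
```

The scaling argument from the abstract is visible in the sketch: each convolution layer costs roughly O(L * k * d^2) for sequence length L, kernel width k, and model width d, whereas a self-attention layer costs O(L^2 * d). For L much larger than k, the convolutional encoder is therefore asymptotically cheaper in both run time and memory, which is what allows it to handle sequences longer than current transformer protein language models accept.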
