Architectures, variants, and performance of neural operators: A comparative review
Shengjun Liu, Yu Yu, Ting Zhang, Hanchao Liu, Xinru Liu, Deyu Meng
Neurocomputing, Volume 648, Article 130518 (published 2025-06-06). DOI: 10.1016/j.neucom.2025.130518
Citations: 0
Abstract
In recent years, neural operators have emerged as effective alternatives to traditional numerical solvers, valued for their computational efficiency, strong generalization, and high solution accuracy, and their design and application have attracted broad research interest. This paper provides a comprehensive summary and analysis of neural operators. Based on their architecture, we categorize them into three types: deep operator networks (DeepONets), integral kernel operators, and transformer-based neural operators, and we discuss the basic structure and properties of each type. We then summarize and discuss the variants and extensions of these three types of neural operators along three directions: (1) operator basis-based variants; (2) physics-informed variants; and (3) variants applied to complex systems. We also analyze the characteristics and performance of the different operator methods through numerical experiments. Building on these discussions and analyses, we offer perspectives and suggestions on the challenges facing the different neural operators and on potential enhancements, providing guidance for their practical application and further development.
Journal Introduction:
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. The essential topics covered are neurocomputing theory, practice, and applications.