The scaling of physics-informed machine learning with data and dimensions

Scott T. Miller, John F. Lindner, Anshul Choudhary, Sudeshna Sinha, William L. Ditto

Chaos, Solitons and Fractals: X, Volume 5, Article 100046, March 2020. DOI: 10.1016/j.csfx.2020.100046

Abstract: We quantify how incorporating physics into neural network design can significantly improve the learning and forecasting of dynamical systems, even nonlinear systems of many dimensions. We train conventional and Hamiltonian neural networks on increasingly difficult dynamical systems and compute their forecasting errors as the number of training data and the number of system dimensions vary. A map-building perspective elucidates the superiority of Hamiltonian neural networks. The results clarify the critical relation among data, dimension, and neural network learning performance.
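The core idea behind the Hamiltonian approach discussed in the abstract is that, instead of learning the time derivatives directly, a network learns a scalar Hamiltonian H(q, p) whose gradients generate the dynamics via Hamilton's equations, dq/dt = ∂H/∂p and dp/dt = -∂H/∂q. The following minimal sketch illustrates that mechanism; it is not the authors' implementation. It substitutes the exact harmonic-oscillator Hamiltonian for a trained network, and the function and variable names are illustrative:

```python
import numpy as np

def H(q, p):
    """Stand-in for a trained Hamiltonian neural network.
    Here: the exact harmonic-oscillator Hamiltonian H = (p^2 + q^2)/2."""
    return 0.5 * (p**2 + q**2)

def hamiltonian_vector_field(q, p, eps=1e-6):
    """Hamilton's equations from H via central differences:
    dq/dt = dH/dp, dp/dt = -dH/dq."""
    dH_dp = (H(q, p + eps) - H(q, p - eps)) / (2 * eps)
    dH_dq = (H(q + eps, p) - H(q - eps, p)) / (2 * eps)
    return dH_dp, -dH_dq

def forecast(q, p, dt=0.01, steps=1000):
    """Forecast with symplectic (semi-implicit) Euler:
    update p using the current q, then q using the new p."""
    for _ in range(steps):
        _, dp = hamiltonian_vector_field(q, p)
        p = p + dt * dp
        dq, _ = hamiltonian_vector_field(q, p)
        q = q + dt * dq
    return q, p

q1, p1 = forecast(1.0, 0.0)
energy_drift = abs(H(q1, p1) - H(1.0, 0.0))
```

Because the forecast flows along the gradients of a single conserved scalar, the energy drift stays small over long horizons, which is one intuition for why Hamiltonian networks forecast conservative dynamics better than conventional ones that regress the vector field directly.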