Yuanyuan Li, Shuai Lu, Peter Mathé, Sergei V. Pereverzev
{"title":"具有$$\\text {ReLU}^k$$激活函数的两层网络:巴伦空间和导数逼近","authors":"Yuanyuan Li, Shuai Lu, Peter Mathé, Sergei V. Pereverzev","doi":"10.1007/s00211-023-01384-6","DOIUrl":null,"url":null,"abstract":"<p>We investigate the use of two-layer networks with the rectified power unit, which is called the <span>\\(\\text {ReLU}^k\\)</span> activation function, for function and derivative approximation. By extending and calibrating the corresponding Barron space, we show that two-layer networks with the <span>\\(\\text {ReLU}^k\\)</span> activation function are well-designed to simultaneously approximate an unknown function and its derivatives. When the measurement is noisy, we propose a Tikhonov type regularization method, and provide error bounds when the regularization parameter is chosen appropriately. Several numerical examples support the efficiency of the proposed approach.</p>","PeriodicalId":2,"journal":{"name":"ACS Applied Bio Materials","volume":null,"pages":null},"PeriodicalIF":4.6000,"publicationDate":"2023-11-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Two-layer networks with the $$\\\\text {ReLU}^k$$ activation function: Barron spaces and derivative approximation\",\"authors\":\"Yuanyuan Li, Shuai Lu, Peter Mathé, Sergei V. Pereverzev\",\"doi\":\"10.1007/s00211-023-01384-6\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>We investigate the use of two-layer networks with the rectified power unit, which is called the <span>\\\\(\\\\text {ReLU}^k\\\\)</span> activation function, for function and derivative approximation. By extending and calibrating the corresponding Barron space, we show that two-layer networks with the <span>\\\\(\\\\text {ReLU}^k\\\\)</span> activation function are well-designed to simultaneously approximate an unknown function and its derivatives. When the measurement is noisy, we propose a Tikhonov type regularization method, and provide error bounds when the regularization parameter is chosen appropriately. Several numerical examples support the efficiency of the proposed approach.</p>\",\"PeriodicalId\":2,\"journal\":{\"name\":\"ACS Applied Bio Materials\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":4.6000,\"publicationDate\":\"2023-11-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ACS Applied Bio Materials\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1007/s00211-023-01384-6\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"MATERIALS SCIENCE, BIOMATERIALS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACS Applied Bio Materials","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s00211-023-01384-6","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATERIALS SCIENCE, BIOMATERIALS","Score":null,"Total":0}
Two-layer networks with the $\text{ReLU}^k$ activation function: Barron spaces and derivative approximation
We investigate the use of two-layer networks with the rectified power unit, called the \(\text{ReLU}^k\) activation function, for function and derivative approximation. By extending and calibrating the corresponding Barron space, we show that two-layer networks with the \(\text{ReLU}^k\) activation function are well suited to simultaneously approximating an unknown function and its derivatives. When the measurements are noisy, we propose a Tikhonov-type regularization method and provide error bounds when the regularization parameter is chosen appropriately. Several numerical examples support the efficiency of the proposed approach.
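For a concrete picture of the setting, the sketch below implements a two-layer network with the \(\text{ReLU}^k\) activation \(\sigma_k(t)=\max(0,t)^k\) and fits its outer coefficients with a ridge (Tikhonov-type) penalty. The function names (`relu_k`, `two_layer_net`, `fit_outer_ridge`), the choice of fixing the inner weights, and the parameter `alpha` are illustrative assumptions, not the authors' construction or regularization scheme.

```python
import numpy as np

def relu_k(t, k=2):
    """Rectified power unit: sigma_k(t) = max(0, t)^k."""
    return np.maximum(0.0, t) ** k

def two_layer_net(x, a, w, b, k=2):
    """
    Two-layer (one hidden layer) network
        f(x) = sum_j a_j * sigma_k(w_j . x + b_j)
    with input x of shape (d,), outer weights a of shape (m,),
    inner weights w of shape (m, d) and biases b of shape (m,).
    """
    return a @ relu_k(w @ x + b, k)

def fit_outer_ridge(X, y, w, b, k=2, alpha=1e-3):
    """
    Hypothetical fit of the outer coefficients a with a Tikhonov-type
    (ridge) penalty, given fixed inner weights (w, b); a stand-in for the
    regularization analysed in the paper, not the authors' exact scheme.
    X: (n, d) inputs, y: (n,) noisy function values.
    """
    Phi = relu_k(X @ w.T + b, k)                    # (n, m) feature matrix
    m = Phi.shape[1]
    return np.linalg.solve(Phi.T @ Phi + alpha * np.eye(m), Phi.T @ y)
```

Because \(\sigma_k\) is \(k-1\) times continuously differentiable, derivatives of such a network stay within the same family of ridge functions, which is the informal reason a single network can be asked to approximate a function and its derivatives at once.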