Riemann–Liouville Fractional Integral Type Deep Neural Network Kantorovich Operators
Behar Baxhaku, Purshottam Narain Agrawal, Shivam Bajpeyi
DOI: 10.1007/s40995-024-01729-2
Iranian Journal of Science and Technology, Transactions A: Science, Vol. 49, No. 3, pp. 711–724
Published: 2024-11-07 (Journal Article; JCR Q2, Multidisciplinary Sciences; Impact Factor 1.4)
Citation count: 0
Abstract
This paper introduces a novel family of Kantorovich-type deep neural network operators based on Riemann–Liouville fractional integrals. Building upon the work of Costarelli (Math Model Anal 27(4):547–560, 2022) and Sharma and Singh (J Math Anal Appl 533(2):128009, 2024), we investigate the approximation properties of these operators in the spaces \({{\mathcal {C}}}({\mathscr {I}})\) (the space of all continuous functions on \({\mathscr {I}}:=[-1,1]\)) and \({\mathscr {L}}_{{\mathcalligra {p}}}({\mathscr {I}})\) (the space of all \({\mathcalligra {p}}\)-th power Lebesgue integrable functions on \({\mathscr {I}}\), \(1\le {\mathcalligra {p}}<\infty\)). We establish pointwise and uniform convergence results for both single- and multi-hidden-layer networks in the spaces \({{\mathcal {C}}}({\mathscr {I}})\) and \({\mathscr {L}}_{{\mathcalligra {p}}}({\mathscr {I}})\), \(1\le {\mathcalligra {p}}<\infty\). Our analysis leverages auxiliary approximation results for the single-hidden-layer case to derive density theorems for the two- and multi-hidden-layer scenarios. Finally, we discuss specific examples of sigmoidal activation functions that are compatible with our proposed operators.
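For readers unfamiliar with the underlying tool, the Riemann–Liouville fractional integral of order \(\alpha > 0\) with base point \(a\) is \((I^{\alpha}f)(x) = \frac{1}{\Gamma(\alpha)}\int_{a}^{x}(x-t)^{\alpha-1}f(t)\,dt\). The sketch below is a generic numerical illustration of this classical definition only (it is not the paper's Kantorovich operators); the function and parameter names are our own. The substitution \(u=(x-t)^{\alpha}\) removes the weak kernel singularity at \(t=x\) so that a plain midpoint rule suffices.

```python
import math

def rl_fractional_integral(f, x, alpha, a=-1.0, n=20000):
    """Numerically evaluate the Riemann-Liouville fractional integral
    (I^alpha f)(x) = (1/Gamma(alpha)) * integral_a^x (x-t)^(alpha-1) f(t) dt.

    The substitution u = (x - t)^alpha turns the integral into
    (1/(alpha*Gamma(alpha))) * integral_0^{(x-a)^alpha} f(x - u^(1/alpha)) du,
    whose integrand is bounded, so a midpoint rule with n panels works.
    """
    upper = (x - a) ** alpha
    h = upper / n
    s = 0.0
    for k in range(n):
        u = (k + 0.5) * h          # midpoint of panel k
        s += f(x - u ** (1.0 / alpha))
    return (h * s) / (alpha * math.gamma(alpha))

# Sanity check against the closed form
# I^alpha (t - a)^beta = Gamma(beta+1)/Gamma(alpha+beta+1) * (x - a)^(alpha+beta):
# with a = -1, f(t) = t + 1 (beta = 1), alpha = 1/2, x = 0,
# the exact value is 1 / Gamma(2.5).
approx = rl_fractional_integral(lambda t: t + 1.0, x=0.0, alpha=0.5)
exact = 1.0 / math.gamma(2.5)
print(approx, exact)
```

The closed-form identity used in the check (a standard property of the Riemann–Liouville integral applied to power functions) makes the accuracy of the quadrature easy to verify.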
Journal description:
The aim of this journal is to foster the growth of scientific research among Iranian scientists and to provide a medium which brings the fruits of their research to the attention of the world's scientific community. The journal publishes original research findings (which may be theoretical, experimental, or both), reviews, techniques, and comments spanning all subjects in the field of basic sciences, including Physics, Chemistry, Mathematics, Statistics, Biology and Earth Sciences.