Self-Pumped Optical Neural Networks
Author: Y. Owechko
Journal: Optical Computing, 1989
DOI: 10.1364/optcomp.1989.md4
Citations: 0
Abstract
Neural network models for artificial intelligence offer an approach fundamentally different from conventional symbolic approaches, but the merits of the two paradigms cannot be fairly compared until neural network models with large numbers of "neurons" are implemented. Despite the attractiveness of neural networks for computing applications that involve adaptation and learning, most published demonstrations of neural network technology have involved relatively small numbers of "neurons". One reason for this is the poor match between conventional electronic serial or coarse-grained multiple-processor computers and the massive parallelism and communication requirements of neural network models. The self-pumped optical neural network (SPONN) described here is a fine-grained optical architecture that features massive parallelism and a much greater degree of interconnectivity than bus-oriented or hypercube electronic architectures. SPONN is potentially capable of implementing neural networks consisting of 10^5-10^6 neurons with 10^9-10^10 interconnections. The mapping of neural network models onto the architecture occurs naturally, without the need for multiplexing neurons or dealing with contention, routing, and communication bottleneck problems. This simplifies the programming involved compared to electronic implementations.
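As a point of reference (this is an illustrative sketch, not the optical implementation), the abstract's figures are consistent with full connectivity: a network of N neurons with every neuron connected to every other requires on the order of N^2 weights, so 10^5 neurons imply 10^10 interconnections. The toy sizes and the tanh nonlinearity below are assumptions for illustration only.

```python
import numpy as np

def interconnect_count(n_neurons: int) -> int:
    # Fully connected network: each of n neurons receives input
    # from all n neurons, giving n^2 weighted interconnections.
    return n_neurons ** 2

# 10^5 neurons -> 10^10 interconnections, matching the abstract's range.
assert interconnect_count(10**5) == 10**10

# One update pass through such a layer is a dense matrix-vector
# product followed by a saturating "neuron" response (toy scale here;
# the optical architecture targets ~10^5-10^6 neurons).
rng = np.random.default_rng(0)
n = 1000
W = rng.standard_normal((n, n))   # interconnection weight matrix
x = rng.standard_normal(n)        # current neuron activations
y = np.tanh(W @ x)                # next activations, bounded in (-1, 1)
```

The point of the sketch is the scaling: the weight matrix grows quadratically with neuron count, which is what makes dense interconnection attractive to implement optically rather than over electronic buses.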