{"title":"极限学习机实现回归问题的数值方面","authors":"J. Kabzinski","doi":"10.1109/MMAR.2018.8485866","DOIUrl":null,"url":null,"abstract":"An Extreme Learning Machine (ELM) – a neural network with fixed hidden layer and adjustable output weights is able to solve complicated regression (approximation) problems, but the standard selection of input weights and biases may lead to ill-conditioning of the output weights calculation and result in high values of the output weights. Two modifications of standard ELM are discussed: deterministic generation of hidden nodes parameters and modifications of activation functions to improve numerical properties of the algorithm.","PeriodicalId":201658,"journal":{"name":"2018 23rd International Conference on Methods & Models in Automation & Robotics (MMAR)","volume":"77 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Numerical Aspects of Extreme Learning Machine Implementation to Regression Problems\",\"authors\":\"J. Kabzinski\",\"doi\":\"10.1109/MMAR.2018.8485866\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"An Extreme Learning Machine (ELM) – a neural network with fixed hidden layer and adjustable output weights is able to solve complicated regression (approximation) problems, but the standard selection of input weights and biases may lead to ill-conditioning of the output weights calculation and result in high values of the output weights. Two modifications of standard ELM are discussed: deterministic generation of hidden nodes parameters and modifications of activation functions to improve numerical properties of the algorithm.\",\"PeriodicalId\":201658,\"journal\":{\"name\":\"2018 23rd International Conference on Methods & Models in Automation & Robotics (MMAR)\",\"volume\":\"77 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-08-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2018 23rd International Conference on Methods & Models in Automation & Robotics (MMAR)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/MMAR.2018.8485866\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 23rd International Conference on Methods & Models in Automation & Robotics (MMAR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/MMAR.2018.8485866","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Numerical Aspects of Extreme Learning Machine Implementation to Regression Problems
An Extreme Learning Machine (ELM), a neural network with a fixed hidden layer and adjustable output weights, is able to solve complicated regression (approximation) problems, but the standard selection of input weights and biases may lead to ill-conditioning of the output-weight calculation and, as a result, to large values of the output weights. Two modifications of the standard ELM are discussed: deterministic generation of hidden-node parameters and modification of the activation functions, both aimed at improving the numerical properties of the algorithm.
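To make the numerical issue the abstract points to concrete, the sketch below shows a minimal standard ELM regression in Python/NumPy: random input weights and biases, a sigmoid hidden layer, and output weights obtained by least squares via the Moore-Penrose pseudoinverse. The toy data, node count, and weight ranges are illustrative assumptions, not the paper's experimental setup, and the paper's proposed remedies (deterministic node generation, modified activations) are not implemented here.

```python
# Minimal standard-ELM regression sketch (illustrative assumptions, not the paper's setup).
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D regression data: noisy sinc function.
x = np.linspace(-3.0, 3.0, 200).reshape(-1, 1)
y = np.sinc(x).ravel() + 0.05 * rng.standard_normal(x.shape[0])

L = 50                                              # number of hidden nodes (assumed)
W = rng.uniform(-1.0, 1.0, size=(x.shape[1], L))    # random input weights (standard ELM)
b = rng.uniform(-1.0, 1.0, size=L)                  # random biases (standard ELM)

# Hidden-layer output matrix with sigmoid activation.
H = 1.0 / (1.0 + np.exp(-(x @ W + b)))

# The numerical concern raised in the abstract: H may be ill-conditioned,
# which inflates the least-squares output weights. The condition number makes this visible.
print("cond(H) =", np.linalg.cond(H))

# Output weights as the least-squares solution via the Moore-Penrose pseudoinverse.
beta = np.linalg.pinv(H) @ y
print("max |beta| =", np.abs(beta).max())

y_hat = H @ beta
print("training RMSE =", np.sqrt(np.mean((y_hat - y) ** 2)))
```

In this framing, the modifications discussed in the paper can be read as ways to keep cond(H) small, so that the computed output weights stay moderate and the least-squares step remains well behaved.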