On the conditions of outer-supervised feedforward neural networks for null cost learning

De-shuang Huang
IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339), July 10, 1999
DOI: 10.1109/IJCNN.1999.831061
This paper investigates, from the viewpoint of linear algebra, the local minima of the least-squares error cost functions defined at the outputs of outer-supervised feedforward neural networks (FNNs). For a specific case, we also show that spatially collinear samples (possibly output by the final hidden layer) can be separated easily with a null-cost error function even when the condition M ≥ N is not satisfied. In light of these conclusions, we give a general method for designing a network architecture suited to a specific problem.
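The linear-algebraic intuition behind the null-cost condition can be illustrated numerically: if the output layer is linear, the least-squares cost at the network output reaches zero exactly when the target matrix is reachable from the row space of the final hidden layer's output matrix. A minimal sketch (not the paper's own code; the matrix shapes and variable names are illustrative assumptions, with M denoting hidden units and N training samples):

```python
import numpy as np

# N training samples, M final-hidden-layer units (illustrative sizes with M >= N).
N, M = 6, 8
rng = np.random.default_rng(0)

H = rng.standard_normal((N, M))   # hidden-layer outputs, one row per sample
T = rng.standard_normal((N, 2))   # desired outputs for 2 output units

# Solve the linear output-layer weights W by least squares:
# minimize ||H @ W - T||^2 over W.
W, *_ = np.linalg.lstsq(H, T, rcond=None)

# With M >= N and H of full row rank (which random Gaussian H has almost
# surely), the system H @ W = T is exactly solvable, so the cost is
# numerically zero.
residual = np.linalg.norm(H @ W - T)
print(f"least-squares residual = {residual:.2e}")
```

When M < N the interpolation condition generally fails for arbitrary targets, which is why the abstract's observation is notable: specially structured (collinear) hidden-layer samples can still admit a null-cost solution even below that threshold.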