An empirical improvement of the accuracy of RBF networks
H. Sug
Proceedings of the 2nd International Conference on Interaction Sciences: Information Technology, Culture and Human, 2009-11-24
DOI: 10.1145/1655925.1656053
Neural networks have been widely applied to machine learning and data mining tasks, and because data mining problems involve large amounts of data, sampling is essential to the success of such tasks. Radial basis function (RBF) networks are among the representative neural network algorithms and are known to give good prediction accuracy in many applications, but, as with other data mining algorithms, there is no principled way to decide a proper sample size, so choosing sample sizes for these networks tends to be arbitrary. As the sample size grows, the error rate improves, but with diminishing returns; moreover, we cannot simply keep using larger and larger samples, because accuracy fluctuates from sample to sample. This paper suggests a progressive resampling technique to cope with this situation. The suggestion is validated by experiments with very promising results.
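The progressive resampling idea described above might be sketched as follows. This is a minimal illustration, not the author's exact procedure: the `train_and_eval` hook (standing in for "train an RBF network on a sample of size n and return its validation error rate"), the geometric growth factor, the averaging over repeated resamples, and the stopping threshold are all assumptions made for the sketch.

```python
def progressive_resampling(train_and_eval, n0, max_n,
                           growth=2.0, eps=0.005, trials=3):
    """Grow the sample size geometrically; at each size, train several
    models on fresh resamples and average their error rates to damp
    sample-to-sample fluctuation.  Stop once the averaged error stops
    improving by more than `eps` (diminishing returns).

    `train_and_eval(n)` is a hypothetical hook assumed to train an RBF
    network on a sample of size n and return its error rate.
    """
    best_n, best_err = None, float("inf")
    n = n0
    while n <= max_n:
        # Average over several resamples of the same size to smooth
        # out the accuracy fluctuation the abstract mentions.
        err = sum(train_and_eval(n) for _ in range(trials)) / trials
        if best_err - err > eps:
            best_n, best_err = n, err   # still improving: keep growing
        else:
            break                        # improvement too small: stop
        n = int(n * growth)
    return best_n, best_err
```

With a learner whose error shrinks roughly like 1/sqrt(n), the loop stops at a moderate sample size instead of consuming the whole data set, which is the trade-off the abstract motivates.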