{"title":"使用自组织地图的增量学习","authors":"A. Gepperth, Cem Karaoguz","doi":"10.1109/WSOM.2017.8020021","DOIUrl":null,"url":null,"abstract":"We present a novel use for self-organizing maps (SOMs) as an essential building block for incremental learning algorithms. SOMs are very well suited for this purpose because they are inherently online learning algorithms, because their weight updates are localized around the best-matching unit, which inherently protects them against catastrophic forgetting, and last but not least because they have fixed model complexity limiting execution time and memory requirements for processing streaming data. However, in order to perform incremental learning which is usually supervised in nature, SOMs need to be complemented by a readout layer as well as a self-referential control mechanism for prototype updates in order to be protected against negative consequences of concept drift. We present the PROPRE architecture which implements these functions, thus realizing incremental learning with SOMs in very high-dimensional data domains, and show its capacity for incremental learning on several known and new classification problems. In particular, we discuss the required control of SOM parameters in detail and validate our choices by experimental results.","PeriodicalId":130086,"journal":{"name":"2017 12th International Workshop on Self-Organizing Maps and Learning Vector Quantization, Clustering and Data Visualization (WSOM)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":"{\"title\":\"Incremental learning with self-organizing maps\",\"authors\":\"A. Gepperth, Cem Karaoguz\",\"doi\":\"10.1109/WSOM.2017.8020021\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We present a novel use for self-organizing maps (SOMs) as an essential building block for incremental learning algorithms. SOMs are very well suited for this purpose because they are inherently online learning algorithms, because their weight updates are localized around the best-matching unit, which inherently protects them against catastrophic forgetting, and last but not least because they have fixed model complexity limiting execution time and memory requirements for processing streaming data. However, in order to perform incremental learning which is usually supervised in nature, SOMs need to be complemented by a readout layer as well as a self-referential control mechanism for prototype updates in order to be protected against negative consequences of concept drift. We present the PROPRE architecture which implements these functions, thus realizing incremental learning with SOMs in very high-dimensional data domains, and show its capacity for incremental learning on several known and new classification problems. 
In particular, we discuss the required control of SOM parameters in detail and validate our choices by experimental results.\",\"PeriodicalId\":130086,\"journal\":{\"name\":\"2017 12th International Workshop on Self-Organizing Maps and Learning Vector Quantization, Clustering and Data Visualization (WSOM)\",\"volume\":\"15 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-06-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"9\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 12th International Workshop on Self-Organizing Maps and Learning Vector Quantization, Clustering and Data Visualization (WSOM)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/WSOM.2017.8020021\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 12th International Workshop on Self-Organizing Maps and Learning Vector Quantization, Clustering and Data Visualization (WSOM)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/WSOM.2017.8020021","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
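To make the mechanism described in the abstract concrete, the sketch below shows a generic online SOM update (best-matching-unit search followed by a neighborhood-localized prototype update) combined with a simple supervised readout layer. This is only an illustration of the general idea, not the authors' PROPRE architecture: the class name, map size, learning-rate and radius values, and the delta-rule readout are all assumptions chosen for brevity.

```python
# A minimal sketch of BMU-localized online SOM learning with a supervised
# readout, assuming a fixed-size map and constant schedules. It is NOT the
# PROPRE implementation from the paper; names and parameters are illustrative.
import numpy as np

class OnlineSOMClassifier:
    def __init__(self, grid_h=10, grid_w=10, dim=784, n_classes=10,
                 lr=0.1, sigma=2.0, lr_readout=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.1, size=(grid_h * grid_w, dim))  # prototypes
        self.R = np.zeros((grid_h * grid_w, n_classes))             # readout weights
        self.lr, self.sigma, self.lr_readout = lr, sigma, lr_readout
        # grid coordinates used by the neighborhood function
        ys, xs = np.meshgrid(np.arange(grid_h), np.arange(grid_w), indexing="ij")
        self.coords = np.stack([ys.ravel(), xs.ravel()], axis=1).astype(float)

    def _bmu(self, x):
        # best-matching unit: the prototype closest to the input
        return int(np.argmin(np.sum((self.W - x) ** 2, axis=1)))

    def partial_fit(self, x, y_onehot):
        bmu = self._bmu(x)
        # Gaussian neighborhood centered on the BMU: the update stays localized,
        # which limits interference with previously learned prototypes.
        d2 = np.sum((self.coords - self.coords[bmu]) ** 2, axis=1)
        h = np.exp(-d2 / (2.0 * self.sigma ** 2))
        self.W += self.lr * h[:, None] * (x - self.W)
        # simple delta-rule readout on SOM activations (one possible choice
        # of readout layer, assumed here for illustration)
        a = np.exp(-np.sum((self.W - x) ** 2, axis=1))
        pred = a @ self.R
        self.R += self.lr_readout * np.outer(a, y_onehot - pred)

    def predict(self, x):
        a = np.exp(-np.sum((self.W - x) ** 2, axis=1))
        return int(np.argmax(a @ self.R))
```

In a streaming setting, `partial_fit` would be called once per incoming sample; handling concept drift would additionally require the kind of self-referential control of the learning rate and neighborhood radius that the paper discusses, which is not modeled in this sketch.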