Zedong Tang, Maoguo Gong, Mingyang Zhang
2017 IEEE Congress on Evolutionary Computation (CEC), published 2017-06-05
DOI: 10.1109/CEC.2017.7969349
Evolutionary multi-task learning for modular extremal learning machine
Evolutionary multi-tasking is a novel concept in which algorithms exploit the implicit parallelism of population-based search to solve several tasks efficiently. In recent decades, multi-task learning, which harnesses the underlying similarity among learning tasks, has proved effective in many applications. The extreme learning machine is a distinctive learning algorithm for feed-forward neural networks. Because of its simplicity and low computational complexity compared with conventional neural network training algorithms, it has been used in many data-analysis settings. In this paper, a modular training technique based on the evolutionary multi-task paradigm is used to evolve modular topologies for the extreme learning machine. Although the extreme learning machine is much faster than conventional gradient-based methods, it needs more hidden neurons because its input weights are determined randomly. In the proposed method, we combine the evolutionary extreme learning machine with multi-task modular training. Each task is defined by an evolutionary extreme learning machine with a different number of hidden neurons. This method produces a modular extreme learning machine that requires fewer hidden units and remains effective even when some hidden neurons and connections are removed. Experimental results show the effectiveness and generalization of the proposed method on benchmark classification problems.
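To make the basic building block concrete, the following is a minimal sketch of a standard extreme learning machine in NumPy (not the paper's evolutionary or modular variant): input weights and hidden biases are drawn at random and left fixed, and only the output weights are solved analytically by least squares. All function and variable names here are illustrative, not taken from the paper.

```python
import numpy as np

def elm_train(X, T, n_hidden, seed=None):
    """Train a basic extreme learning machine (ELM).

    Input weights and hidden biases are random and fixed; only the
    output weights `beta` are solved via the Moore-Penrose pseudo-inverse.
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                           # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                     # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Forward pass: hidden activations times learned output weights."""
    return np.tanh(X @ W + b) @ beta

# Toy demonstration: XOR as a two-class problem with one-hot targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[1, 0], [0, 1], [0, 1], [1, 0]], dtype=float)
W, b, beta = elm_train(X, T, n_hidden=20, seed=0)
pred = elm_predict(X, W, b, beta).argmax(axis=1)
```

The random input layer is what makes training fast, but it is also why, as the abstract notes, an ELM typically needs more hidden neurons than a gradient-trained network of the same capacity; the paper's evolutionary multi-task scheme searches over modular topologies to reduce that hidden-unit count.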