Evolution of Graph Classifiers
Miguel Domingue, Rohan Narendra Dhamdhere, Naga Durga Harish Kanamarlapudi, Sunand Raghupathi, R. Ptucha
2019 IEEE Western New York Image and Signal Processing Workshop (WNYISPW), October 2019. DOI: 10.1109/WNYIPW.2019.8923110
Abstract: Architecture design and hyperparameter selection for deep neural networks often involve guesswork. The parameter space is too large to try all possibilities, so one often settles for a suboptimal solution. Some works have proposed automatic architecture and hyperparameter search, but they are constrained to image applications. We propose an evolution framework for graph data which is extensible to generic graphs. Our evolution mutates a population of neural networks to search the architecture and hyperparameter space. At each stage of the neuroevolution process, neural network layers can be added or removed, hyperparameters can be adjusted, or additional epochs of training can be applied. Mutation-selection probabilities based on recent successes help guide the learning process toward efficient and accurate learning. We achieve state-of-the-art results on MUTAG protein classification from a small population of 10 networks and gain interesting insight into how to build effective network architectures incrementally.
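The abstract outlines an adaptive neuroevolution loop: sample a mutation operator (add or remove a layer, adjust a hyperparameter, or train for more epochs), apply it to a parent network, and shift the operator probabilities toward recently successful mutations. The Python sketch below illustrates that loop under stated assumptions; the `Candidate` genome, the placeholder `evaluate` fitness, and the weight-update rule are illustrative stand-ins, not the paper's implementation, which trains real graph classifiers on MUTAG.

```python
# Hedged sketch of the adaptive mutation loop described in the abstract.
# Candidate, evaluate(), and the operator weights are illustrative
# assumptions, not the authors' code.
import random

MUTATIONS = ["add_layer", "remove_layer", "adjust_hyperparam", "train_more"]

class Candidate:
    """A network genome: a list of layer widths plus a learning rate."""
    def __init__(self, layers=None, lr=1e-3):
        self.layers = layers or [32, 32]
        self.lr = lr
        self.fitness = 0.0

    def mutated(self, op):
        child = Candidate(list(self.layers), self.lr)
        if op == "add_layer":
            child.layers.insert(random.randrange(len(child.layers) + 1),
                                random.choice([16, 32, 64]))
        elif op == "remove_layer" and len(child.layers) > 1:
            child.layers.pop(random.randrange(len(child.layers)))
        elif op == "adjust_hyperparam":
            child.lr *= random.choice([0.5, 2.0])
        # "train_more" keeps the genome unchanged; a real system would
        # simply run extra training epochs on the same network.
        return child

def evaluate(cand):
    # Placeholder fitness: a real system would train the graph network
    # on MUTAG and return validation accuracy.
    return random.random() - 0.01 * abs(len(cand.layers) - 3)

def evolve(pop_size=10, generations=50):
    population = [Candidate() for _ in range(pop_size)]
    for c in population:
        c.fitness = evaluate(c)
    # Start with uniform operator weights; reward recent successes.
    weights = {op: 1.0 for op in MUTATIONS}
    for _ in range(generations):
        parent = max(random.sample(population, 3), key=lambda c: c.fitness)
        ops, ws = zip(*weights.items())
        op = random.choices(ops, weights=ws)[0]
        child = parent.mutated(op)
        child.fitness = evaluate(child)
        # Credit the operator if it improved on its parent; decay it otherwise.
        weights[op] = max(0.1, weights[op] +
                          (0.5 if child.fitness > parent.fitness else -0.1))
        # Replace the weakest individual to keep the population size fixed.
        weakest = min(population, key=lambda c: c.fitness)
        if child.fitness > weakest.fitness:
            population.remove(weakest)
            population.append(child)
    return max(population, key=lambda c: c.fitness)

if __name__ == "__main__":
    best = evolve()
    print("best genome:", best.layers, "lr:", best.lr,
          "fitness:", round(best.fitness, 3))
```

On this toy setup the loop drifts toward genomes near three layers only because the placeholder fitness rewards them; in the paper's setting the reward signal would be classification accuracy, and the adaptive weights would concentrate on whichever mutations have recently produced better graph classifiers.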