Alternative Cross-Over Strategies and Selection Techniques for Grammatical Evolution Optimized Neural Networks
Alison A Motsinger, Lance W Hahn, Scott M Dudek, Kelli K Ryckman, Marylyn D Ritchie
Genetic and Evolutionary Computation Conference (GECCO), 2006, pp. 947-948
DOI: 10.1145/1143997.1144163
Citations: 13
Abstract
One of the most difficult challenges in human genetics is the identification and characterization of susceptibility genes for common complex human diseases. The presence of gene-gene and gene-environment interactions comprising the genetic architecture of these diseases presents a substantial statistical challenge. As the field pushes toward genome-wide association studies with hundreds of thousands, or even millions, of variables, the development of novel statistical and computational methods is a necessity. Previously, we introduced a grammatical evolution optimized neural network (GENN) to improve upon the trial-and-error process of choosing an optimal architecture for a pure feed-forward back-propagation neural network. GENN optimizes the inputs from a large pool of variables, the weights, and the connectivity of the network, including the number of hidden layers and the number of nodes in each hidden layer. Thus, the algorithm automatically generates an optimal neural network architecture for a given data set.
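To make the grammatical evolution mechanism concrete, the following is a minimal illustrative sketch, not the authors' implementation: a linear genome of integer codons is mapped through a small BNF-style grammar by repeated leftmost expansion, so that the derived string (and hence the network architecture it describes) is determined by the genome. The toy grammar and the comma-separated layer description are assumptions for illustration only.

```python
# Toy BNF-style grammar: a network is one or more hidden layers,
# and each layer has 1-4 nodes. This is a simplification for
# illustration; the real GENN grammar also encodes inputs and weights.
GRAMMAR = {
    "<net>":   [["<layer>"], ["<layer>", ",", "<net>"]],
    "<layer>": [["1"], ["2"], ["3"], ["4"]],
}

def ge_map(genome, start="<net>", max_steps=100):
    """Map integer codons to a derivation string via leftmost expansion.

    Each codon selects a production rule by taking it modulo the number
    of rules for the current nonterminal. Codons are reused ("wrapped")
    if the derivation outruns the genome, as in standard grammatical
    evolution.
    """
    seq = [start]
    i = 0
    for _ in range(max_steps):
        # Find the leftmost nonterminal still awaiting expansion.
        nts = [j for j, s in enumerate(seq) if s in GRAMMAR]
        if not nts:
            break
        j = nts[0]
        rules = GRAMMAR[seq[j]]
        choice = rules[genome[i % len(genome)] % len(rules)]
        seq[j:j + 1] = choice
        i += 1
    return "".join(seq)

if __name__ == "__main__":
    # Genome [1, 3, 0, 2]: rule 1 of <net> (two layers), layer of 4 nodes,
    # rule 0 of <net> (stop), layer of 3 nodes.
    print(ge_map([1, 3, 0, 2]))  # -> "4,3"
```

Because the genotype is just a list of integers, standard evolutionary operators such as crossover can act on it directly, which is what makes the choice of crossover strategy and selection scheme consequential.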
Like all evolutionary computing algorithms, grammatical evolution relies on evolutionary operators such as crossover and selection to learn the best solution for a given dataset. We wanted to understand the effect of fitness-proportionate versus ordinal selection schemes, and the effect of standard and novel crossover strategies, on the performance of GENN.
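The two selection families compared here can be sketched as follows; this is a generic illustration of the schemes rather than the paper's implementation. Fitness-proportionate (roulette-wheel) selection weights each individual by its raw fitness, so a single outlier can dominate, while ordinal (rank-based) selection weights by fitness rank only, damping the influence of extreme fitness values.

```python
import random

def fitness_proportionate(population, fitnesses, k):
    """Roulette-wheel selection: pick k individuals with probability
    proportional to raw fitness (assumes nonnegative fitnesses)."""
    return random.choices(population, weights=fitnesses, k=k)

def rank_selection(population, fitnesses, k):
    """Ordinal selection: pick k individuals with probability
    proportional to fitness rank (1 = worst), so selection pressure
    depends only on ordering, not on fitness magnitudes."""
    order = sorted(range(len(population)), key=lambda i: fitnesses[i])
    ranks = [0] * len(population)
    for rank, i in enumerate(order, start=1):
        ranks[i] = rank
    return random.choices(population, weights=ranks, k=k)

if __name__ == "__main__":
    pop = ["a", "b", "c", "d"]
    fit = [1.0, 2.0, 3.0, 100.0]  # one outlier dominates proportionate selection
    print(fitness_proportionate(pop, fit, 10))
    print(rank_selection(pop, fit, 10))
```

With the outlier fitness above, roulette-wheel selection picks "d" about 94% of the time, whereas rank selection picks it only 40% of the time (rank weight 4 of 10), illustrating why the choice of scheme can change which architectures GENN retains.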