Using the genetic algorithm to build optimal neural networks for fault-prone module detection
R. Hochman, T. Khoshgoftaar, E. B. Allen, J. Hudepohl
Proceedings of ISSRE '96: 7th International Symposium on Software Reliability Engineering, 30 October 1996. DOI: 10.1109/ISSRE.1996.558759
The genetic algorithm is applied to developing optimal or near optimal backpropagation neural networks for fault-prone/not-fault-prone classification of software modules. The algorithm considers each network in a population of neural networks as a potential solution to the optimal classification problem. Variables governing the learning and other parameters and network architecture are represented as substrings (genes) in a machine-level bit string (chromosome). When the population undergoes simulated evolution using genetic operators (selection based on a fitness function, crossover, and mutation), the average performance increases in successive generations. We found that, on the same data, compared with the best manually developed networks, evolved networks produced improved classifications in considerably less time, with no human effort, and with greater confidence in their optimality or near optimality. Strategies for devising a fitness function specific to the problem are explored and discussed.
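
The abstract describes the core loop: encode network settings as genes in a bit-string chromosome, score each decoded network with a fitness function, and evolve the population by selection, crossover, and mutation. The Python sketch below illustrates that loop only; the gene layout, bit widths, parameter ranges, and the placeholder fitness function (which stands in for training a backpropagation network and scoring its fault-prone classification) are illustrative assumptions, not details from the paper.

import random

# Hypothetical gene layout (field name, number of bits) -- not from the paper.
GENES = [("hidden_units", 5), ("learning_rate", 6), ("momentum", 4)]
CHROM_LEN = sum(bits for _, bits in GENES)

def decode(chrom):
    """Turn a bit-string chromosome into backprop hyperparameters (assumed ranges)."""
    raw, pos = {}, 0
    for name, bits in GENES:
        raw[name] = int("".join(map(str, chrom[pos:pos + bits])), 2)
        pos += bits
    return {
        "hidden_units": 2 + raw["hidden_units"],                  # 2..33 hidden nodes
        "learning_rate": 0.01 + raw["learning_rate"] / 63 * 0.5,  # 0.01..0.51
        "momentum": raw["momentum"] / 15 * 0.9,                   # 0.0..0.9
    }

def fitness(chrom):
    """Placeholder: in the paper this would train the decoded network and score
    its fault-prone/not-fault-prone classification. Here a made-up quadratic
    surface simply gives the GA something to optimize."""
    p = decode(chrom)
    return (-(p["hidden_units"] - 12) ** 2
            - 100 * (p["learning_rate"] - 0.2) ** 2
            - 10 * (p["momentum"] - 0.5) ** 2)

def select(pop, scores):
    """Tournament selection driven by the fitness function."""
    a, b = random.sample(range(len(pop)), 2)
    return pop[a] if scores[a] >= scores[b] else pop[b]

def crossover(p1, p2):
    """Single-point crossover of two chromosomes."""
    cut = random.randint(1, CHROM_LEN - 1)
    return p1[:cut] + p2[cut:]

def mutate(chrom, rate=0.02):
    """Flip each bit with a small probability."""
    return [1 - b if random.random() < rate else b for b in chrom]

def evolve(pop_size=30, generations=40):
    pop = [[random.randint(0, 1) for _ in range(CHROM_LEN)] for _ in range(pop_size)]
    for _ in range(generations):
        scores = [fitness(c) for c in pop]
        pop = [mutate(crossover(select(pop, scores), select(pop, scores)))
               for _ in range(pop_size)]
    return decode(max(pop, key=fitness))

if __name__ == "__main__":
    print(evolve())  # prints the best decoded network configuration found

Running the sketch prints one evolved configuration; average fitness rises over generations because better chromosomes win the tournaments more often, which is the behavior the abstract reports for the real networks.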