{"title":"Bidirectional Recurrent Neural Network Language Model: Cross Entropy Churn Metrics for Defect Prediction Modeling","authors":"N. R, K. S","doi":"10.20894/ijdmta.102.009.001.010","DOIUrl":null,"url":null,"abstract":"Software Defect Prediction (SDP) plays an active area in many research domain of Software Quality of Assurance (SQA). Many existing research studies are based on software traditional metric sets and defect prediction models are built in machine language to detect the bug for limited source code line. Inspired by the above existing system. In this paper, defect prediction is focused on predicting defects in source code. The objective of this thesis is to improve the software quality for accurate defect prediction is source code for file level. So, that it helps the developer to find the bug and fix the issue, to make better use of a resource which reduces the test effort, minimize the cost and improve the quality of software. A new approach is introduced to improve the prediction performance of Bidirectional RNNLM in Deep Neural Network. To build the defect prediction model a defect learner framework is proposed and first it need to build a Neural Language Model. Using this Language Model it helps to learn to deep semantic features in source code and it train & test the model. Based on language model it combined with software traditional metric sets to measure the code and find the defect. The probability of language model and metric set Cross-Entropy with Abstract Syntax Tree (CE-AST) metric is used to evaluate the defect proneness and set as a metric label. For classification the metric label K-NN classifier is used. BPTT algorithm for learning RNN will provide additional improvement, it improves the predictions performance to find the dynamic error.","PeriodicalId":414709,"journal":{"name":"International Journal of Data Mining Techniques and Applications","volume":"48 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-12-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Data Mining Techniques and Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.20894/ijdmta.102.009.001.010","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
Software Defect Prediction (SDP) is an active research area within Software Quality Assurance (SQA). Many existing studies rely on traditional software metric sets and build machine-learning defect prediction models that detect bugs only over a limited number of source code lines. Inspired by these existing systems, this paper focuses on predicting defects directly in source code. The objective is to improve software quality through accurate, file-level defect prediction in source code, helping developers locate bugs and fix issues, make better use of resources, reduce test effort, minimize cost, and improve software quality. A new approach is introduced to improve the prediction performance of a Bidirectional Recurrent Neural Network Language Model (RNNLM) within a deep neural network. To build the defect prediction model, a defect learner framework is proposed: first, a neural language model is built; this language model learns deep semantic features from source code and is used to train and test the model. The language model is then combined with traditional software metric sets to measure the code and find defects. The language-model probability and a Cross-Entropy with Abstract Syntax Tree (CE-AST) metric are used to evaluate defect proneness and assigned as a metric label. A K-NN classifier is used to classify the metric label. Training the RNN with the Backpropagation Through Time (BPTT) algorithm provides an additional improvement, raising prediction performance in finding dynamic errors.
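The abstract describes a pipeline in which a bidirectional RNN language model over source-code tokens yields a per-file cross-entropy score that is combined with traditional metrics and classified with K-NN. The sketch below is a minimal, hypothetical illustration of that pipeline, not the authors' implementation: it assumes a PyTorch bidirectional LSTM and a scikit-learn K-NN classifier, and the vocabulary size, hyperparameters, metric choices, and toy data are illustrative assumptions.

```python
# Minimal sketch (assumed, not the paper's code): cross-entropy from a
# bidirectional RNN language model as a defect-proneness feature, combined
# with traditional metrics and classified with K-NN.
import torch
import torch.nn as nn
import torch.nn.functional as F
from sklearn.neighbors import KNeighborsClassifier

class BiRNNLM(nn.Module):
    """Bidirectional LSTM language model over code tokens."""
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                           bidirectional=True)
        self.out = nn.Linear(2 * hidden_dim, vocab_size)

    def forward(self, tokens):                  # tokens: (batch, seq_len)
        h, _ = self.rnn(self.embed(tokens))     # (batch, seq_len, 2*hidden)
        return self.out(h)                      # logits over the vocabulary

def file_cross_entropy(model, tokens):
    """Average cross-entropy (in nats) of a file's token sequence."""
    with torch.no_grad():
        logits = model(tokens[:, :-1])          # predict the next token
        targets = tokens[:, 1:]
        return F.cross_entropy(logits.reshape(-1, logits.size(-1)),
                               targets.reshape(-1)).item()

# Toy usage: two "files" as token-id sequences plus traditional metrics
# (here assumed to be LOC and cyclomatic complexity); labels mark
# defective (1) vs. clean (0) files.
vocab_size = 100
model = BiRNNLM(vocab_size)                     # assume already trained (via BPTT)
files = [torch.randint(0, vocab_size, (1, 50)) for _ in range(2)]
traditional = [[120, 4], [300, 11]]
features = [[file_cross_entropy(model, f)] + m
            for f, m in zip(files, traditional)]
labels = [0, 1]
knn = KNeighborsClassifier(n_neighbors=1).fit(features, labels)
print(knn.predict(features))
```

In PyTorch, training such a model with standard backpropagation over the unrolled sequence is effectively BPTT, so the "assume already trained" step above corresponds to the BPTT learning the abstract mentions.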