Francisco J. Pena, Angel Luis Gonzalez, Sepideh Pashami, A. Al-Shishtawy, A. H. Payberah
{"title":"Siambert: Siamese Bert-based Code Search","authors":"Francisco J. Pena, Angel Luis Gonzalez, Sepideh Pashami, A. Al-Shishtawy, A. H. Payberah","doi":"10.1109/sais55783.2022.9833051","DOIUrl":null,"url":null,"abstract":"Code Search is a practical tool that helps developers navigate growing source code repositories by connecting natural language queries with code snippets. Platforms such as StackOverflow resolve coding questions and answers; however, they cannot perform a semantic search through the code. Moreover, poorly documented code adds more complexity to search for code snippets in repositories. To tackle this challenge, this paper presents Siambert, a BERT-based model that gets the question in natural language and returns relevant code snippets. The Siambert architecture consists of two stages, where the first stage, inspired by Siamese Neural Network, returns the top K relevant code snippets to the input questions, and the second stage ranks the given snippets by the first stage. The experiments show that Siambert outperforms non-BERT-based models having improvements that range from 12% to 39% on the Recall@1 metric and improves the inference time performance, making it 15x faster than standard BERT models.","PeriodicalId":228143,"journal":{"name":"2022 Swedish Artificial Intelligence Society Workshop (SAIS)","volume":"32 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-06-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 Swedish Artificial Intelligence Society Workshop (SAIS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/sais55783.2022.9833051","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Code search is a practical tool that helps developers navigate growing source code repositories by connecting natural language queries with code snippets. Platforms such as StackOverflow resolve coding questions and answers; however, they cannot perform a semantic search through code. Moreover, poorly documented code makes searching for code snippets in repositories even harder. To tackle this challenge, this paper presents Siambert, a BERT-based model that takes a question in natural language and returns relevant code snippets. The Siambert architecture consists of two stages: the first stage, inspired by Siamese neural networks, returns the top K code snippets relevant to the input question, and the second stage ranks the snippets returned by the first stage. The experiments show that Siambert outperforms non-BERT-based models, with improvements ranging from 12% to 39% on the Recall@1 metric, and improves inference time, making it 15x faster than standard BERT models.
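The two-stage retrieve-then-rerank design described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: toy 4-dimensional vectors and cosine similarity stand in for Siambert's BERT embeddings, and the expensive second-stage ranker is only indicated in comments.

```python
import numpy as np

def cosine_top_k(query_vec, code_vecs, k=3):
    # Stage 1 (Siamese-style bi-encoder): query and code snippets are
    # embedded independently, so retrieval reduces to a fast similarity
    # lookup over precomputed snippet vectors.
    q = query_vec / np.linalg.norm(query_vec)
    c = code_vecs / np.linalg.norm(code_vecs, axis=1, keepdims=True)
    scores = c @ q                      # cosine similarity per snippet
    top = np.argsort(-scores)[:k]      # indices of the K best matches
    return top, scores[top]

# Toy embeddings standing in for BERT outputs (hypothetical data).
code_vecs = np.array([
    [0.9, 0.1, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.1],
    [0.8, 0.2, 0.1, 0.0],
    [0.0, 0.0, 1.0, 0.0],
])
query = np.array([1.0, 0.0, 0.1, 0.0])

top_idx, top_scores = cosine_top_k(query, code_vecs, k=2)
# Stage 2 would re-rank only these K candidates with a slower, more
# accurate model; restricting it to K items is what makes the overall
# pipeline much faster than scoring every snippet with a full BERT pass.
print(top_idx)
```

Separating a cheap candidate-retrieval stage from an accurate re-ranking stage is the standard way such systems trade a small recall loss for a large speedup, consistent with the 15x inference-time improvement the abstract reports.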