Detecting mistakes in a domain model

Prabhsimran Singh, Younes Boubekeur, G. Mussbacher

Proceedings of the 25th International Conference on Model Driven Engineering Languages and Systems: Companion Proceedings
Published: 2022-10-23
DOI: 10.1145/3550356.3561583 (https://doi.org/10.1145/3550356.3561583)
Domain models are a fundamental part of software engineering, and it is important for every software engineer to be taught the principles of domain modeling. Instructors play a vital role in teaching students the skills required to understand and design domain models. Instructors check models created by students for mistakes by comparing them with a correct solution. While this was once a manageable task, that is no longer the case: the number of students who want to become software engineers has risen rapidly, leading to larger class sizes. Hence, students may have to wait longer for feedback on their solutions, and that feedback may be more superficial due to time constraints. In this paper, we propose a mistake detection system (MDS) that aims to automate the manual approach of checking student solutions and help save both students' and instructors' time. MDS automatically indicates the exact location and type of a mistake to the student. At present, MDS accurately detects 83 of the 97 identified types of mistakes that may exist in a student solution. A prototype tool verifies the feasibility of the proposed approach. When MDS takes synonyms into account, it achieves a recall of 0.93 and a precision of 0.79 on real student solutions. The proposed MDS takes us one step closer to automating the existing manual approach, freeing up instructor time and helping students learn domain modeling more effectively.
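As an illustration only, and not the authors' MDS implementation, the following minimal Python sketch shows how a detector of this kind might compare the class names in a student's domain model against an instructor solution, accept instructor-provided synonyms as matches, and report precision and recall over the detected mistakes. All class names, the synonym table, and the function names are hypothetical.

```python
# Hypothetical sketch of synonym-aware comparison between a student domain
# model and an instructor solution; not the authors' MDS implementation.

# Instructor solution classes, each with a set of accepted synonyms.
INSTRUCTOR_CLASSES = {
    "Customer": {"Client"},
    "Order": {"Purchase"},
    "Product": set(),
}

# Classes found in a hypothetical student solution.
STUDENT_CLASSES = {"Client", "Order", "Item"}


def detect_missing_classes(instructor, student):
    """Report instructor classes with no matching (or synonymous) student class."""
    mistakes = []
    for name, synonyms in instructor.items():
        accepted = {name} | synonyms
        if not accepted & student:
            mistakes.append(("missing_class", name))
    return mistakes


def precision_recall(detected, actual):
    """Precision and recall of detected mistakes against a ground-truth set."""
    true_positives = len(set(detected) & set(actual))
    precision = true_positives / len(detected) if detected else 0.0
    recall = true_positives / len(actual) if actual else 0.0
    return precision, recall


if __name__ == "__main__":
    detected = detect_missing_classes(INSTRUCTOR_CLASSES, STUDENT_CLASSES)
    # Ground truth for this toy example: only "Product" is truly missing.
    actual = [("missing_class", "Product")]
    p, r = precision_recall(detected, actual)
    print("Detected mistakes:", detected)        # [('missing_class', 'Product')]
    print(f"precision={p:.2f}, recall={r:.2f}")  # precision=1.00, recall=1.00
```

In this toy run, "Client" is accepted as a synonym of "Customer", so only the absent "Product" class is flagged; the same precision/recall definitions underlie the 0.79 and 0.93 figures reported in the abstract, although the paper's mistake categories and matching rules are far richer than this sketch.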