Silvia García-Méndez, Francisco de Arriba-Pérez, María del Carmen Somoza-López
{"title":"A Review on the Use of Large Language Models as Virtual Tutors","authors":"Silvia García-Méndez, Francisco de Arriba-Pérez, María del Carmen Somoza-López","doi":"10.1007/s11191-024-00530-2","DOIUrl":null,"url":null,"abstract":"<div><p>Transformer architectures contribute to managing long-term dependencies for natural language processing, representing one of the most recent changes in the field. These architectures are the basis of the innovative, cutting-edge large language models (LLMs) that have produced a huge buzz in several fields and industrial sectors, among the ones education stands out. Accordingly, these generative artificial intelligence-based solutions have directed the change in techniques and the evolution in educational methods and contents, along with network infrastructure, towards high-quality learning. Given the popularity of LLMs, this review seeks to provide a comprehensive overview of those solutions designed specifically to generate and evaluate educational materials and which involve students and teachers in their design or experimental plan. To the best of our knowledge, this is the first review of educational applications (e.g., student assessment) of LLMs. As expected, the most common role of these systems is as virtual tutors for automatic question generation. Moreover, the most popular models are GPT-3 and BERT. 
However, due to the continuous launch of new generative models, new works are expected to be published shortly.</p></div>","PeriodicalId":771,"journal":{"name":"Science & Education","volume":"34 2","pages":"877 - 892"},"PeriodicalIF":3.1000,"publicationDate":"2024-05-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://link.springer.com/content/pdf/10.1007/s11191-024-00530-2.pdf","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Science & Education","FirstCategoryId":"95","ListUrlMain":"https://link.springer.com/article/10.1007/s11191-024-00530-2","RegionNum":1,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 0
Abstract
Transformer architectures, one of the most recent advances in natural language processing, help manage long-term dependencies in text. These architectures underpin the innovative, cutting-edge large language models (LLMs) that have generated enormous interest across many fields and industrial sectors, among which education stands out. Accordingly, these generative artificial intelligence-based solutions have driven changes in techniques and the evolution of educational methods and content, along with network infrastructure, towards high-quality learning. Given the popularity of LLMs, this review provides a comprehensive overview of solutions designed specifically to generate and evaluate educational materials and that involve students and teachers in their design or experimental plan. To the best of our knowledge, this is the first review of educational applications (e.g., student assessment) of LLMs. As expected, the most common role of these systems is as virtual tutors for automatic question generation. Moreover, the most popular models are GPT-3 and BERT. However, given the continuous release of new generative models, further work is expected to be published shortly.
Journal Description:
Science Education publishes original articles on the latest issues and trends occurring internationally in science curriculum, instruction, learning, policy, and preparation of science teachers, with the aim of advancing our knowledge of science education theory and practice. In addition to original articles, the journal features the following special sections:

- Learning: theoretical and empirical research studies on the learning of science. We invite manuscripts that investigate learning and its change and growth from various lenses, including psychological, social, cognitive, sociohistorical, and affective. Studies examining the relationship of learning to teaching, science knowledge and practices, the learners themselves, and the contexts (social, political, physical, ideological, institutional, epistemological, and cultural) are similarly welcome.
- Issues and Trends: primarily analytical, interpretive, or persuasive essays on current educational, social, or philosophical issues and trends relevant to the teaching of science. This section particularly seeks to promote informed dialogue about current issues in science education, and carefully reasoned papers representing disparate viewpoints are welcomed. Manuscripts submitted for this section may take the form of a position paper, a polemical piece, or a creative commentary.
- Science Learning in Everyday Life: analytical, interpretative, or philosophical papers regarding learning science outside of the formal classroom. Papers should investigate experiences in settings such as the community, home, the Internet, after-school settings, museums, and other opportunities that develop science interest, knowledge, or practices across the life span. Attention to issues and factors relating to equity in science learning is especially encouraged.
- Science Teacher Education [...]