Anas Ali Alhur, Zuheir N Khlaif, Bilal Hamamra, Elham Hussein
JMIR Medical Education, vol 11, e74947. DOI: 10.2196/74947. Published 2025-09-15. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12435755/pdf/
Paradox of AI in Higher Education: Qualitative Inquiry Into AI Dependency Among Educators in Palestine.
Background: Artificial intelligence (AI) is increasingly embedded in medical education, providing benefits in instructional design, content creation, and administrative efficiency. Tools like ChatGPT are reshaping training and teaching practices in digital health. However, concerns about faculty overreliance highlight risks to pedagogical autonomy, cognitive engagement, and ethics. Despite global interest, there is limited empirical research on AI dependency among medical educators, particularly in underrepresented regions like the Global South.
Objective: This study focused on Palestine and aimed to (1) identify factors contributing to AI dependency among medical educators, (2) assess its impact on teaching autonomy, decision-making, and professional identity, and (3) propose strategies for sustainable and responsible AI integration in digital medical education.
Methods: A qualitative design was employed, combining semistructured interviews (n=22) and focus group discussions (n=24) with 46 medical educators from nursing, pharmacy, medicine, optometry, and dental sciences. Thematic analysis, supported by NVivo (QSR International), was conducted on 15.5 hours of transcribed data. Participants varied in their frequency of AI use: 45.7% (21/46) used AI daily, 30.4% (14/46) weekly, and 15.2% (7/46) monthly.
Results: Five major themes were identified as drivers of AI dependency; the most frequently reported were institutional workload (reported by >80% [37/46] of participants), low academic confidence (28/46, 60%), and perfectionism-related stress (23/46, 50%). Six broad consequences of AI overreliance were identified: skills atrophy (41/46, 89%; reduced critical thinking, scientific writing, and decision-making abilities), pedagogical erosion (35/46, 76%; decreased student interaction and reduced teaching innovation), motivational decline (31/46, 67%; increased procrastination and reduced intrinsic motivation), ethical risks (24/46, 52%; concerns about plagiarism and overuse of AI-generated content), social fragmentation (22/46, 48%; diminished peer collaboration and mentorship), and creativity suppression (20/46, 43%; reliance on AI for content generation diluted instructional originality). Strategies reported by participants to address these issues included establishing boundaries for AI use (n=41), fostering hybrid intelligence (n=37), and integrating AI literacy into teaching practices (n=39).
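As a quick arithmetic check, the endorsement counts above convert to the stated shares of the 46 participants. This is a minimal illustrative sketch using only the counts reported in the abstract, not the study's raw data:

```python
# Convert reported endorsement counts into percentages of the sample (N=46).
# Counts are taken verbatim from the Results section of the abstract.
N = 46
theme_counts = {
    "institutional workload": 37,      # reported by >80% of participants
    "perfectionism-related stress": 23,
    "skills atrophy": 41,
}
percentages = {t: round(100 * n / N, 1) for t, n in theme_counts.items()}
for theme, pct in percentages.items():
    print(f"{theme}: {theme_counts[theme]}/{N} ({pct}%)")
```

The computed values (80.4%, 50.0%, 89.1%) match the figures the abstract reports after rounding.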
Conclusions: While AI tools can enhance digital health instruction, unchecked reliance risks eroding essential clinician competencies. This study identifies cognitive, pedagogical, and ethical consequences of AI overuse in medical education and highlights the need for AI literacy, professional development, and ethical frameworks to ensure responsible and balanced integration.