Jae-Hong Lee, So-Hae Oh, Falk Schwendicke, Akhilanand Chaurasia, Young-Taek Kim
{"title":"评估韩国牙周病学会在线问答部分对患者问题的回答的质量和同理心:一项比较牙周病医生和人工智能聊天机器人的横断面研究。","authors":"Jae-Hong Lee, So-Hae Oh, Falk Schwendicke, Akhilanand Chaurasia, Young-Taek Kim","doi":"10.5051/jpis.2402220111","DOIUrl":null,"url":null,"abstract":"<p><strong>Purpose: </strong>This study aimed to evaluate and compare the responses of an artificial intelligence (AI)-powered chatbot and professional periodontists to patient queries in periodontology and implantology, using the Korean Academy of Periodontology's (KAP) online question and answer (Q&A) section.</p><p><strong>Methods: </strong>In this comparative cross-sectional study, we analyzed 219 patient-submitted periodontal and implant knowledge questions from the KAP online Q&A section. A panel of 10 evaluators-5 periodontists and 5 laypersons-rated both the periodontist's and the AI chatbot's responses using standardized scales. We applied the <i>t</i>-test and Spearman correlation coefficients to compare response quality, empathy, consistency, and evaluator preferences.</p><p><strong>Results: </strong>Ten evaluators judged the AI chatbot's responses to be significantly superior in quality and empathy compared to periodontist replies. A higher proportion of periodontist responses fell below acceptable quality (\"very poor\" or \"poor\") than chatbot responses (28.7% vs. 15.0%; <i>P</i><0.001), and more chatbot replies were rated \"empathetic\" or \"very empathetic\" (62.5% vs. 42.8%; <i>P</i><0.001). Overall response consistency was deemed satisfactory at 64.2%, with no significant difference in consistency or preference between periodontist and lay evaluators.</p><p><strong>Conclusions: </strong>AI-powered chatbots can deliver more accurate and empathetic answers than human periodontists, suggesting their potential role as consultation assistants merits further investigation. The high intraclass correlation coefficient values (0.79-0.93) indicate a high level of agreement among evaluators in both the periodontist and lay evaluator groups, thus confirming the reliability and robustness of the study's assessment methodology.</p>","PeriodicalId":48795,"journal":{"name":"Journal of Periodontal and Implant Science","volume":" ","pages":""},"PeriodicalIF":3.2000,"publicationDate":"2025-05-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Evaluating the quality and empathy of responses to patient questions on the Korean Academy of Periodontology's online question and answer section: a cross-sectional study comparing periodontists and an AI-powered chatbot.\",\"authors\":\"Jae-Hong Lee, So-Hae Oh, Falk Schwendicke, Akhilanand Chaurasia, Young-Taek Kim\",\"doi\":\"10.5051/jpis.2402220111\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Purpose: </strong>This study aimed to evaluate and compare the responses of an artificial intelligence (AI)-powered chatbot and professional periodontists to patient queries in periodontology and implantology, using the Korean Academy of Periodontology's (KAP) online question and answer (Q&A) section.</p><p><strong>Methods: </strong>In this comparative cross-sectional study, we analyzed 219 patient-submitted periodontal and implant knowledge questions from the KAP online Q&A section. A panel of 10 evaluators-5 periodontists and 5 laypersons-rated both the periodontist's and the AI chatbot's responses using standardized scales. 
We applied the <i>t</i>-test and Spearman correlation coefficients to compare response quality, empathy, consistency, and evaluator preferences.</p><p><strong>Results: </strong>Ten evaluators judged the AI chatbot's responses to be significantly superior in quality and empathy compared to periodontist replies. A higher proportion of periodontist responses fell below acceptable quality (\\\"very poor\\\" or \\\"poor\\\") than chatbot responses (28.7% vs. 15.0%; <i>P</i><0.001), and more chatbot replies were rated \\\"empathetic\\\" or \\\"very empathetic\\\" (62.5% vs. 42.8%; <i>P</i><0.001). Overall response consistency was deemed satisfactory at 64.2%, with no significant difference in consistency or preference between periodontist and lay evaluators.</p><p><strong>Conclusions: </strong>AI-powered chatbots can deliver more accurate and empathetic answers than human periodontists, suggesting their potential role as consultation assistants merits further investigation. The high intraclass correlation coefficient values (0.79-0.93) indicate a high level of agreement among evaluators in both the periodontist and lay evaluator groups, thus confirming the reliability and robustness of the study's assessment methodology.</p>\",\"PeriodicalId\":48795,\"journal\":{\"name\":\"Journal of Periodontal and Implant Science\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":3.2000,\"publicationDate\":\"2025-05-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Periodontal and Implant Science\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://doi.org/10.5051/jpis.2402220111\",\"RegionNum\":4,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"DENTISTRY, ORAL SURGERY & MEDICINE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Periodontal and Implant Science","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.5051/jpis.2402220111","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"DENTISTRY, ORAL SURGERY & MEDICINE","Score":null,"Total":0}
Evaluating the quality and empathy of responses to patient questions on the Korean Academy of Periodontology's online question and answer section: a cross-sectional study comparing periodontists and an AI-powered chatbot.
Purpose: This study aimed to evaluate and compare the responses of an artificial intelligence (AI)-powered chatbot and professional periodontists to patient queries in periodontology and implantology, using the Korean Academy of Periodontology's (KAP) online question and answer (Q&A) section.
Methods: In this comparative cross-sectional study, we analyzed 219 patient-submitted periodontal and implant knowledge questions from the KAP online Q&A section. A panel of 10 evaluators (5 periodontists and 5 laypersons) rated both the periodontists' and the AI chatbot's responses using standardized scales. We applied the t-test and Spearman correlation coefficients to compare response quality, empathy, consistency, and evaluator preferences.
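As an illustration of how such a comparison could be run, the following Python sketch applies a t-test and a Spearman correlation to hypothetical per-question ratings. The column names, simulated data, and the choice of a paired test are assumptions for demonstration only, not the authors' actual analysis code.

```python
# Illustrative sketch only: hypothetical ratings, not the study's actual data or code.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_questions = 219  # number of patient questions analyzed in the study

# Hypothetical mean quality ratings (1-5 scale) per question for each response source.
periodontist_quality = rng.uniform(1, 5, n_questions)
chatbot_quality = rng.uniform(1, 5, n_questions)

# Paired t-test across the same questions (pairing is assumed; the abstract only says "t-test").
t_stat, p_value = stats.ttest_rel(chatbot_quality, periodontist_quality)

# Spearman correlation, e.g. to check how rankings from two rating sets relate.
rho, rho_p = stats.spearmanr(periodontist_quality, chatbot_quality)

print(f"paired t-test: t={t_stat:.2f}, P={p_value:.4f}")
print(f"Spearman rho={rho:.2f}, P={rho_p:.4f}")
```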
Results: Ten evaluators judged the AI chatbot's responses to be significantly superior in quality and empathy compared to periodontist replies. A higher proportion of periodontist responses fell below acceptable quality ("very poor" or "poor") than chatbot responses (28.7% vs. 15.0%; P<0.001), and more chatbot replies were rated "empathetic" or "very empathetic" (62.5% vs. 42.8%; P<0.001). Overall response consistency was deemed satisfactory at 64.2%, with no significant difference in consistency or preference between periodontist and lay evaluators.
Conclusions: AI-powered chatbots can deliver more accurate and empathetic answers than human periodontists, suggesting their potential role as consultation assistants merits further investigation. The high intraclass correlation coefficient values (0.79-0.93) indicate a high level of agreement among evaluators in both the periodontist and lay evaluator groups, thus confirming the reliability and robustness of the study's assessment methodology.
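For readers unfamiliar with intraclass correlation coefficients, the minimal sketch below shows one way inter-rater agreement of this kind can be estimated, using the pingouin library on hypothetical long-format rating data; it is not taken from the study and the data are invented.

```python
# Minimal ICC sketch on hypothetical data; not the study's code or data.
import pandas as pd
import pingouin as pg

# Long-format table: each row is one rater's score for one response.
ratings = pd.DataFrame({
    "response": [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "rater":    ["A", "B", "C"] * 3,
    "score":    [4, 5, 4, 2, 3, 2, 5, 5, 4],
})

icc = pg.intraclass_corr(data=ratings, targets="response",
                         raters="rater", ratings="score")
print(icc[["Type", "ICC"]])  # ICC values in the 0.79-0.93 range would indicate strong agreement
```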
Journal introduction:
Journal of Periodontal & Implant Science (JPIS) is a peer-reviewed, open-access journal providing up-to-date information relevant to the practice of periodontology and dental implantology. JPIS is dedicated to broad, global publication, including evidence-based original articles and fundamental reviews, in order to cover a wide range of interests in the fields of periodontal and implant science.