{"title":"ChatGPT 是否了解急性冠状动脉综合征和相关的欧洲心脏病学会指南?","authors":"Dogac C Gurbuz, Eser Varis","doi":"10.23736/S2724-5683.24.06517-7","DOIUrl":null,"url":null,"abstract":"<p><strong>Background: </strong>Advancements in artificial intelligence are being seen in multiple fields, including medicine, and this trend is likely to continue going forward. To analyze the accuracy and reproducibility of ChatGPT answers about acute coronary syndromes (ACS).</p><p><strong>Methods: </strong>The questions asked to ChatGPT were prepared in two categories. A list of frequently asked questions (FAQs) created from inquiries asked by the public and while preparing the scientific question list, 2023 European Society of Cardiology (ESC) Guidelines for the management of ACS and ESC Clinical Practice Guidelines were used. Accuracy and reproducibility of ChatGPT responses about ACS were evaluated by two cardiologists with ten years of experience using Global Quality Score (GQS).</p><p><strong>Results: </strong>Eventually, 72 FAQs related to ACS met the study inclusion criteria. In total, 65 (90.3%) ChatGPT answers scored GQS 5, which indicated highest accuracy and proficiency. None of the ChatGPT responses to FAQs about ACS scored GQS 1. In addition, highest accuracy and reliability of ChatGPT answers was obtained for the prevention and lifestyle section with GQS 5 for 19 (95%) answers, and GQS 4 for 1 (5%) answer. In contrast, accuracy and proficiency of ChatGPT answers were lowest for the treatment and management section. Moreover, 68 (88.3%) ChatGPT responses for guideline based questions scored GQS 5. Reproducibility of ChatGPT answers was 94.4% for FAQs and 90.9% for ESC guidelines questions.</p><p><strong>Conclusions: </strong>This study shows for the first time that ChatGPT can give accurate and sufficient responses to more than 90% of FAQs about ACS. In addition, proficiency and correctness of ChatGPT answers about questions depending on ESC guidelines was also substantial.</p>","PeriodicalId":18668,"journal":{"name":"Minerva cardiology and angiology","volume":null,"pages":null},"PeriodicalIF":1.4000,"publicationDate":"2024-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Is ChatGPT knowledgeable of acute coronary syndromes and pertinent European Society of Cardiology Guidelines?\",\"authors\":\"Dogac C Gurbuz, Eser Varis\",\"doi\":\"10.23736/S2724-5683.24.06517-7\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Background: </strong>Advancements in artificial intelligence are being seen in multiple fields, including medicine, and this trend is likely to continue going forward. To analyze the accuracy and reproducibility of ChatGPT answers about acute coronary syndromes (ACS).</p><p><strong>Methods: </strong>The questions asked to ChatGPT were prepared in two categories. A list of frequently asked questions (FAQs) created from inquiries asked by the public and while preparing the scientific question list, 2023 European Society of Cardiology (ESC) Guidelines for the management of ACS and ESC Clinical Practice Guidelines were used. Accuracy and reproducibility of ChatGPT responses about ACS were evaluated by two cardiologists with ten years of experience using Global Quality Score (GQS).</p><p><strong>Results: </strong>Eventually, 72 FAQs related to ACS met the study inclusion criteria. In total, 65 (90.3%) ChatGPT answers scored GQS 5, which indicated highest accuracy and proficiency. 
None of the ChatGPT responses to FAQs about ACS scored GQS 1. In addition, highest accuracy and reliability of ChatGPT answers was obtained for the prevention and lifestyle section with GQS 5 for 19 (95%) answers, and GQS 4 for 1 (5%) answer. In contrast, accuracy and proficiency of ChatGPT answers were lowest for the treatment and management section. Moreover, 68 (88.3%) ChatGPT responses for guideline based questions scored GQS 5. Reproducibility of ChatGPT answers was 94.4% for FAQs and 90.9% for ESC guidelines questions.</p><p><strong>Conclusions: </strong>This study shows for the first time that ChatGPT can give accurate and sufficient responses to more than 90% of FAQs about ACS. In addition, proficiency and correctness of ChatGPT answers about questions depending on ESC guidelines was also substantial.</p>\",\"PeriodicalId\":18668,\"journal\":{\"name\":\"Minerva cardiology and angiology\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":1.4000,\"publicationDate\":\"2024-06-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Minerva cardiology and angiology\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://doi.org/10.23736/S2724-5683.24.06517-7\",\"RegionNum\":4,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2024/2/23 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"Q3\",\"JCRName\":\"CARDIAC & CARDIOVASCULAR SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Minerva cardiology and angiology","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.23736/S2724-5683.24.06517-7","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/2/23 0:00:00","PubModel":"Epub","JCR":"Q3","JCRName":"CARDIAC & CARDIOVASCULAR SYSTEMS","Score":null,"Total":0}
Is ChatGPT knowledgeable of acute coronary syndromes and pertinent European Society of Cardiology Guidelines?
Background: Advances in artificial intelligence are being seen in multiple fields, including medicine, and this trend is likely to continue. The aim of this study was to analyze the accuracy and reproducibility of ChatGPT answers about acute coronary syndromes (ACS).
Methods: The questions posed to ChatGPT were prepared in two categories: a list of frequently asked questions (FAQs) compiled from inquiries made by the public, and a scientific question list based on the 2023 European Society of Cardiology (ESC) Guidelines for the management of ACS and the ESC Clinical Practice Guidelines. The accuracy and reproducibility of ChatGPT responses about ACS were evaluated by two cardiologists, each with ten years of experience, using the Global Quality Score (GQS).
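As a rough illustration of the kind of analysis described above, the following Python sketch tallies GQS ratings into a score distribution and an agreement-based reproducibility figure. The example data, variable names, and the agreement rule (two regenerated answers counted as reproducible when they receive the same GQS) are illustrative assumptions only; the study's actual scoring data and procedure are not published here.

from collections import Counter

# Hypothetical GQS ratings (1-5) per question, as agreed by the two reviewers,
# for the first and the regenerated ChatGPT response. Illustrative values only.
first_pass_gqs = [5, 5, 4, 5, 3, 5, 5, 4, 5, 5]
second_pass_gqs = [5, 5, 4, 5, 4, 5, 5, 4, 5, 5]

def gqs_distribution(scores):
    """Return the share of answers at each GQS level (1-5)."""
    counts = Counter(scores)
    total = len(scores)
    return {level: counts.get(level, 0) / total for level in range(1, 6)}

def reproducibility(first, second):
    """Share of questions whose two responses received the same GQS
    (assumed agreement rule; the paper may define reproducibility differently)."""
    matches = sum(a == b for a, b in zip(first, second))
    return matches / len(first)

if __name__ == "__main__":
    dist = gqs_distribution(first_pass_gqs)
    print(f"GQS 5 share: {dist[5]:.1%}")
    print(f"Reproducibility: {reproducibility(first_pass_gqs, second_pass_gqs):.1%}")

From such a tally, proportions like the GQS-5 share and the reproducibility percentages reported below follow directly from the raw ratings.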
Results: In total, 72 FAQs related to ACS met the study inclusion criteria. Of these, 65 (90.3%) ChatGPT answers scored GQS 5, indicating the highest accuracy and proficiency, and none scored GQS 1. The accuracy and reliability of ChatGPT answers were highest for the prevention and lifestyle section, with 19 (95%) answers scoring GQS 5 and 1 (5%) answer scoring GQS 4, and lowest for the treatment and management section. For guideline-based questions, 68 (88.3%) ChatGPT responses scored GQS 5. Reproducibility of ChatGPT answers was 94.4% for FAQs and 90.9% for ESC guideline questions.
Conclusions: This study shows for the first time that ChatGPT can give accurate and sufficient responses to more than 90% of FAQs about ACS. In addition, the proficiency and correctness of ChatGPT answers to questions based on the ESC guidelines were also substantial.