GPT-4 as a Source of Patient Information for Anterior Cervical Discectomy and Fusion: A Comparative Analysis Against Google Web Search.

IF 4.6 · Q2 (Materials Science, Biomaterials)
ACS Applied Bio Materials · Pub Date: 2024-11-01 · Epub Date: 2024-03-21 · DOI: 10.1177/21925682241241241
Paul G Mastrokostas, Leonidas E Mastrokostas, Ahmed K Emara, Ian J Wellington, Elizabeth Ginalis, John K Houten, Amrit S Khalsa, Ahmed Saleh, Afshin E Razi, Mitchell K Ng
Citations: 0

Abstract

Study design: Comparative study.

Objectives: This study aims to compare Google and GPT-4 in terms of (1) question types, (2) response readability, (3) source quality, and (4) numerical response accuracy for the top 10 most frequently asked questions (FAQs) about anterior cervical discectomy and fusion (ACDF).

Methods: "Anterior cervical discectomy and fusion" was searched on Google and GPT-4 on December 18, 2023. The top 10 FAQs were classified according to the Rothwell system. Source quality was evaluated using JAMA benchmark criteria, and readability was assessed using the Flesch Reading Ease and Flesch-Kincaid grade level formulas. Differences in JAMA scores, Flesch-Kincaid grade level, Flesch Reading Ease, and word count between platforms were analyzed using Student's t-tests. Statistical significance was set at the .05 level.
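The two readability measures named above have fixed published formulas, so the scoring step can be reproduced directly. A minimal Python sketch follows; the vowel-group syllable counter is a crude heuristic of my own (production readability tools use dictionary-based syllable counts), so exact scores will differ slightly from dedicated software:

```python
import re

def count_syllables(word):
    # Crude heuristic: count runs of consecutive vowels, dropping a trailing
    # silent "e" when the word has more than one vowel group.
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def readability(text):
    """Return (Flesch Reading Ease, Flesch-Kincaid grade level) for a text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    wps = len(words) / len(sentences)                     # words per sentence
    spw = sum(count_syllables(w) for w in words) / len(words)  # syllables per word
    fre = 206.835 - 1.015 * wps - 84.6 * spw
    fkgl = 0.39 * wps + 11.8 * spw - 15.59
    return fre, fkgl
```

Lower Flesch Reading Ease and higher Flesch-Kincaid grade both indicate harder text, which is why the two results in this study point the same direction.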

Results: Frequently asked questions from Google were varied, while GPT-4 focused on technical details and indications/management. GPT-4 showed a higher Flesch-Kincaid grade level (12.96 vs 9.28, P = .003), lower Flesch Reading Ease score (37.07 vs 54.85, P = .005), and higher JAMA scores for source quality (3.333 vs 1.800, P = .016). Numerically, 6 out of 10 responses varied between platforms, with GPT-4 providing broader recovery timelines for ACDF.
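The abstract reports only group means, but the Student's t-test it names is straightforward to sketch. Below is a self-contained pooled-variance (equal variances assumed) implementation; the per-question Flesch-Kincaid scores are invented for illustration, chosen only so their means match the reported 12.96 vs 9.28:

```python
from math import sqrt
from statistics import mean, variance

def students_t(a, b):
    """Two-sample Student's t statistic with pooled variance
    (equal variances assumed). Returns (t, degrees of freedom)."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    t = (mean(a) - mean(b)) / sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2

# Hypothetical per-question grade levels (not the study's raw data).
gpt4_fkgl = [12.1, 13.4, 12.8, 13.9, 12.5, 13.2, 12.9, 13.6, 12.4, 12.8]
google_fkgl = [9.0, 9.8, 8.7, 9.5, 10.1, 8.9, 9.4, 9.6, 8.8, 9.0]
t_stat, df = students_t(gpt4_fkgl, google_fkgl)
```

A two-sided p-value would then come from the t distribution with `df` degrees of freedom (e.g. `2 * scipy.stats.t.sf(abs(t_stat), df)`).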

Conclusions: This study demonstrates GPT-4's ability to elevate patient education by providing high-quality, diverse information tailored to those with advanced literacy levels. As AI technology evolves, refining these tools for accuracy and user-friendliness remains crucial, catering to patients' varying literacy levels and information needs in spine surgery.
