Ezer H Benaim, Samuel P O'Rourke, Margaret T Dillon
{"title":"What Do People Want to Know About Cochlear Implants: A Google Analytic Study.","authors":"Ezer H Benaim, Samuel P O'Rourke, Margaret T Dillon","doi":"10.1002/lary.31741","DOIUrl":null,"url":null,"abstract":"<p><strong>Objective: </strong>Identify the questions most frequently asked online about cochlear implants (CI) and assess the readability and quality of the content.</p><p><strong>Methods: </strong>A Google search engine observational study was conducted via a search response optimization (SEO) tool. The SEO tool listed the questions generated by Google's \"People Also Ask\" (PAA) feature for the search queries \"cochlear implant\" and \"cochlear implant surgery.\" The top 50 PAA questions for each query were conceptually classified. Sourced websites were evaluated for readability, transparency and information quality, and ability to answer the question. Readability and accuracy in answering questions were also compared to the responses from ChatGPT 3.5.</p><p><strong>Results: </strong>The PAA questions were commonly related to technical details (21%), surgical factors (18%), and postoperative experiences (12%). Sourced websites mainly were from academic institutions, followed by commercial companies. Among all types of websites, readability, on average, did not meet the recommended standard for health-related patient education materials. Only two websites were at or below the 8th-grade level. Responses by ChatGPT had significantly poorer readability compared to the websites (p < 0.001). These online resources were not significantly different in the percentage of accurately answering the questions (websites: 78%, ChatGPT: 85%, p = 0.136).</p><p><strong>Conclusions: </strong>The most searched topics were technical details about devices, surgical factors, and the postoperative experience. Unfortunately, most websites did not meet the ideal criteria of readability, quality, and credibility for patient education. 
These results highlight potential knowledge gaps for patients, deficits in current online education materials, and possible tools to better support CI candidate decision-making.</p><p><strong>Level of evidence: </strong>NA Laryngoscope, 135:840-847, 2025.</p>","PeriodicalId":49921,"journal":{"name":"Laryngoscope","volume":" ","pages":"840-847"},"PeriodicalIF":2.2000,"publicationDate":"2025-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11729566/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Laryngoscope","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1002/lary.31741","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/8/27 0:00:00","PubModel":"Epub","JCR":"Q3","JCRName":"MEDICINE, RESEARCH & EXPERIMENTAL","Score":null,"Total":0}
Citation count: 0
Abstract
Objective: Identify the questions most frequently asked online about cochlear implants (CI) and assess the readability and quality of the content.
Methods: An observational study of the Google search engine was conducted via a search engine optimization (SEO) tool. The SEO tool listed the questions generated by Google's "People Also Ask" (PAA) feature for the search queries "cochlear implant" and "cochlear implant surgery." The top 50 PAA questions for each query were conceptually classified. Sourced websites were evaluated for readability, transparency and information quality, and ability to answer the question. Readability and accuracy in answering the questions were also compared with responses from ChatGPT 3.5.
Results: The PAA questions most commonly related to technical details (21%), surgical factors (18%), and postoperative experiences (12%). Sourced websites were mainly from academic institutions, followed by commercial companies. Across all website types, readability, on average, did not meet the recommended standard for health-related patient education materials; only two websites were at or below the 8th-grade level. Responses from ChatGPT had significantly poorer readability than the websites (p < 0.001). The two resources did not differ significantly in the percentage of questions answered accurately (websites: 78%, ChatGPT: 85%, p = 0.136).
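The "8th-grade level" benchmark above refers to grade-level readability scores. The abstract does not state which formula the authors used, but such scores are typically computed with a formula like the Flesch-Kincaid Grade Level. A minimal sketch, using a crude vowel-group heuristic to estimate syllables (real readability tools use dictionary-based syllabification, so scores will differ slightly):

```python
import re

def count_syllables(word: str) -> int:
    """Rough syllable estimate: count runs of consecutive vowels."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def fk_grade(text: str) -> float:
    """Flesch-Kincaid Grade Level:
    0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return round(
        0.39 * len(words) / len(sentences)
        + 11.8 * syllables / len(words)
        - 15.59,
        1,
    )

# Hypothetical patient-education snippet, not taken from any studied website.
sample = ("A cochlear implant bypasses damaged hair cells. "
          "It stimulates the auditory nerve directly.")
print(fk_grade(sample))
```

A score above 8.0 would fail the 8th-grade standard cited in the Results, which illustrates how even short, clinical-sounding sentences can exceed the recommended reading level.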
Conclusions: The most searched topics were technical details about devices, surgical factors, and the postoperative experience. Unfortunately, most websites did not meet the ideal criteria of readability, quality, and credibility for patient education. These results highlight potential knowledge gaps for patients, deficits in current online education materials, and possible tools to better support CI candidate decision-making.
Level of evidence: NA. Laryngoscope, 135:840-847, 2025.
Journal introduction:
The Laryngoscope has been the leading source of information on advances in the diagnosis and treatment of head and neck disorders since 1890. The Laryngoscope is the first choice among otolaryngologists for publication of their important findings and techniques. Each monthly issue of The Laryngoscope features peer-reviewed medical, clinical, and research contributions in general otolaryngology, allergy/rhinology, otology/neurotology, laryngology/bronchoesophagology, head and neck surgery, sleep medicine, pediatric otolaryngology, facial plastics and reconstructive surgery, oncology, and communicative disorders. Contributions include papers and posters presented at the Annual and Section Meetings of the Triological Society, as well as independent papers, "How I Do It", "Triological Best Practice" articles, and contemporary reviews. Theses authored by the Triological Society’s new Fellows as well as papers presented at meetings of the American Laryngological Association are published in The Laryngoscope.
• Broncho-esophagology
• Communicative disorders
• Head and neck surgery
• Plastic and reconstructive facial surgery
• Oncology
• Speech and hearing defects