Health Literacy Analytics of Accessible Patient Resources in Cardiovascular Medicine: What are Patients Wanting to Know?

Kansas Journal of Medicine. Pub Date: 2023-12-31 (eCollection: 2023-01-01). DOI: 10.17161/kjm.vol16.20554
Som P Singh, Aarya Ramprasad, Anh Luu, Rohma Zaidi, Zoya Siddiqui, Trung Pham
{"title":"Health Literacy Analytics of Accessible Patient Resources in Cardiovascular Medicine: What are Patients Wanting to Know?","authors":"Som P Singh, Aarya Ramprasad, Anh Luu, Rohma Zaidi, Zoya Siddiqui, Trung Pham","doi":"10.17161/kjm.vol16.20554","DOIUrl":null,"url":null,"abstract":"<p><strong>Introduction: </strong>There remains an increasing utilization of internet-based resources as a first line of medical knowledge. Among patients with cardiovascular disease, these resources often are relied upon for numerous diagnostic and therapeutic modalities. However, the reliability of this information is not fully understood. The aim of this study was to provide a descriptive profile on the literacy quality, readability, and transparency of publicly available educational resources in cardiology.</p><p><strong>Methods: </strong>The frequently asked questions and associated online educational articles on common cardiovascular diagnostic and therapeutic interventions were investigated using publicly available data from the Google RankBrain machine learning algorithm after applying inclusion and exclusion criteria. Independent raters evaluated questions for Rothwell's Classification and readability calculations.</p><p><strong>Results: </strong>Collectively, 520 questions and articles were evaluated across 13 cardiac interventions, resulting in 3,120 readability scores. The sources of articles were most frequently from academic institutions followed by commercial sources. Most questions were classified as \"Fact\" at 76.0% (n = 395), and questions regarding \"Technical Details\" of each intervention were the most common subclassification at 56.3% (n = 293).</p><p><strong>Conclusions: </strong>Our data show that patients most often are using online search query programs to seek information regarding specific knowledge of each cardiovascular intervention rather than form an evaluation of the intervention. Additionally, these online patient educational resources continue to not meet grade-level reading recommendations.</p>","PeriodicalId":94121,"journal":{"name":"Kansas journal of medicine","volume":"16 ","pages":"309-315"},"PeriodicalIF":0.0000,"publicationDate":"2023-12-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10829858/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Kansas journal of medicine","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.17161/kjm.vol16.20554","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2023/1/1 0:00:00","PubModel":"eCollection","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Introduction: Internet-based resources are increasingly used as a first line of medical knowledge. Patients with cardiovascular disease often rely on these resources for information about numerous diagnostic and therapeutic modalities, yet the reliability of this information is not fully understood. The aim of this study was to provide a descriptive profile of the literacy quality, readability, and transparency of publicly available patient educational resources in cardiology.

Methods: Frequently asked questions and their associated online educational articles on common cardiovascular diagnostic and therapeutic interventions were identified from publicly available output of the Google RankBrain machine learning algorithm after applying inclusion and exclusion criteria. Independent raters evaluated each question under Rothwell's Classification and performed readability calculations on the associated articles.
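
The abstract does not specify which readability indices or tools the raters used, so the following Python snippet is only a minimal illustrative sketch of one widely used index, the Flesch-Kincaid Grade Level, computed with a naive syllable heuristic. The helper functions and the sample sentence are assumptions introduced here for illustration and are not part of the study's methodology.

import re


def count_syllables(word: str) -> int:
    """Rough syllable estimate: count vowel groups, discounting a trailing silent 'e'."""
    word = word.lower()
    vowel_groups = re.findall(r"[aeiouy]+", word)
    count = len(vowel_groups)
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)


def flesch_kincaid_grade(text: str) -> float:
    """Flesch-Kincaid Grade Level:
    0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)


if __name__ == "__main__":
    # Hypothetical patient-education sentence, used only to illustrate the calculation.
    sample = ("A coronary angiogram uses contrast dye and X-ray imaging "
              "to look at the blood vessels that supply the heart.")
    print(f"Flesch-Kincaid Grade Level: {flesch_kincaid_grade(sample):.1f}")

With this crude heuristic, the sample sentence scores at roughly a tenth-grade reading level, above the sixth- to eighth-grade targets commonly recommended for patient-facing materials, which is the kind of gap the study quantifies.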

Results: Collectively, 520 questions and their associated articles were evaluated across 13 cardiac interventions, yielding 3,120 readability scores. Articles most frequently originated from academic institutions, followed by commercial sources. Most questions were classified as "Fact" (76.0%, n = 395), and questions about the "Technical Details" of each intervention were the most common subclassification (56.3%, n = 293).
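
A brief arithmetic note (an inference from the reported counts, not stated explicitly in the abstract): 520 question-article pairs across 13 interventions corresponds to 40 pairs per intervention, and 3,120 readability scores over 520 pairs corresponds to 6 scores per pair, consistent with roughly six readability indices being applied to each article.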

Conclusions: Our data show that patients most often use online search engines to seek specific knowledge about each cardiovascular intervention rather than to form an evaluation of the intervention. Additionally, these online patient educational resources continue to fall short of grade-level reading recommendations.
