{"title":"Equity and AI governance at academic medical centers.","authors":"Paige Nong, Reema Hamasha, Jodyn Platt","doi":"10.37765/ajmc.2024.89555","DOIUrl":null,"url":null,"abstract":"<p><strong>Objectives: </strong>To understand whether and how equity is considered in artificial intelligence/machine learning governance processes at academic medical centers.</p><p><strong>Study design: </strong>Qualitative analysis of interview data.</p><p><strong>Methods: </strong>We created a database of academic medical centers from the full list of Association of American Medical Colleges hospital and health system members in 2022. Stratifying by census region and restricting to nonfederal and nonspecialty centers, we recruited chief medical informatics officers and similarly positioned individuals from academic medical centers across the country. We created and piloted a semistructured interview guide focused on (1) how academic medical centers govern artificial intelligence and prediction and (2) to what extent equity is considered in these processes. A total of 17 individuals representing 13 institutions across 4 census regions of the US were interviewed.</p><p><strong>Results: </strong>A minority of participants reported considering inequity, racism, or bias in governance. Most participants conceptualized these issues as characteristics of a tool, using frameworks such as algorithmic bias or fairness. Fewer participants conceptualized equity beyond the technology itself and asked broader questions about its implications for patients. Disparities in health information technology resources across health systems were repeatedly identified as a threat to health equity.</p><p><strong>Conclusions: </strong>We found a lack of consistent equity consideration among academic medical centers as they develop their governance processes for predictive technologies despite considerable national attention to the ways these technologies can cause or reproduce inequities. Health systems and policy makers will need to specifically prioritize equity literacy among health system leadership, design oversight policies, and promote critical engagement with these tools and their implications to prevent the further entrenchment of inequities in digital health care.</p>","PeriodicalId":50808,"journal":{"name":"American Journal of Managed Care","volume":null,"pages":null},"PeriodicalIF":2.5000,"publicationDate":"2024-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"American Journal of Managed Care","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.37765/ajmc.2024.89555","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"HEALTH CARE SCIENCES & SERVICES","Score":null,"Total":0}
Abstract
Objectives: To understand whether and how equity is considered in artificial intelligence/machine learning governance processes at academic medical centers.
Study design: Qualitative analysis of interview data.
Methods: We created a database of academic medical centers from the full list of Association of American Medical Colleges hospital and health system members in 2022. Stratifying by census region and restricting to nonfederal and nonspecialty centers, we recruited chief medical informatics officers and similarly positioned individuals from academic medical centers across the country. We created and piloted a semistructured interview guide focused on (1) how academic medical centers govern artificial intelligence and prediction and (2) to what extent equity is considered in these processes. A total of 17 individuals representing 13 institutions across 4 census regions of the US were interviewed.
Results: A minority of participants reported considering inequity, racism, or bias in governance. Most participants conceptualized these issues as characteristics of a tool, using frameworks such as algorithmic bias or fairness. Fewer participants conceptualized equity beyond the technology itself and asked broader questions about its implications for patients. Disparities in health information technology resources across health systems were repeatedly identified as a threat to health equity.
Conclusions: We found a lack of consistent equity consideration among academic medical centers as they develop their governance processes for predictive technologies despite considerable national attention to the ways these technologies can cause or reproduce inequities. Health systems and policy makers will need to specifically prioritize equity literacy among health system leadership, design oversight policies, and promote critical engagement with these tools and their implications to prevent the further entrenchment of inequities in digital health care.
About the journal
The American Journal of Managed Care is an independent, peer-reviewed publication dedicated to disseminating clinical information to managed care physicians, clinical decision makers, and other healthcare professionals. Its aim is to stimulate scientific communication in the ever-evolving field of managed care. The American Journal of Managed Care addresses a broad range of issues relevant to clinical decision making in a cost-constrained environment and examines the impact of clinical, management, and policy interventions and programs on healthcare and economic outcomes.