Artificial intelligence in medical device software and high-risk medical devices - a review of definitions, expert recommendations and regulatory initiatives.
Alan G Fraser, Elisabetta Biasin, Bart Bijnens, Nico Bruining, Enrico G Caiani, Koen Cobbaert, Rhodri H Davies, Stephen H Gilbert, Leo Hovestadt, Erik Kamenjasevic, Zuzanna Kwade, Gearóid McGauran, Gearóid O'Connor, Baptiste Vasey, Frank E Rademakers
{"title":"Artificial intelligence in medical device software and high-risk medical devices - a review of definitions, expert recommendations and regulatory initiatives.","authors":"Alan G Fraser, Elisabetta Biasin, Bart Bijnens, Nico Bruining, Enrico G Caiani, Koen Cobbaert, Rhodri H Davies, Stephen H Gilbert, Leo Hovestadt, Erik Kamenjasevic, Zuzanna Kwade, Gearóid McGauran, Gearóid O'Connor, Baptiste Vasey, Frank E Rademakers","doi":"10.1080/17434440.2023.2184685","DOIUrl":null,"url":null,"abstract":"<p><strong>Introduction: </strong>Artificial intelligence (AI) encompasses a wide range of algorithms with risks when used to support decisions about diagnosis or treatment, so professional and regulatory bodies are recommending how they should be managed.</p><p><strong>Areas covered: </strong>AI systems may qualify as standalone medical device software (MDSW) or be embedded within a medical device. Within the European Union (EU) AI software must undergo a conformity assessment procedure to be approved as a medical device. The draft EU Regulation on AI proposes rules that will apply across industry sectors, while for devices the Medical Device Regulation also applies. In the CORE-MD project (Coordinating Research and Evidence for Medical Devices), we have surveyed definitions and summarize initiatives made by professional consensus groups, regulators, and standardization bodies.</p><p><strong>Expert opinion: </strong>The level of clinical evidence required should be determined according to each application and to legal and methodological factors that contribute to risk, including accountability, transparency, and interpretability. EU guidance for MDSW based on international recommendations does not yet describe the clinical evidence needed for medical AI software. Regulators, notified bodies, manufacturers, clinicians and patients would all benefit from common standards for the clinical evaluation of high-risk AI applications and transparency of their evidence and performance.</p>","PeriodicalId":12330,"journal":{"name":"Expert Review of Medical Devices","volume":"20 6","pages":"467-491"},"PeriodicalIF":2.9000,"publicationDate":"2023-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Expert Review of Medical Devices","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1080/17434440.2023.2184685","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENGINEERING, BIOMEDICAL","Score":null,"Total":0}
Citations: 4
Abstract
Introduction: Artificial intelligence (AI) encompasses a wide range of algorithms that carry risks when used to support decisions about diagnosis or treatment, so professional and regulatory bodies are recommending how they should be managed.
Areas covered: AI systems may qualify as standalone medical device software (MDSW) or be embedded within a medical device. Within the European Union (EU), AI software must undergo a conformity assessment procedure to be approved as a medical device. The draft EU Regulation on AI proposes rules that will apply across industry sectors, while for devices the Medical Device Regulation also applies. In the CORE-MD project (Coordinating Research and Evidence for Medical Devices), we have surveyed definitions and summarized initiatives made by professional consensus groups, regulators, and standardization bodies.
Expert opinion: The level of clinical evidence required should be determined according to each application and to legal and methodological factors that contribute to risk, including accountability, transparency, and interpretability. EU guidance for MDSW based on international recommendations does not yet describe the clinical evidence needed for medical AI software. Regulators, notified bodies, manufacturers, clinicians and patients would all benefit from common standards for the clinical evaluation of high-risk AI applications and transparency of their evidence and performance.
Journal description:
The journal serves the device research community by providing a comprehensive body of high-quality information from leading experts, all subject to rigorous peer review. The Expert Review format is specially structured to optimize the value of the information to the reader. Comprehensive coverage by each author in a key area of research or clinical practice is augmented by the following sections:
Expert commentary - a personal view on the most effective or promising strategies
Five-year view - a clear perspective of future prospects within a realistic timescale
Key issues - an executive summary cutting to the author's most critical points
In addition to the Review program, each issue also features Medical Device Profiles - objective assessments of specific devices in development or clinical use to help inform clinical practice. There are also Perspectives - overviews highlighting areas of current debate and controversy, together with reports from the conference scene and invited Editorials.