Authors: Gangani Dharmarathne, Thilini N. Jayasinghe, Madhusha Bogahawaththa, D.P.P. Meddage, Upaka Rathnayake
Journal: Healthcare Analytics (New York, N.Y.), Vol. 5, Article 100301
DOI: 10.1016/j.health.2024.100301
Published: 2024-01-17 (Journal Article)
Open-access PDF: https://www.sciencedirect.com/science/article/pii/S2772442524000030/pdfft?md5=494bc571d60d347c01d68d0c317c4288&pid=1-s2.0-S2772442524000030-main.pdf
A novel machine learning approach for diagnosing diabetes with a self-explainable interface
This study introduces the first self-explanatory interface for diagnosing diabetes using machine learning. We propose four classification models (Decision Tree (DT), K-Nearest Neighbor (KNN), Support Vector Classification (SVC), and Extreme Gradient Boosting (XGB)) trained on a publicly available diabetes dataset. To elucidate the inner workings of these models, we employed the machine learning interpretation method known as Shapley Additive Explanations (SHAP). All the models exhibited commendable accuracy in diagnosing patients with diabetes, with the XGB model showing a slight edge over the others. Utilising SHAP, we examined the XGB model in depth, providing granular insights into the reasoning behind its predictions. Subsequently, we integrated the XGB model and SHAP's local explanations into an interface that predicts diabetes in patients. This interface plays a critical role: it diagnoses patients and offers transparent explanations for its decisions, giving users a heightened awareness of their current health condition. Given the high-stakes nature of the medical field, the interface can be further enhanced with more extensive clinical data, ultimately aiding medical professionals in their decision-making processes.
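The model-comparison step described above can be sketched as follows. This is a minimal illustration, not the authors' code: scikit-learn's `GradientBoostingClassifier` stands in for XGBoost, and a synthetic binary-outcome dataset from `make_classification` stands in for the (unnamed) public diabetes dataset used in the study.

```python
# Hedged sketch: train the four classifier families named in the abstract
# and compare their held-out accuracy. Dataset and the XGB stand-in are
# assumptions for illustration only.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for diabetes screening features and a binary diagnosis.
X, y = make_classification(n_samples=600, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "DT": DecisionTreeClassifier(random_state=0),
    "KNN": KNeighborsClassifier(),
    "SVC": SVC(),
    "GB (XGB stand-in)": GradientBoostingClassifier(random_state=0),
}
scores = {
    name: accuracy_score(y_test, m.fit(X_train, y_train).predict(X_test))
    for name, m in models.items()
}
for name, acc in scores.items():
    print(f"{name}: {acc:.3f}")
```

In the paper itself, the best model (XGB) is then passed to a SHAP `TreeExplainer` so that each individual prediction can be decomposed into per-feature contributions, which is what powers the interface's local explanations.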