Title: Machine Learning-based Prediction of Postoperative 30-days Mortality
Authors: Linna Wang, Linji Li, T. Zhu, Congli Ma, Li Lu
DOI: 10.1145/3487075.3487130
Published in: Proceedings of the 5th International Conference on Computer Science and Application Engineering, October 19, 2021
Citations: 1
Abstract
Surgical patients aged 65 and over face a 2- to 10-fold higher risk of death after surgery. Early prediction of postoperative mortality is essential, as timely and appropriate treatment can improve survival outcomes. With advances in medical and computer technology, large volumes of health-related data can now be recorded for research. Among the many patient indicators that may affect prediction accuracy, it is necessary to identify highly relevant and efficient features. The aims of this study were to use machine learning algorithms, specifically bagging and boosting methods (e.g., Random Forest, eXtreme Gradient Boosting), to predict postoperative 30-day mortality in surgical patients aged over 65, and to identify the optimal features using a genetic algorithm (GA). The models were developed and validated on a prospective cohort drawn from the electronic health records (EHRs) of West China Hospital, Sichuan University, comprising 7467 surgical patients (0.924% mortality rate) who underwent surgery between July 1, 2019 and October 31, 2020. Compared with the traditional logistic regression model and the baseline ASA physical status (ASA-PS), XGBoost with tuned hyperparameters performed best using only the automatically obtained features (area under the curve [AUC] 0.9318, 95% confidence interval [CI] 0.9041-0.9594). Using only the baseline ASA-PS as input, XGBoost achieved an AUC of 0.6787 (95% CI 0.6471-0.7103). When both ASA-PS and the selected features were included as inputs, XGBoost achieved an AUC of 0.9345 (95% CI 0.9076-0.9613).
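The pipeline the abstract describes can be sketched as follows. This is an illustrative sketch only, not the authors' code: it runs on synthetic imbalanced data (standing in for the EHR cohort, whose features are not listed here), uses scikit-learn's GradientBoostingClassifier as a stand-in for XGBoost, and applies a tiny genetic algorithm over binary feature masks with illustrative population size, crossover, and mutation settings; the fitness function scores each mask by held-out AUC.

```python
# Hedged sketch (assumptions, not the paper's implementation): GA-based
# feature selection wrapped around a gradient-boosted classifier,
# scored by AUC, as the abstract describes.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic, heavily imbalanced cohort standing in for the EHR data.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=5,
                           weights=[0.97], random_state=0)
n_feat = X.shape[1]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)

def fitness(mask):
    """Held-out AUC of a boosted model trained on the masked features."""
    if not mask.any():
        return 0.0  # an empty feature set cannot be scored
    clf = GradientBoostingClassifier(n_estimators=50, random_state=0)
    clf.fit(X_tr[:, mask], y_tr)
    return roc_auc_score(y_te, clf.predict_proba(X_te[:, mask])[:, 1])

# Tiny GA: boolean masks, truncation selection, uniform crossover,
# bit-flip mutation. Sizes are kept small so the sketch runs quickly.
pop = rng.random((8, n_feat)) < 0.5
for gen in range(3):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[::-1][:4]]   # keep the top half
    children = []
    for _ in range(4):
        a, b = parents[rng.integers(4, size=2)]
        child = np.where(rng.random(n_feat) < 0.5, a, b)  # uniform crossover
        flip = rng.random(n_feat) < 0.05                  # bit-flip mutation
        children.append(child ^ flip)
    pop = np.vstack([parents, np.array(children)])

final_scores = [fitness(m) for m in pop]
best = pop[int(np.argmax(final_scores))]
best_auc = max(final_scores)
print(f"selected {int(best.sum())} features, AUC = {best_auc:.3f}")
```

In the study itself, the GA searched the full set of EHR-derived indicators and the final AUCs were reported with bootstrap-style 95% confidence intervals; this sketch omits both the real features and the CI computation.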