Early-Warning of Peri-operative Critical Event Based on Multimodal Information Fusion
Yuwen Chen, Yu-jie Li, Wei Huang, Ju Zhang, Bin Yi, Xiaolin Qin
Proceedings of the 2021 International Conference on Intelligent Medicine and Health (13 August 2021). DOI: 10.1145/3484377.3484389
The occurrence of perioperative critical adverse events degrades the quality of medical care and threatens patient safety. Assessing the risk of critical illness in the perioperative period with scientific methods is therefore of great significance for improving the quality of medical service and ensuring patient safety. However, the diagnosis and treatment data of perioperative patients are multi-source and irregularly sampled, and a single physiological signal cannot accurately reflect a patient's condition. Previous studies have found that multiple kinds of physiological information together convey the state of human health and can be used to evaluate critical illness and physical condition. Therefore, this paper integrates preoperative structured clinical data, intraoperative vital-sign monitoring time series, and intraoperative anesthesia-event time series. Using deep learning, the multimodal patient data are embedded and mapped into a shared latent semantic space to enable real-time tracking and early warning of critical events, reduce postoperative complications, and improve the efficiency of early diagnosis of critical adverse events. The results show that the model built on the multimodal data performs better than models built on single-modality data.
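The paper does not include its implementation, so the following is only a minimal sketch of the kind of fusion architecture the abstract describes: three modality-specific encoders map the preoperative structured data, the vital-sign time series, and the anesthesia-event sequence into one shared latent space, and a classifier scores the risk of a critical event. The use of PyTorch, the choice of GRU encoders, and all names and dimensions are illustrative assumptions, not the authors' design.

```python
# Hypothetical sketch (not the authors' code): fuse three perioperative
# modalities in a shared latent space and output a critical-event risk score.
import torch
import torch.nn as nn


class MultimodalEarlyWarning(nn.Module):
    def __init__(self, n_static=32, n_vitals=8, n_events=16, d_latent=64):
        super().__init__()
        # Preoperative structured clinical data: simple MLP encoder.
        self.static_enc = nn.Sequential(
            nn.Linear(n_static, d_latent), nn.ReLU(),
            nn.Linear(d_latent, d_latent),
        )
        # Intraoperative vital-sign time series: recurrent encoder.
        self.vitals_enc = nn.GRU(n_vitals, d_latent, batch_first=True)
        # Intraoperative anesthesia events: token embedding + recurrent encoder.
        self.event_emb = nn.Embedding(n_events, d_latent)
        self.event_enc = nn.GRU(d_latent, d_latent, batch_first=True)
        # Fusion head over the concatenated latent vectors.
        self.head = nn.Sequential(
            nn.Linear(3 * d_latent, d_latent), nn.ReLU(),
            nn.Linear(d_latent, 1),
        )

    def forward(self, static_x, vitals_x, event_ids):
        z_static = self.static_enc(static_x)                    # (B, d)
        _, h_vitals = self.vitals_enc(vitals_x)                 # (1, B, d)
        _, h_events = self.event_enc(self.event_emb(event_ids)) # (1, B, d)
        z = torch.cat([z_static, h_vitals[-1], h_events[-1]], dim=-1)
        return torch.sigmoid(self.head(z))                      # risk in [0, 1]


if __name__ == "__main__":
    model = MultimodalEarlyWarning()
    static_x = torch.randn(4, 32)               # preoperative features
    vitals_x = torch.randn(4, 120, 8)           # 120 steps of 8 vital signs
    event_ids = torch.randint(0, 16, (4, 30))   # 30 anesthesia-event tokens
    print(model(static_x, vitals_x, event_ids).shape)  # torch.Size([4, 1])
```

A design along these lines makes the "shared latent space" idea concrete: each modality is reduced to a fixed-length vector of the same dimension before fusion, so the risk head never has to handle the raw, irregularly sampled inputs directly.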