{"title":"用于临时嵌入运行状况结果预测的解码器转换器","authors":"O. Boursalie, Reza Samavi, T. Doyle","doi":"10.1109/ICMLA52953.2021.00235","DOIUrl":null,"url":null,"abstract":"Deep learning models are increasingly being used to predict patients’ diagnoses by analyzing electronic health records. Medical records represent observations of a patient’s health over time. A commonly used approach to analyze health records is to encode them as a sequence of ordered diagnoses (diagnostic-level encoding). Transformer models then analyze the sequence of diagnoses to learn disease patterns. However, the elapsed time between medical visits is not considered when transformers are used to analyze health records. In this paper, we present DT-THRE: Decoder Transformer for Temporally-Embedded Health Records Encoding that predicts patients’ diagnoses by analyzing their medical histories. In DTTHRE, instead of diagnostic-level encoding, we propose an encoding representation for health records called THRE: Temporally-Embedded Health Records Encoding. THRE encodes patient histories as a sequence of medical events such as age, sex, and diagnostic embedding while incorporating the elapsed time between visits. We evaluate a proof-of-concept DTTHRE on a real-world medical dataset and compare our model’s performance to an existing diagnostic transformer model in the literature. DTTHRE was successful on a medical dataset to predict patients’ final diagnosis with improved predictive performance (78.54± 0.22%) compared to the existing model in the literature (40.51± 0.13%).","PeriodicalId":6750,"journal":{"name":"2021 20th IEEE International Conference on Machine Learning and Applications (ICMLA)","volume":"6 1","pages":"1461-1467"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Decoder Transformer for Temporally-Embedded Health Outcome Predictions\",\"authors\":\"O. Boursalie, Reza Samavi, T. Doyle\",\"doi\":\"10.1109/ICMLA52953.2021.00235\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Deep learning models are increasingly being used to predict patients’ diagnoses by analyzing electronic health records. Medical records represent observations of a patient’s health over time. A commonly used approach to analyze health records is to encode them as a sequence of ordered diagnoses (diagnostic-level encoding). Transformer models then analyze the sequence of diagnoses to learn disease patterns. However, the elapsed time between medical visits is not considered when transformers are used to analyze health records. In this paper, we present DT-THRE: Decoder Transformer for Temporally-Embedded Health Records Encoding that predicts patients’ diagnoses by analyzing their medical histories. In DTTHRE, instead of diagnostic-level encoding, we propose an encoding representation for health records called THRE: Temporally-Embedded Health Records Encoding. THRE encodes patient histories as a sequence of medical events such as age, sex, and diagnostic embedding while incorporating the elapsed time between visits. We evaluate a proof-of-concept DTTHRE on a real-world medical dataset and compare our model’s performance to an existing diagnostic transformer model in the literature. 
DTTHRE was successful on a medical dataset to predict patients’ final diagnosis with improved predictive performance (78.54± 0.22%) compared to the existing model in the literature (40.51± 0.13%).\",\"PeriodicalId\":6750,\"journal\":{\"name\":\"2021 20th IEEE International Conference on Machine Learning and Applications (ICMLA)\",\"volume\":\"6 1\",\"pages\":\"1461-1467\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 20th IEEE International Conference on Machine Learning and Applications (ICMLA)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICMLA52953.2021.00235\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 20th IEEE International Conference on Machine Learning and Applications (ICMLA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICMLA52953.2021.00235","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Decoder Transformer for Temporally-Embedded Health Outcome Predictions
Deep learning models are increasingly being used to predict patients’ diagnoses by analyzing electronic health records. Medical records represent observations of a patient’s health over time. A commonly used approach to analyzing health records is to encode them as a sequence of ordered diagnoses (diagnostic-level encoding). Transformer models then analyze the sequence of diagnoses to learn disease patterns. However, the elapsed time between medical visits is not considered when transformers are used to analyze health records. In this paper, we present DTTHRE: a Decoder Transformer for Temporally-Embedded Health Records Encoding that predicts patients’ diagnoses by analyzing their medical histories. In DTTHRE, instead of diagnostic-level encoding, we propose an encoding representation for health records called THRE: Temporally-Embedded Health Records Encoding. THRE encodes a patient’s history as a sequence of medical events, such as age, sex, and diagnostic embeddings, while incorporating the elapsed time between visits. We evaluate a proof-of-concept DTTHRE on a real-world medical dataset and compare our model’s performance to an existing diagnostic transformer model in the literature. On the medical dataset, DTTHRE successfully predicted patients’ final diagnoses with improved predictive performance (78.54 ± 0.22%) compared to the existing model in the literature (40.51 ± 0.13%).
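To make the encoding idea in the abstract concrete, below is a minimal, hypothetical PyTorch sketch of a temporally-embedded record representation fed to a causal (decoder-only) transformer. It assumes each visit is described by a diagnosis code, the patient’s age and sex, and the elapsed days since the previous visit, all mapped into a shared embedding space. The class and parameter names (`THRESketch`, `time_proj`, the vocabulary sizes and dimensions) are illustrative assumptions, not the authors’ published implementation.

```python
import torch
import torch.nn as nn


class THRESketch(nn.Module):
    """Toy temporally-embedded health-record encoder + causal transformer (illustrative only)."""

    def __init__(self, num_diagnoses=500, d_model=64, nhead=4,
                 num_layers=2, max_age=120):
        super().__init__()
        # Event-level embeddings: diagnosis code, patient sex, age at visit.
        self.diag_emb = nn.Embedding(num_diagnoses, d_model)
        self.sex_emb = nn.Embedding(2, d_model)
        self.age_emb = nn.Embedding(max_age + 1, d_model)
        # Elapsed time between visits (in days) projected into the same space,
        # standing in for a fixed positional encoding.
        self.time_proj = nn.Linear(1, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, num_diagnoses)

    def forward(self, diag_ids, sex_ids, ages, delta_days):
        # diag_ids, ages, delta_days: (batch, seq); sex_ids: (batch,)
        x = (self.diag_emb(diag_ids)
             + self.age_emb(ages)
             + self.sex_emb(sex_ids).unsqueeze(1)            # broadcast over visits
             + self.time_proj(delta_days.unsqueeze(-1).float()))
        # Causal mask: each visit attends only to itself and earlier visits,
        # giving a decoder-only (GPT-style) transformer.
        seq_len = diag_ids.size(1)
        causal = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool), 1)
        h = self.backbone(x, mask=causal)
        # Predict the final diagnosis from the last visit's representation.
        return self.head(h[:, -1])


# Toy usage: a batch of 2 patients with 5 visits each.
model = THRESketch()
logits = model(torch.randint(0, 500, (2, 5)),   # diagnosis codes per visit
               torch.randint(0, 2, (2,)),        # sex
               torch.randint(0, 90, (2, 5)),     # age at each visit
               torch.randint(0, 365, (2, 5)))    # days since previous visit
print(logits.shape)  # torch.Size([2, 500])
```

The design point the sketch tries to capture is that the inter-visit time gap is injected as a learned temporal signal alongside the event embeddings, rather than relying on token order alone as in diagnostic-level encoding.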