HEalthRecordBERT (HERBERT): leveraging transformers on electronic health records for chronic kidney disease risk stratification
Alex Moore, B. Orset, A. Yassaee, Benjamin Irving, Davide Morelli
ACM Transactions on Computing for Healthcare, published 2024-07-19. DOI: 10.1145/3665899 (https://doi.org/10.1145/3665899)
Abstract
Risk stratification is an essential tool in the fight against many diseases, including chronic kidney disease. Recent work has focused on applying machine learning techniques and leveraging the information contained in a patient's electronic health record (EHR). Irregular intervals between data entries and the large number of variables tracked in EHR datasets can make them challenging to work with. Many of the difficulties associated with these datasets can be overcome by using large language models, such as bidirectional encoder representations from transformers (BERT). Previous attempts to apply BERT to EHRs for risk stratification have shown promise. In this work, we propose HERBERT, a novel application of BERT to EHR data. We identify two key areas where BERT models must be modified to adapt them to EHR data, namely the embedding layer and the pretraining task. We show how changes to these can lead to improved performance relative to the previous state of the art. We evaluate our model by predicting the transition of chronic kidney disease patients to end-stage renal disease. The strong performance of our model justifies our architectural changes and suggests that large language models could play an important role in future renal risk stratification.
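To make the embedding-layer modification mentioned in the abstract concrete, the sketch below shows one plausible way an EHR-adapted BERT might embed an input sequence: summing learned embeddings for the clinical code, the patient's age at the time of the entry, and the visit index, in place of BERT's standard token, segment, and position embeddings. This is an illustration in the style of prior EHR transformers, not the authors' implementation; the class name EHREmbedding, the dimensions, and the choice of age and visit features are assumptions made for the example.

# Hedged sketch of an EHR-style embedding layer (PyTorch). All names and
# dimensions are assumptions for illustration, not the HERBERT architecture.
import torch
import torch.nn as nn


class EHREmbedding(nn.Module):
    def __init__(self, vocab_size: int, max_age: int = 120,
                 max_visits: int = 512, hidden_size: int = 256):
        super().__init__()
        self.code_embed = nn.Embedding(vocab_size, hidden_size)   # clinical code ids
        self.age_embed = nn.Embedding(max_age + 1, hidden_size)   # patient age at entry
        self.visit_embed = nn.Embedding(max_visits, hidden_size)  # visit index in the record
        self.norm = nn.LayerNorm(hidden_size)
        self.dropout = nn.Dropout(0.1)

    def forward(self, codes, ages, visit_ids):
        # Each input is a (batch, sequence) tensor of integer ids.
        x = (self.code_embed(codes)
             + self.age_embed(ages)
             + self.visit_embed(visit_ids))
        return self.dropout(self.norm(x))


if __name__ == "__main__":
    emb = EHREmbedding(vocab_size=10_000)
    codes = torch.randint(0, 10_000, (2, 16))
    ages = torch.randint(40, 90, (2, 16))
    visits = torch.arange(16).repeat(2, 1)
    print(emb(codes, ages, visits).shape)  # torch.Size([2, 16, 256])

The design point the example is meant to convey is that EHR entries arrive at irregular intervals, so a purely positional encoding loses clinically relevant timing information; embedding the patient's age (or another time signal) alongside each code is one common way to restore it.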