A Transformer Architecture for Risk Analysis of Group Effects of Food Nutrients

A. N. Balandina, B. V. Gruzdev, N. A. Savelev, Y. S. Budakyan, S. I. Kisil, A. R. Bogdanov, E. A. Grachev

Moscow University Physics Bulletin, Vol. 79, No. 2 Supplement, pp. S828–S843. Published 22 March 2025.
DOI: 10.3103/S0027134924702291 (https://link.springer.com/article/10.3103/S0027134924702291)
Citations: 0
Abstract
In medicine, context is crucial for accurate patient diagnosis, as the same indicator can have different implications depending on its setting. Transformer architecture models have not yet been applied to analyze nutritional data in patient histories. These models offer significant advantages for biomedical analysis, such as accounting for global context, interpreting attention weights, and generating informative input vectors. The attention mechanism's ability to uncover multifactorial relationships can save physicians time by directing their attention to specific patterns identified by the neural network. This study adapted the transformer encoder to tabular data and applied it to classify metabolic disorders in patient histories. Applying the transformer architecture to tabular and dietary data shows great promise, yielding results that align with established medical findings while introducing new methods for using attention over, and vectorizing, such data.
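The paper itself does not include code, but the core mechanism it relies on, self-attention over per-feature embeddings of a tabular record, can be sketched in a few lines. In this illustrative NumPy sketch (all names, dimensions, and the identity projections are assumptions, not the authors' implementation), each tabular feature such as a nutrient indicator is represented by an embedding vector, and scaled dot-product attention produces a weight matrix whose rows show how strongly each feature attends to the others:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    """Scaled dot-product self-attention over feature embeddings.

    X: (n_features, d) matrix, one embedding row per tabular feature.
    Returns the attended output and the attention-weight matrix that
    can be inspected for multifactorial relationships between features.
    """
    d = X.shape[1]
    # In a trained model Q, K, V come from learned linear projections;
    # identity projections are used here purely for illustration.
    Q, K, V = X, X, X
    weights = softmax(Q @ K.T / np.sqrt(d), axis=-1)  # (n_features, n_features)
    return weights @ V, weights

rng = np.random.default_rng(0)
n_features, d = 5, 8  # e.g. five nutrient indicators embedded in 8 dimensions
X = rng.normal(size=(n_features, d))
out, attn = self_attention(X)
# each row of attn is a probability distribution over all features
print(attn.shape)                              # (5, 5)
print(np.allclose(attn.sum(axis=1), 1.0))      # True
```

In the interpretability use the abstract describes, it is this `attn` matrix, aggregated across heads and layers of the full encoder, that would be shown to a physician to highlight which feature combinations drove a classification.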
About the journal
Moscow University Physics Bulletin publishes original papers (reviews, articles, and brief communications) in the following fields of experimental and theoretical physics: theoretical and mathematical physics; physics of nuclei and elementary particles; radiophysics, electronics, and acoustics; optics and spectroscopy; laser physics; condensed matter physics; chemical physics, physical kinetics, and plasma physics; biophysics and medical physics; astronomy, astrophysics, and cosmology; and physics of the Earth, atmosphere, and hydrosphere.