S. Damani, D. Damani, Renisha Redij, Arshia K. Sethi, Pratyusha Muddaloor, Anoushka Kapoor, Anjali Rajagopal, K. Gopalakrishnan, X. J. Wang, V. Chedid, Alexander J. Ryu, Christopher A. Aakre, S. P. Arunachalam
{"title":"一种新型人工智能辅助计算机听诊声图传感装置的设计","authors":"S. Damani, D. Damani, Renisha Redij, Arshia K. Sethi, Pratyusha Muddaloor, Anoushka Kapoor, Anjali Rajagopal, K. Gopalakrishnan, X. J. Wang, V. Chedid, Alexander J. Ryu, Christopher A. Aakre, S. P. Arunachalam","doi":"10.1115/dmd2023-7977","DOIUrl":null,"url":null,"abstract":"\n Bowel sounds have been previously used to study intestinal motility and overall digestive health in various clinical settings. However, the blurred definition of bowel sounds and their subtypes, limited resources for interpretation, poor sensitivity, and low positive predictive value led to their restricted utility. Recent advances in artificial intelligence and machine learning have steered interest in developing unique tools using the phonoenterogram to analyze diverse bowel sounds. In our study, bowel sounds were recorded from eight healthy volunteers using the Eko Duo stethoscope. A novel deep-learning algorithm was designed to classify the recordings into baseline or prominent bowel sounds. A total of 11,210 data points (5,605 balanced sounds) were used to train and test the model to yield an accuracy of 0.895, a precision of 0.890, and a recall of 0.854 reflecting successful segregation of these sounds into respective groups. More extensive studies enrolling healthy and diseased subjects with a device specifically tailored to record bowel sounds are needed to generalize these results and determine their application in the patient population.","PeriodicalId":325836,"journal":{"name":"2023 Design of Medical Devices Conference","volume":"238 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-04-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"ON THE DESIGN OF A NOVEL PHONOENTEROGRAM SENSING DEVICE USING AI ASSISTED COMPUTER-AIDED AUSCULTATION\",\"authors\":\"S. Damani, D. Damani, Renisha Redij, Arshia K. Sethi, Pratyusha Muddaloor, Anoushka Kapoor, Anjali Rajagopal, K. Gopalakrishnan, X. J. Wang, V. Chedid, Alexander J. Ryu, Christopher A. Aakre, S. P. Arunachalam\",\"doi\":\"10.1115/dmd2023-7977\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"\\n Bowel sounds have been previously used to study intestinal motility and overall digestive health in various clinical settings. However, the blurred definition of bowel sounds and their subtypes, limited resources for interpretation, poor sensitivity, and low positive predictive value led to their restricted utility. Recent advances in artificial intelligence and machine learning have steered interest in developing unique tools using the phonoenterogram to analyze diverse bowel sounds. In our study, bowel sounds were recorded from eight healthy volunteers using the Eko Duo stethoscope. A novel deep-learning algorithm was designed to classify the recordings into baseline or prominent bowel sounds. A total of 11,210 data points (5,605 balanced sounds) were used to train and test the model to yield an accuracy of 0.895, a precision of 0.890, and a recall of 0.854 reflecting successful segregation of these sounds into respective groups. 
More extensive studies enrolling healthy and diseased subjects with a device specifically tailored to record bowel sounds are needed to generalize these results and determine their application in the patient population.\",\"PeriodicalId\":325836,\"journal\":{\"name\":\"2023 Design of Medical Devices Conference\",\"volume\":\"238 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-04-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2023 Design of Medical Devices Conference\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1115/dmd2023-7977\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 Design of Medical Devices Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1115/dmd2023-7977","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
ON THE DESIGN OF A NOVEL PHONOENTEROGRAM SENSING DEVICE USING AI ASSISTED COMPUTER-AIDED AUSCULTATION
Bowel sounds have previously been used to study intestinal motility and overall digestive health in various clinical settings. However, the blurred definition of bowel sounds and their subtypes, limited resources for interpretation, poor sensitivity, and low positive predictive value have restricted their utility. Recent advances in artificial intelligence and machine learning have spurred interest in developing tools that use the phonoenterogram to analyze diverse bowel sounds. In our study, bowel sounds were recorded from eight healthy volunteers using the Eko Duo stethoscope. A novel deep-learning algorithm was designed to classify the recordings into baseline or prominent bowel sounds. A total of 11,210 data points (5,605 balanced sounds) were used to train and test the model, yielding an accuracy of 0.895, a precision of 0.890, and a recall of 0.854, reflecting successful segregation of these sounds into their respective groups. More extensive studies enrolling healthy and diseased subjects, using a device specifically tailored to record bowel sounds, are needed to generalize these results and determine their application in the patient population.
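The abstract does not describe the authors' implementation, so the following is only a minimal sketch of how the reported evaluation metrics (accuracy, precision, recall) are conventionally computed for a binary classifier such as the baseline-vs-prominent bowel-sound model. The label convention (0 = baseline, 1 = prominent), the test-split size, and the placeholder predictions are assumptions for illustration, not details from the paper.

```python
# Minimal sketch (not the authors' code): computing accuracy, precision,
# and recall for a binary bowel-sound classifier, as reported in the abstract.
# Assumed label convention: 0 = baseline sound, 1 = prominent bowel sound.
import numpy as np
from sklearn.metrics import accuracy_score, precision_score, recall_score

rng = np.random.default_rng(0)

# Placeholder ground-truth labels and model predictions; in the study these
# would come from a held-out portion of the 11,210 labeled segments.
y_true = rng.integers(0, 2, size=2242)                          # hypothetical test split
y_pred = np.where(rng.random(2242) < 0.9, y_true, 1 - y_true)   # toy predictions

print(f"accuracy:  {accuracy_score(y_true, y_pred):.3f}")   # fraction of all segments classified correctly
print(f"precision: {precision_score(y_true, y_pred):.3f}")  # of segments flagged prominent, fraction truly prominent
print(f"recall:    {recall_score(y_true, y_pred):.3f}")     # of truly prominent segments, fraction flagged
```

Reporting precision and recall alongside accuracy is what supports the claim of successful segregation: on a balanced two-class set, accuracy alone could mask a model that over-predicts one class, whereas high precision and recall together indicate the prominent-sound class is both correctly flagged and rarely missed.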