Title: WCE with real time polyp detection and segmentation using deep neural networks
Authors: Wentong Wang, Guangdong Zhan, Junyi Wei, Li Song, Lin Feng
Venue: 2021 WRC Symposium on Advanced Robotics and Automation (WRC SARA)
Published: 2021-09-11
DOI: 10.1109/WRCSARA53879.2021.9612694
Abstract
Capsule robots are developing rapidly. They perform non-invasive gastrointestinal endoscopy on patients, and with the application of artificial intelligence, the images collected by a capsule can be read efficiently and accurately offline, which is very helpful for diagnosing the patient's condition. However, because previous capsules either lacked wireless data transmission or relied only on Bluetooth, real-time interaction was not possible; they supported only early diagnosis and could not be applied to real-time treatment. In other words, patients still needed to undergo invasive surgery after gastrointestinal capsule endoscopy. The capsule robot proposed in this paper transmits data over Wi-Fi and detects and locates lesions in real time using a deep neural network named SEG-YOLOv5. It provides real-time feedback during the endoscopy, which shows the potential for non-invasive treatment or surgery. The experimental results show that, under 10-fold cross-validation on the open-access Kvasir-SEG dataset, the model reaches a detection mAP@0.5 of 97.77% and a Dice coefficient of 85.03% for lesion segmentation.
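To make the segmentation metric concrete, the following is a minimal Python sketch of the Dice coefficient reported for lesion segmentation. It is a generic illustration of the metric, not the authors' SEG-YOLOv5 code; the function name, the epsilon term, and the toy masks are assumptions for demonstration.

import numpy as np

def dice_coefficient(pred_mask: np.ndarray, true_mask: np.ndarray, eps: float = 1e-7) -> float:
    """Dice coefficient between two binary segmentation masks.

    Dice = 2 * |P ∩ G| / (|P| + |G|), ranging from 0 (no overlap) to 1 (perfect overlap).
    """
    pred = pred_mask.astype(bool)
    true = true_mask.astype(bool)
    intersection = np.logical_and(pred, true).sum()
    return (2.0 * intersection) / (pred.sum() + true.sum() + eps)

# Toy example: two partially overlapping 4x4 masks (hypothetical values).
pred = np.zeros((4, 4), dtype=np.uint8)
true = np.zeros((4, 4), dtype=np.uint8)
pred[1:3, 1:4] = 1   # predicted lesion region (6 pixels)
true[1:3, 0:3] = 1   # ground-truth lesion region (6 pixels), 4 pixels overlap
print(f"Dice: {dice_coefficient(pred, true):.3f}")  # prints 0.667

A score of 85.03% therefore means the predicted lesion masks overlap heavily, though not perfectly, with the ground-truth annotations in Kvasir-SEG.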