Next-Generation Tactile Sensing and Machine Learning Integration for Robot-Assisted Minimally Invasive Surgery.

Impact Factor: 4.5 · CAS Tier 2 (Medicine) · JCR Q2 (Engineering, Biomedical)
Dema N Govalla, Anish S Naidu, Dhrubo Ahmad, Jerzy W Rozenblit
DOI: 10.1109/TBME.2025.3613757
Journal: IEEE Transactions on Biomedical Engineering, pp. 1718-1733
Publication date: 2026-05-01
Citations: 0

Abstract

Tactile feedback in robot-assisted minimally invasive surgery (RAMIS) is crucial for surgeons when palpating subsurface tumors and other organ structures. The research presented here introduces a new approach to tactile sensation generation that provides deformation and texture detection in RAMIS. The proposed solution comprises three phases: feature extraction, recognition, and feedback. Feature extraction is based on data acquired from two micro-electromechanical systems (MEMS) sensors and a force-sensitive resistor (FSR) sensor attached to an EndoWrist thoracic grasper instrument compatible with the da Vinci Surgical System. The acquired data are processed with digital signal processing methods and passed to the recognition phase, which uses the extracted features to train and test two machine learning algorithms: a Reflex Fuzzy Min-Max Neural Network (RFMN) and a Time Series Classification - Learning Shapelets (TSC-LS) method. These algorithms aim to recognize physiological structures of varying softness and roughness and assign each a corresponding deformation or texture label. Finally, the labeled data are relayed to the surgeon mechanically, via a visual-tactile display and a wearable device on the surgeon's forearm, to mimic palpation feedback during RAMIS.
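Fuzzy min-max networks such as the RFMN classify a feature vector by its membership in labeled hyperboxes and pick the most activated box. The abstract does not give the paper's exact membership function, so the sketch below uses the classic Simpson-style formulation as an illustration; the box corners, labels ("soft", "firm"), and the sensitivity parameter `gamma` are all assumptions, not values from the paper.

```python
import numpy as np

def hyperbox_membership(x, v, w, gamma=4.0):
    """Simpson-style fuzzy min-max membership of point x in the hyperbox
    with min corner v and max corner w (all 1-D arrays). Returns 1.0
    inside the box, decaying toward 0 outside; gamma sets the slope."""
    over = np.maximum(0.0, np.minimum(1.0, gamma * (x - w)))   # penalty for exceeding the max corner
    under = np.maximum(0.0, np.minimum(1.0, gamma * (v - x)))  # penalty for falling below the min corner
    per_dim = 1.0 - over - under
    return float(np.mean(per_dim))

def classify(x, boxes):
    """boxes: list of (v, w, label) hyperboxes. Return the label of the
    hyperbox with the highest membership for x."""
    score, label = max((hyperbox_membership(x, v, w), label)
                       for v, w, label in boxes)
    return label

# Illustrative 2-D feature space (e.g., softness vs. roughness features):
boxes = [(np.array([0.0, 0.0]), np.array([0.3, 0.3]), "soft"),
         (np.array([0.6, 0.6]), np.array([1.0, 1.0]), "firm")]
print(classify(np.array([0.1, 0.2]), boxes))  # → soft
```

Points inside a box score exactly 1.0, so training in fuzzy min-max networks amounts to expanding, contracting, and adding hyperboxes rather than tuning weights by gradient descent.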
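The TSC-LS method learns discriminative shapelets, short subsequences whose distance to a time series separates the classes. Its core primitive is the minimum distance between a shapelet and any equal-length window of the series; a shapelet transform then maps each series to a vector of such distances for a downstream classifier. A minimal NumPy sketch of that primitive, with illustrative data rather than the paper's sensor signals:

```python
import numpy as np

def shapelet_distance(series, shapelet):
    """Minimum Euclidean distance between a candidate shapelet and any
    equal-length sliding window of the series - the core primitive of
    shapelet-based time-series classification."""
    L = len(shapelet)
    # All contiguous windows of length L, as rows of a 2-D view.
    windows = np.lib.stride_tricks.sliding_window_view(series, L)
    return float(np.min(np.linalg.norm(windows - shapelet, axis=1)))

# A series containing an exact copy of the shapelet scores distance 0:
series = np.array([0.0, 0.0, 1.0, 2.0, 1.0, 0.0, 0.0])
print(shapelet_distance(series, np.array([1.0, 2.0, 1.0])))  # → 0.0
```

In the full learning-shapelets setting these distances are made differentiable (e.g., via a soft minimum) so the shapelets themselves can be optimized by gradient descent together with the classifier weights.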

Source journal
IEEE Transactions on Biomedical Engineering (Engineering, Biomedical)
CiteScore: 9.40
Self-citation rate: 4.30%
Annual publications: 880
Review time: 2.5 months
Journal description: IEEE Transactions on Biomedical Engineering contains basic and applied papers dealing with biomedical engineering. Papers range from engineering development of methods and techniques with biomedical applications to experimental and clinical investigations with engineering contributions.