A probabilistic neural network-based bimanual control method with multimodal haptic perception fusion

Impact Factor 6.2 · CAS Zone 2 (Engineering & Technology) · JCR Q1 (Engineering, Multidisciplinary)
Xinrui Chi, Zhanbin Guo, Fu Cheng
{"title":"A probabilistic neural network-based bimanual control method with multimodal haptic perception fusion","authors":"Xinrui Chi,&nbsp;Zhanbin Guo,&nbsp;Fu Cheng","doi":"10.1016/j.aej.2025.06.024","DOIUrl":null,"url":null,"abstract":"<div><div>In the master-slave robot system, single-modal tactile perception has problems such as collision detection delay (&gt;120 ms), force estimation error (&gt;2.3 N), and sensor conflicts, resulting in a 37 % failure rate of robot operations in nuclear decommissioning scenarios and a 19.2 % risk of excessive tissue compression in laparoscopic surgery. To address this, this paper proposes a multimodal tactile perception fusion control method based on a probabilistic neural network (PNN). Pressure, vibration, and temperature signals are synchronously collected through bionic artificial skin. A hierarchical heterogeneous feature alignment (HHFA) module is designed to solve the spatio-temporal asynchrony of multi-source signals (root mean square error &lt;0.8 ms), and a dynamic Bayesian fusion layer (DBFL) is developed to achieve adaptive weighting based on the entropy-variance coupling index, suppressing noise interference and modal conflicts. The dual-channel PNN encodes the fused sensory information into a Gaussian mixture model (8 components) and generates high-precision control instructions by maximizing the posterior probability. Experiments show that in grasping and fine operation tasks, the positioning error of this method is reduced to 0.15 mm, the operation success rate is increased by 19.6 % (reaching 96.4 %), and the signal-to-noise ratio remains stable at <span><math><mrow><mn>40.2</mn><mo>±</mo><mn>1.5</mn><mi>dB</mi></mrow></math></span> under humidity changes (30–90 %RH) and mechanical strain (15 %).</div></div>","PeriodicalId":7484,"journal":{"name":"alexandria engineering journal","volume":"127 ","pages":"Pages 892-919"},"PeriodicalIF":6.2000,"publicationDate":"2025-07-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"alexandria engineering journal","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1110016825007653","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, MULTIDISCIPLINARY","Score":null,"Total":0}
引用次数: 0

Abstract

In master-slave robot systems, single-modal tactile perception suffers from collision-detection delay (>120 ms), force-estimation error (>2.3 N), and sensor conflicts, leading to a 37% failure rate of robot operations in nuclear decommissioning scenarios and a 19.2% risk of excessive tissue compression in laparoscopic surgery. To address this, this paper proposes a multimodal tactile perception fusion control method based on a probabilistic neural network (PNN). Pressure, vibration, and temperature signals are collected synchronously through bionic artificial skin. A hierarchical heterogeneous feature alignment (HHFA) module is designed to resolve the spatio-temporal asynchrony of multi-source signals (root mean square error <0.8 ms), and a dynamic Bayesian fusion layer (DBFL) is developed to achieve adaptive weighting based on an entropy-variance coupling index, suppressing noise interference and modal conflicts. The dual-channel PNN encodes the fused sensory information into a Gaussian mixture model (8 components) and generates high-precision control instructions by maximizing the posterior probability. Experiments show that in grasping and fine-manipulation tasks, the positioning error of this method is reduced to 0.15 mm, the operation success rate is increased by 19.6% (reaching 96.4%), and the signal-to-noise ratio remains stable at 40.2 ± 1.5 dB under humidity changes (30–90% RH) and mechanical strain (15%).
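
The abstract outlines three computational stages: entropy-variance adaptive weighting in the DBFL, fusion of the weighted modality features, and posterior-probability maximization over an 8-component Gaussian mixture. The sketch below is an illustrative reconstruction, not the paper's implementation: the entropy-variance coupling formula, the inverse-index weight normalization, and the helper names (entropy_variance_weights, fuse) are assumptions, and scikit-learn's GaussianMixture stands in for the dual-channel PNN's mixture encoding.

```python
# Illustrative sketch of entropy-variance weighted fusion followed by
# MAP selection over an 8-component Gaussian mixture (assumed formulas).
import numpy as np
from sklearn.mixture import GaussianMixture

def entropy_variance_weights(modalities, eps=1e-8):
    """Adaptive weight per modality from an assumed entropy*variance coupling
    index: channels that are noisier or more ambiguous (high entropy, high
    variance) are down-weighted. `modalities` is a list of (T, d) arrays."""
    indices = []
    for x in modalities:
        # Shannon entropy of a 32-bin histogram of the raw features (assumption).
        hist, _ = np.histogram(x, bins=32)
        p = hist / (hist.sum() + eps)
        h = -np.sum(p * np.log(p + eps))
        indices.append(h * x.var())
    indices = np.asarray(indices)
    w = 1.0 / (indices + eps)   # inverse coupling index
    return w / w.sum()          # normalize weights to sum to 1

def fuse(modalities, weights):
    """Weighted concatenation of per-modality features along the feature axis."""
    return np.hstack([w * x for w, x in zip(weights, modalities)])

# Toy usage with synthetic pressure / vibration / temperature feature streams.
rng = np.random.default_rng(0)
pressure    = rng.normal(size=(500, 4))
vibration   = rng.normal(scale=2.0, size=(500, 4))   # deliberately noisier
temperature = rng.normal(scale=0.5, size=(500, 2))

weights = entropy_variance_weights([pressure, vibration, temperature])
fused = fuse([pressure, vibration, temperature], weights)

# 8-component Gaussian mixture over the fused features; the MAP component
# stands in for the control instruction selected by posterior maximization.
gmm = GaussianMixture(n_components=8, random_state=0).fit(fused)
posterior = gmm.predict_proba(fused[-1:])   # shape (1, 8)
command = int(np.argmax(posterior))
```

In this reading, the most probable mixture component plays the role of a discrete control instruction; the method described in the paper presumably maps the posterior to continuous master-slave control commands rather than a single index.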
Source journal
Alexandria Engineering Journal (Engineering: General Engineering)
CiteScore: 11.20
Self-citation rate: 4.40%
Annual article count: 1015
Review time: 43 days
Aims and scope: Alexandria Engineering Journal is an international journal devoted to publishing high-quality papers in the field of engineering and applied science. Alexandria Engineering Journal is cited in the Engineering Information Services (EIS) and the Chemical Abstracts (CA). The papers published in Alexandria Engineering Journal are grouped into five sections, according to the following classification:
• Mechanical, Production, Marine and Textile Engineering
• Electrical Engineering, Computer Science and Nuclear Engineering
• Civil and Architecture Engineering
• Chemical Engineering and Applied Sciences
• Environmental Engineering