Vision-Based Robotic Manipulation of Intelligent Wheelchair with Human-Computer Shared Control

Siyi Du, Fei Wang, Guilin Zhou, Jiaqi Li, Lintao Yang, Dongxu Wang
{"title":"基于视觉的人机共享控制智能轮椅机器人操作","authors":"Siyi Du, Fei Wang, Guilin Zhou, Jiaqi Li, Lintao Yang, Dongxu Wang","doi":"10.1109/CCDC52312.2021.9601850","DOIUrl":null,"url":null,"abstract":"Based on human-computer shared control, this paper introduces a novel robotic manipulation fashion combining computer vision and brain-computer interface (BCI). Designed for disabled groups, the intelligent wheelchair with our proposed method exhibits the precise robotic manipulation ability but also the human decision-making capabilities, which will bring better life quality for the disabled. The overall pipeline includes three parts: asynchronous brain-computer interface based on steady-state visual evoked potential (SSVEP), vision detection with deep network and robotic manipulation of UR5 robot. Particularly, first, the user receives the periodic visual stimulation with different frequencies and then electroencephalography (EEG) signals of the user are collected by EEG cap. Second, we preprocess the EEG signals and extract feature embedding. To judge the frequency of the stimulus signals received by the user, the canonical correlation analysis (CCA) algorithm is used to fit and compare it with the standard EEG signal. In our work, the signals with different frequencies corresponds to different types of objects item by item. Third, we apply the off-the-shelfvision detection algorithm, Mask-RCNN, to output the position of the object corresponding to the detected EEG in the image frame. UR5 robot arm plan manipulation path according to the position of objects transferred by robot operating system (ROS). 
Extensive experiments show that our method can achieves performance with more than 90% accuracy and the user can control the robot arm to grab the expected object accurately through BCI.","PeriodicalId":143976,"journal":{"name":"2021 33rd Chinese Control and Decision Conference (CCDC)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-05-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Vision-Based Robotic Manipulation of Intelligent Wheelchair with Human-Computer Shared Control\",\"authors\":\"Siyi Du, Fei Wang, Guilin Zhou, Jiaqi Li, Lintao Yang, Dongxu Wang\",\"doi\":\"10.1109/CCDC52312.2021.9601850\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Based on human-computer shared control, this paper introduces a novel robotic manipulation fashion combining computer vision and brain-computer interface (BCI). Designed for disabled groups, the intelligent wheelchair with our proposed method exhibits the precise robotic manipulation ability but also the human decision-making capabilities, which will bring better life quality for the disabled. The overall pipeline includes three parts: asynchronous brain-computer interface based on steady-state visual evoked potential (SSVEP), vision detection with deep network and robotic manipulation of UR5 robot. Particularly, first, the user receives the periodic visual stimulation with different frequencies and then electroencephalography (EEG) signals of the user are collected by EEG cap. Second, we preprocess the EEG signals and extract feature embedding. To judge the frequency of the stimulus signals received by the user, the canonical correlation analysis (CCA) algorithm is used to fit and compare it with the standard EEG signal. In our work, the signals with different frequencies corresponds to different types of objects item by item. 
Third, we apply the off-the-shelfvision detection algorithm, Mask-RCNN, to output the position of the object corresponding to the detected EEG in the image frame. UR5 robot arm plan manipulation path according to the position of objects transferred by robot operating system (ROS). Extensive experiments show that our method can achieves performance with more than 90% accuracy and the user can control the robot arm to grab the expected object accurately through BCI.\",\"PeriodicalId\":143976,\"journal\":{\"name\":\"2021 33rd Chinese Control and Decision Conference (CCDC)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-05-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 33rd Chinese Control and Decision Conference (CCDC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CCDC52312.2021.9601850\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 33rd Chinese Control and Decision Conference (CCDC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CCDC52312.2021.9601850","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Based on human-computer shared control, this paper introduces a novel robotic manipulation approach combining computer vision and a brain-computer interface (BCI). Designed for disabled users, an intelligent wheelchair equipped with our proposed method exhibits not only precise robotic manipulation ability but also human decision-making capability, which will improve quality of life for the disabled. The overall pipeline includes three parts: an asynchronous brain-computer interface based on steady-state visual evoked potentials (SSVEP), vision detection with a deep network, and robotic manipulation with a UR5 robot. Specifically, first, the user receives periodic visual stimulation at different frequencies, and electroencephalography (EEG) signals are collected from the user with an EEG cap. Second, we preprocess the EEG signals and extract feature embeddings. To identify the frequency of the stimulus the user attended to, the canonical correlation analysis (CCA) algorithm is used to fit the recorded signal and compare it against standard reference signals. In our work, signals at different frequencies correspond one-to-one with different types of objects. Third, we apply an off-the-shelf vision detection algorithm, Mask R-CNN, to output the position in the image frame of the object corresponding to the detected EEG frequency. The UR5 robot arm plans its manipulation path according to the object positions transferred via the Robot Operating System (ROS). Extensive experiments show that our method achieves more than 90% accuracy, and the user can control the robot arm through the BCI to grasp the intended object accurately.
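The CCA-based frequency identification step described above can be sketched as follows. This is a minimal illustration of the standard CCA approach to SSVEP classification, not the authors' implementation: for each candidate stimulus frequency, sine/cosine reference templates (fundamental plus harmonics) are built, the largest canonical correlation between the multi-channel EEG epoch and the templates is computed, and the frequency with the highest correlation is selected. All function names, the sampling rate, and the harmonic count are illustrative assumptions.

```python
import numpy as np

def max_canonical_corr(X, Y):
    """Largest canonical correlation between the column spaces of X and Y,
    computed via QR orthonormalization and the SVD."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(Xc)
    Qy, _ = np.linalg.qr(Yc)
    s = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
    return s[0]  # singular values of Qx^T Qy are the canonical correlations

def reference_signals(freq, fs, n_samples, n_harmonics=2):
    """Sin/cos reference templates at the stimulus frequency and its harmonics.
    Returns an (n_samples, 2 * n_harmonics) array."""
    t = np.arange(n_samples) / fs
    refs = []
    for h in range(1, n_harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.stack(refs, axis=1)

def classify_ssvep(eeg, fs, candidate_freqs):
    """Pick the candidate stimulus frequency whose reference templates
    correlate most strongly with the EEG epoch (n_samples, n_channels)."""
    scores = [max_canonical_corr(eeg, reference_signals(f, fs, eeg.shape[0]))
              for f in candidate_freqs]
    return candidate_freqs[int(np.argmax(scores))]
```

With a 2 s epoch at a 250 Hz sampling rate, an EEG segment dominated by a 10 Hz SSVEP response is scored against each candidate frequency and 10 Hz wins, which is the decision the pipeline then maps to an object class.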
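The handoff from BCI decision to vision could look like the sketch below: the detected frequency selects an object class, and the highest-scoring Mask R-CNN detection of that class yields a pixel centroid to pass on for grasp planning. The frequency-to-object mapping, the detection dictionary format, and all names here are hypothetical, since the paper's abstract does not specify them; a real system would feed this position to ROS for path planning.

```python
import numpy as np

# Hypothetical frequency -> object mapping; the paper states only that
# frequencies correspond one-to-one with object types.
FREQ_TO_OBJECT = {8.0: "cup", 10.0: "bottle", 12.0: "apple"}

def target_position(detected_freq, detections):
    """Return the pixel centroid (x, y) of the highest-scoring detection whose
    class matches the object the user selected via the BCI, or None.
    detections: list of dicts {"label": str, "score": float, "mask": bool array},
    an assumed format for Mask R-CNN outputs."""
    wanted = FREQ_TO_OBJECT[detected_freq]
    matches = [d for d in detections if d["label"] == wanted]
    if not matches:
        return None
    best = max(matches, key=lambda d: d["score"])
    ys, xs = np.nonzero(best["mask"])          # pixels inside the instance mask
    return float(xs.mean()), float(ys.mean())  # mask centroid in image frame
```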