Body area segmentation from visual scene based on predictability of neuro-dynamical system

H. Nobuta, Kenta Kawamoto, K. Noda, K. Sabe, S. Nishide, Hiroshi G. Okuno, T. Ogata
{"title":"Body area segmentation from visual scene based on predictability of neuro-dynamical system","authors":"H. Nobuta, Kenta Kawamoto, K. Noda, K. Sabe, S. Nishide, HIroshi G. Okuno, T. Ogata","doi":"10.1109/IJCNN.2012.6252530","DOIUrl":null,"url":null,"abstract":"We propose neural models for segmenting the area of a body from visual scene based on predictability. Neuroscience has shown that a prediction model in brain, which predicts sensory-feedback from motor command, can divide the sensory-feedback into the self-motion derived feedback and other derived feedback. The prediction model is important for prediction control of the body. Previous studies in robotics of the prediction model assumed that a robot can recognize the position of its body (e.g. its hand) and that the view contains only that body part. In our models, motor commands and visual feedback (pixel image that includes not only a hand but also object and background) are input into a neural network model and then the body area is segmented and prediction model of body is acquired. Our model contains two parts: 1) An object detection model obtains a conversion system between object positions and the pixel image. 2) A movement prediction model predicts hand-object positions from motor commands and identifies the body. We confirmed that our models can segment the body/object area based on their pixel textures and discriminate between them by using prediction error.","PeriodicalId":287844,"journal":{"name":"The 2012 International Joint Conference on Neural Networks (IJCNN)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2012-06-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"The 2012 International Joint Conference on Neural Networks (IJCNN)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.2012.6252530","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

We propose neural models for segmenting the body area from a visual scene based on predictability. Neuroscience has shown that a prediction model in the brain, which predicts sensory feedback from motor commands, can divide sensory feedback into feedback derived from self-motion and feedback derived from other sources. This prediction model is important for predictive control of the body. Previous robotics studies of such prediction models assumed that a robot can recognize the position of its body (e.g., its hand) and that the view contains only that body part. In our models, motor commands and visual feedback (a pixel image that includes not only the hand but also an object and the background) are fed into a neural network, the body area is segmented, and a prediction model of the body is acquired. Our model consists of two parts: 1) an object detection model that learns a mapping between object positions and the pixel image, and 2) a movement prediction model that predicts hand and object positions from motor commands and identifies the body. We confirmed that our models can segment the body/object areas based on their pixel textures and discriminate between them using the prediction error.
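To illustrate the prediction-error criterion described above, here is a minimal sketch, not the authors' implementation: a small network predicts the next hand/object positions from motor commands, and tracked regions whose motion is well predicted from the motor commands are labeled as body. The module names, layer sizes, position encoding (two regions with 2-D coordinates), and the thresholding rule are all assumptions made for illustration.

```python
import torch
import torch.nn as nn

class MovementPredictor(nn.Module):
    """Hypothetical movement prediction model: maps (motor command, current
    positions) to the positions expected at the next time step."""
    def __init__(self, n_motor=4, n_pos=4, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_motor + n_pos, hidden),
            nn.Tanh(),
            nn.Linear(hidden, n_pos),
        )

    def forward(self, motor, pos):
        # Concatenate motor command and current positions, predict next positions.
        return self.net(torch.cat([motor, pos], dim=-1))

def segment_body(predictor, motor, pos_t, pos_t1, threshold=0.05):
    """Label each tracked region as body (True) or non-body (False) according
    to whether its motion is predictable from the motor command.
    pos_t / pos_t1 hold two 2-D region positions per sample, i.e. shape [B, 4]."""
    with torch.no_grad():
        pred = predictor(motor, pos_t)                      # predicted positions at t+1
        err = ((pred - pos_t1) ** 2).view(pos_t1.shape[0], -1, 2).mean(dim=-1)
    return err < threshold                                  # low error -> self (body)
```

In this toy setting the predictor would be trained on recorded (motor command, position) sequences; hand regions end up with low prediction error because their motion follows the motor commands, while object and background regions do not, which is the discrimination the abstract refers to.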