How We See and Recognize Object Motion

S. Grossberg
Conscious Mind, Resonant Brain, published 2021-06-25.
DOI: 10.1093/oso/9780190070557.003.0008

Abstract

This chapter explains why visual motion perception is not just perception of the changing positions of moving objects. Computationally complementary processes process static objects with different orientations, and moving objects with different motion directions, via parallel cortical form and motion streams through V2 and MT. The motion stream pools multiple oriented object contours to estimate object motion direction. Such pooling coarsens estimates of object depth, which require precise matches of oriented stimuli from both eyes. Negative aftereffects of form and motion stimuli illustrate these complementary properties. Feature tracking signals begin to overcome directional ambiguities due to the aperture problem. Motion capture by short-range and long-range directional filters, together with competitive interactions, processes feature tracking and ambiguous motion directional signals to generate a coherent representation of object motion direction and speed. Many properties of motion perception are explained, notably the barberpole illusion and properties of long-range apparent motion, including how apparent motion speed varies with flash interstimulus interval, distance, and luminance; apparent motion of illusory contours; phi and beta motion; split motion; gamma motion; Ternus motion; Korte’s Laws; the line motion illusion; induced motion; motion transparency; the chopsticks illusion; Johansson motion; and Duncker motion. Gaussian waves of apparent motion clarify how tracking occurs, and explain spatial attention shifts through time. This motion processor helps to quantitatively simulate neurophysiological data about motion-based decision-making in monkeys when it inputs to a model of how the lateral intraparietal, or LIP, area chooses a movement direction from the motion direction estimate. Bayesian decision-making models cannot explain these data.
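The aperture problem mentioned in the abstract can be illustrated with a short sketch. This is a hypothetical illustration, not code from the chapter's model: a local motion detector viewing a straight contour through an aperture can measure only the velocity component normal to the contour, so distinct object motions that share the same normal component are locally indistinguishable. This is why the abstract emphasizes feature tracking signals (e.g., at line ends or corners), which carry unambiguous directional information, as the starting point for overcoming directional ambiguity.

```python
import math

def measured_normal_velocity(vx, vy, edge_angle_deg):
    """Velocity component a local detector can read through an aperture:
    the projection of the true 2-D velocity onto the contour's unit normal."""
    theta = math.radians(edge_angle_deg)
    nx, ny = math.sin(theta), -math.cos(theta)  # unit normal to the edge
    return vx * nx + vy * ny

# Two different true object motions...
v1 = (1.0, 0.0)  # moving rightward
v2 = (1.0, 2.0)  # moving up and to the right
# ...seen through an aperture on a vertical edge (90 degrees) give the
# SAME local reading, because they differ only along the edge's tangent:
m1 = measured_normal_velocity(*v1, 90.0)  # -> 1.0
m2 = measured_normal_velocity(*v2, 90.0)  # -> 1.0
```

Pooling many such ambiguous normal-velocity readings across contour orientations, and letting unambiguous feature-tracking signals propagate via short-range and long-range filters with competition, is the route the chapter describes for recovering a coherent object motion direction and speed.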