Xiaohan Duan, Ziya Yu, Li Tong, Linyuan Wang
2019 IEEE 7th International Conference on Bioinformatics and Computational Biology (ICBCB), March 2019
DOI: 10.1109/ICBCB.2019.8854642
Decoding Attention Position Based on Shifted Receptive Field in Visual Cortex
Visual attention is an important topic in both neuroscience and computer vision. Recent research in visual cognitive computation suggests that receptive fields shift under the influence of spatial attention. Traditional methods, however, decoded the position of attention under the assumption of constant population receptive field (pRF) parameters, so this recent finding offers a way to improve decoding accuracy. In this work, we propose a new decoding method that incorporates the shift of pRF parameters. First, we adopted a two-dimensional Gaussian receptive field model to characterize the pRF of each voxel in seven visual areas [V1-V4, inferior occipital gyrus (IOG), posterior fusiform gyrus (pFus), and mid-fusiform gyrus (mFus)]. We then introduced a parameter to measure the shift of each pRF. Using the shifted pRF parameters, the attention position was decoded by maximum likelihood estimation. On a published fMRI dataset, the method achieved better decoding accuracy in most regions, especially in higher visual areas. The results also indicated that, under the modulation of spatial attention, the pRF parameters of voxels in higher visual areas shifted much more than those in early visual areas.
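The decoding pipeline described above can be sketched in code: model each voxel's pRF as a 2D Gaussian, shift its center toward a candidate attention position, predict the response pattern, and pick the candidate that maximizes the likelihood of the observed responses. This is a minimal illustrative sketch, not the paper's exact formulation: the multiplicative `shift_gain` parameter, the Gaussian noise model, and all variable names are assumptions introduced here.

```python
import numpy as np

def gaussian_prf(x, y, x0, y0, sigma):
    """2D Gaussian pRF evaluated at stimulus location (x, y)."""
    return np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))

def predicted_responses(attn_pos, prf_centers, prf_sigmas, shift_gain, stim_pos):
    """Predict each voxel's response to a stimulus at stim_pos when attention
    is at attn_pos. Attention pulls each pRF center toward the attended
    position by the fraction shift_gain (a hypothetical stand-in for the
    paper's pRF-shift parameter)."""
    shifted = prf_centers + shift_gain * (attn_pos - prf_centers)
    return gaussian_prf(stim_pos[0], stim_pos[1],
                        shifted[:, 0], shifted[:, 1], prf_sigmas)

def decode_attention(responses, candidates, prf_centers, prf_sigmas,
                     shift_gain, stim_pos, noise_sd=1.0):
    """Maximum-likelihood decoding: return the candidate attention position
    whose predicted response pattern best matches the observed pattern,
    assuming independent Gaussian noise on each voxel."""
    log_liks = []
    for pos in candidates:
        pred = predicted_responses(pos, prf_centers, prf_sigmas,
                                   shift_gain, stim_pos)
        log_liks.append(-np.sum((responses - pred) ** 2) / (2 * noise_sd ** 2))
    return candidates[int(np.argmax(log_liks))]
```

In this sketch the maximum-likelihood step reduces to a least-squares match because the noise is modeled as isotropic Gaussian; with the shift-aware predictions, each candidate attention position yields a distinct response template, which is what makes the position decodable.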