{"title":"解码揭示早期视觉区域的视觉工作记忆内容","authors":"Stephenie A. Harrison, Frank Tong","doi":"10.1038/nature07832","DOIUrl":null,"url":null,"abstract":"Although we can hold several different items in working visual memory, how we remember specific details and visual features of individual objects remains a mystery. The neurons in the higher-order areas responsible for working memory seem to exhibit no selectivity for visual detail, and the early visual areas of the cerebral cortex are uniquely able to process incoming visual signals from the eye but, it was thought, not to perform higher cognitive functions such as memory. Using a new technique for decoding data from functional magnetic resonance imaging (fMRI), Stephanie Harrison and Frank Tong have found that early visual areas can retain specific information about features held in working memory. Volunteers were shown two striped patterns at different orientations and asked to memorize one of the orientations whilst being scanned by fMRI. From analysis of the scans it was possible to predict which of the two orientation patterns a subject was being retained in over 80% of tests. This study shows that early visual areas can retain specific information about features held in working memory even when there is no physical stimulus present. Using functional magnetic resonance imaging decoding methods, visual features could be predicted from early visual area activity with a high degree of accuracy. Visual working memory provides an essential link between perception and higher cognitive functions, allowing for the active maintenance of information about stimuli no longer in view1,2. Research suggests that sustained activity in higher-order prefrontal, parietal, inferotemporal and lateral occipital areas supports visual maintenance3,4,5,6,7,8,9,10,11, and may account for the limited capacity of working memory to hold up to 3–4 items9,10,11. 
Because higher-order areas lack the visual selectivity of early sensory areas, it has remained unclear how observers can remember specific visual features, such as the precise orientation of a grating, with minimal decay in performance over delays of many seconds12. One proposal is that sensory areas serve to maintain fine-tuned feature information13, but early visual areas show little to no sustained activity over prolonged delays14,15,16. Here we show that orientations held in working memory can be decoded from activity patterns in the human visual cortex, even when overall levels of activity are low. Using functional magnetic resonance imaging and pattern classification methods, we found that activity patterns in visual areas V1–V4 could predict which of two oriented gratings was held in memory with mean accuracy levels upwards of 80%, even in participants whose activity fell to baseline levels after a prolonged delay. These orientation-selective activity patterns were sustained throughout the delay period, evident in individual visual areas, and similar to the responses evoked by unattended, task-irrelevant gratings. Our results demonstrate that early visual areas can retain specific information about visual features held in working memory, over periods of many seconds when no physical stimulus is present.","PeriodicalId":50,"journal":{"name":"Langmuir","volume":null,"pages":null},"PeriodicalIF":3.7000,"publicationDate":"2009-02-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1038/nature07832","citationCount":"1093","resultStr":"{\"title\":\"Decoding reveals the contents of visual working memory in early visual areas\",\"authors\":\"Stephenie A. 
Harrison, Frank Tong\",\"doi\":\"10.1038/nature07832\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Although we can hold several different items in working visual memory, how we remember specific details and visual features of individual objects remains a mystery. The neurons in the higher-order areas responsible for working memory seem to exhibit no selectivity for visual detail, and the early visual areas of the cerebral cortex are uniquely able to process incoming visual signals from the eye but, it was thought, not to perform higher cognitive functions such as memory. Using a new technique for decoding data from functional magnetic resonance imaging (fMRI), Stephanie Harrison and Frank Tong have found that early visual areas can retain specific information about features held in working memory. Volunteers were shown two striped patterns at different orientations and asked to memorize one of the orientations whilst being scanned by fMRI. From analysis of the scans it was possible to predict which of the two orientation patterns a subject was being retained in over 80% of tests. This study shows that early visual areas can retain specific information about features held in working memory even when there is no physical stimulus present. Using functional magnetic resonance imaging decoding methods, visual features could be predicted from early visual area activity with a high degree of accuracy. Visual working memory provides an essential link between perception and higher cognitive functions, allowing for the active maintenance of information about stimuli no longer in view1,2. Research suggests that sustained activity in higher-order prefrontal, parietal, inferotemporal and lateral occipital areas supports visual maintenance3,4,5,6,7,8,9,10,11, and may account for the limited capacity of working memory to hold up to 3–4 items9,10,11. 
Because higher-order areas lack the visual selectivity of early sensory areas, it has remained unclear how observers can remember specific visual features, such as the precise orientation of a grating, with minimal decay in performance over delays of many seconds12. One proposal is that sensory areas serve to maintain fine-tuned feature information13, but early visual areas show little to no sustained activity over prolonged delays14,15,16. Here we show that orientations held in working memory can be decoded from activity patterns in the human visual cortex, even when overall levels of activity are low. Using functional magnetic resonance imaging and pattern classification methods, we found that activity patterns in visual areas V1–V4 could predict which of two oriented gratings was held in memory with mean accuracy levels upwards of 80%, even in participants whose activity fell to baseline levels after a prolonged delay. These orientation-selective activity patterns were sustained throughout the delay period, evident in individual visual areas, and similar to the responses evoked by unattended, task-irrelevant gratings. 
Our results demonstrate that early visual areas can retain specific information about visual features held in working memory, over periods of many seconds when no physical stimulus is present.\",\"PeriodicalId\":50,\"journal\":{\"name\":\"Langmuir\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":3.7000,\"publicationDate\":\"2009-02-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://sci-hub-pdf.com/10.1038/nature07832\",\"citationCount\":\"1093\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Langmuir\",\"FirstCategoryId\":\"103\",\"ListUrlMain\":\"https://www.nature.com/articles/nature07832\",\"RegionNum\":2,\"RegionCategory\":\"化学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"CHEMISTRY, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Langmuir","FirstCategoryId":"103","ListUrlMain":"https://www.nature.com/articles/nature07832","RegionNum":2,"RegionCategory":"化学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"CHEMISTRY, MULTIDISCIPLINARY","Score":null,"Total":0}
Decoding reveals the contents of visual working memory in early visual areas
Although we can hold several different items in visual working memory, how we remember the specific details and visual features of individual objects remains a mystery. The neurons in the higher-order areas responsible for working memory seem to exhibit no selectivity for visual detail, and the early visual areas of the cerebral cortex were thought to be uniquely able to process incoming visual signals from the eye, but not to perform higher cognitive functions such as memory. Using a new technique for decoding data from functional magnetic resonance imaging (fMRI), Stephenie Harrison and Frank Tong have found that early visual areas can retain specific information about features held in working memory. Volunteers were shown two striped patterns at different orientations and asked to memorize one of the orientations while being scanned by fMRI. From analysis of the scans it was possible to predict which of the two orientations a subject was retaining in over 80% of tests. This study shows that early visual areas can retain specific information about features held in working memory even when no physical stimulus is present. Using fMRI decoding methods, visual features could be predicted from early visual area activity with a high degree of accuracy. Visual working memory provides an essential link between perception and higher cognitive functions, allowing for the active maintenance of information about stimuli no longer in view [1, 2]. Research suggests that sustained activity in higher-order prefrontal, parietal, inferotemporal and lateral occipital areas supports visual maintenance [3–11], and may account for the limited capacity of working memory to hold up to 3–4 items [9–11].
Because higher-order areas lack the visual selectivity of early sensory areas, it has remained unclear how observers can remember specific visual features, such as the precise orientation of a grating, with minimal decay in performance over delays of many seconds [12]. One proposal is that sensory areas serve to maintain fine-tuned feature information [13], but early visual areas show little to no sustained activity over prolonged delays [14–16]. Here we show that orientations held in working memory can be decoded from activity patterns in the human visual cortex, even when overall levels of activity are low. Using functional magnetic resonance imaging and pattern classification methods, we found that activity patterns in visual areas V1–V4 could predict which of two oriented gratings was held in memory with mean accuracy levels upwards of 80%, even in participants whose activity fell to baseline levels after a prolonged delay. These orientation-selective activity patterns were sustained throughout the delay period, evident in individual visual areas, and similar to the responses evoked by unattended, task-irrelevant gratings. Our results demonstrate that early visual areas can retain specific information about visual features held in working memory, over periods of many seconds when no physical stimulus is present.
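The core idea of multi-voxel pattern classification — predicting which of two remembered orientations a pattern of delay-period activity corresponds to, even when the mean signal is near baseline — can be illustrated with a toy sketch. This is not the authors' analysis pipeline; synthetic "voxel" patterns and a simple leave-one-trial-out nearest-centroid classifier stand in for real fMRI data and their trained classifiers, and all variable names and numbers here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: 100 "voxels", two remembered orientations, 40 delay-period
# activity patterns per orientation. Each orientation weakly biases voxels in a
# different fixed direction, mimicking a low-amplitude orientation-selective signal
# buried in noise (so single-voxel or mean activity is uninformative).
n_voxels, n_trials = 100, 40
signal_a = rng.normal(0, 1, n_voxels)   # pattern bias for orientation A
signal_b = rng.normal(0, 1, n_voxels)   # pattern bias for orientation B
X_a = signal_a + rng.normal(0, 4, (n_trials, n_voxels))  # noisy trials, low SNR
X_b = signal_b + rng.normal(0, 4, (n_trials, n_voxels))
X = np.vstack([X_a, X_b])
y = np.array([0] * n_trials + [1] * n_trials)

def nearest_centroid_cv(X, y):
    """Leave-one-trial-out cross-validated nearest-centroid decoding accuracy."""
    correct = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i              # hold out trial i
        c0 = X[mask & (y == 0)].mean(axis=0)        # class centroids from the rest
        c1 = X[mask & (y == 1)].mean(axis=0)
        pred = 0 if np.linalg.norm(X[i] - c0) < np.linalg.norm(X[i] - c1) else 1
        correct += pred == y[i]
    return correct / len(y)

acc = nearest_centroid_cv(X, y)
print(f"decoding accuracy: {acc:.2f}")  # well above the 0.5 chance level
```

The point of the sketch is that classification pools weak, distributed biases across many voxels: no single voxel discriminates the two conditions reliably, yet the pattern as a whole does, which is how orientation information can be read out even when overall activity levels are low.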