The relational bottleneck as an inductive bias for efficient abstraction

Taylor W. Webb, Steven M. Frankland, Awni Altabaa, Simon Segert, Kamesh Krishnamurthy, Declan Campbell, Jacob Russin, Tyler Giallanza, Randall O'Reilly, John Lafferty, Jonathan D. Cohen

Trends in Cognitive Sciences, pages 829-843. Published 2024-09-01 (Epub 2024-05-09). DOI: 10.1016/j.tics.2024.04.001
Citations: 0
Abstract
A central challenge for cognitive science is to explain how abstract concepts are acquired from limited experience. This has often been framed in terms of a dichotomy between connectionist and symbolic cognitive models. Here, we highlight a recently emerging line of work that suggests a novel reconciliation of these approaches, by exploiting an inductive bias that we term the relational bottleneck. In that approach, neural networks are constrained via their architecture to focus on relations between perceptual inputs, rather than the attributes of individual inputs. We review a family of models that employ this approach to induce abstractions in a data-efficient manner, emphasizing their potential as candidate models for the acquisition of abstract concepts in the human mind and brain.
About the journal:
Essential reading for those working directly in the cognitive sciences or in related specialist areas, Trends in Cognitive Sciences provides an instant overview of current thinking for scientists, students and teachers who want to keep up with the latest developments in the cognitive sciences. The journal brings together research in psychology, artificial intelligence, linguistics, philosophy, computer science and neuroscience. Trends in Cognitive Sciences provides a platform for the interaction of these disciplines and the evolution of cognitive science as an independent field of study.