Random features Hopfield networks generalize retrieval to previously unseen examples
Silvio Kalaj, Clarissa Lauditi, Gabriele Perugini, Carlo Lucibello, Enrico M. Malatesta, Matteo Negri
Physica A: Statistical Mechanics and its Applications, Volume 678, Article 130946 (published 3 September 2025). DOI: 10.1016/j.physa.2025.130946
Citations: 0
Abstract
It has been recently shown that a feature-learning transition occurs when a Hopfield network stores examples generated as superpositions of random features: new attractors corresponding to those features appear in the model. In this work we reveal that the network also develops attractors corresponding to previously unseen examples generated as mixtures of the same set of features. We explain this surprising behavior in terms of spurious states of the learned features: as the number of stored examples increases beyond the feature-learning transition, the model also learns to mix the features and thereby represent both stored and previously unseen examples. We support this claim by computing the phase diagram of the model and matching the numerical results with the spinodal lines of the mixed spurious states.
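To make the setup described in the abstract concrete, the following is a minimal sketch of a random-features Hopfield network: examples are built as superpositions of a fixed set of random features, stored with the standard Hebbian rule, and then an unseen mixture of the same features is used as a probe. The binary ±1 features, sparse ±1 mixing coefficients, zero-temperature asynchronous dynamics, and the sizes N, D, P, s are all illustrative assumptions, not the paper's exact generation rule or parameter regime.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not taken from the paper).
N = 500   # number of spins
D = 20    # number of random features
P = 200   # number of stored examples
s = 5     # features mixed per example (odd, so the sign is never zero)

# Random binary features in {-1, +1}^N.
features = rng.choice([-1, 1], size=(D, N))

def make_example(features, s, rng):
    """One example: sign of a sparse +/-1 superposition of s random features."""
    idx = rng.choice(features.shape[0], size=s, replace=False)
    coeffs = rng.choice([-1, 1], size=s)
    return np.sign(coeffs @ features[idx])

examples = np.array([make_example(features, s, rng) for _ in range(P)])

# Hebbian storage of the examples (standard Hopfield rule).
J = examples.T @ examples / N
np.fill_diagonal(J, 0.0)

def retrieve(J, state, sweeps=50):
    """Zero-temperature asynchronous dynamics for a fixed number of sweeps."""
    state = state.copy()
    n = len(state)
    for _ in range(sweeps):
        for i in rng.permutation(n):
            h = J[i] @ state
            state[i] = 1 if h >= 0 else -1
    return state

# Probe with a previously unseen mixture of the same features and measure
# the overlap between the probe and the fixed point it converges to.
unseen = make_example(features, s, rng)
fixed_point = retrieve(J, unseen)
overlap = fixed_point @ unseen / N
print(f"overlap with unseen example: {overlap:.3f}")
```

In the regime studied by the paper, the claim is that such unseen mixtures are themselves retrieved, i.e. they sit close to attractors of the learned network once enough examples have been stored; the numbers above are placeholders meant only to show the pipeline.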
Journal description
Physica A: Statistical Mechanics and its Applications
Recognized by the European Physical Society
Physica A publishes research in the field of statistical mechanics and its applications.
Statistical mechanics sets out to explain the behaviour of macroscopic systems by studying the statistical properties of their microscopic constituents.
Applications of the techniques of statistical mechanics are widespread, and include: applications to physical systems such as solids, liquids and gases; applications to chemical and biological systems (colloids, interfaces, complex fluids, polymers and biopolymers, cell physics); and other interdisciplinary applications to, for instance, biological, economic and sociological systems.