Deep Equilibria: Existence and Computability
Samson Alva, Eduardo Dueñez, Jose Iovino, Claire Walton
arXiv:2409.06064 (arXiv - MATH - Logic, 2024-09-09)
We introduce the general concept of a layered computation model, of which neural networks are a particular example, and combine tools of topological dynamics and model theory to study the asymptotics of such models. We prove that, as the number of layers of a computation grows, the computation reaches a state of "deep equilibrium" which amounts to a single, self-referential layer. After proving the existence of deep equilibria under fairly general hypotheses, we characterize their computability.
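The central phenomenon, a deep stack of identical layers collapsing to one self-referential layer, can be illustrated with a minimal numerical sketch. This is our own toy illustration, not the paper's construction: we take the "layer" to be a contractive affine map, so that iterating it converges to the fixed point satisfying z* = f(z*), i.e. the equilibrium of a single layer applied to its own output.

```python
import numpy as np

# Toy illustration (an assumption for this sketch; the paper's layered
# computation models are far more general): a "layer" is a contractive
# affine map f(z) = W z + b. Composing the same layer many times, the
# output approaches the fixed point z* with z* = f(z*) -- a "deep
# equilibrium": one self-referential layer replacing a deep stack.

rng = np.random.default_rng(0)
n = 4
W = rng.standard_normal((n, n))
W = 0.5 * W / np.linalg.norm(W, 2)  # spectral norm 0.5 => contraction
b = rng.standard_normal(n)

def layer(z):
    return W @ z + b

# "Deep" computation: apply the same layer many times.
z = np.zeros(n)
for _ in range(200):
    z = layer(z)

# Single self-referential layer: solve z* = W z* + b directly.
z_star = np.linalg.solve(np.eye(n) - W, b)

print(np.allclose(z, z_star))
```

Because the map is a contraction, the Banach fixed-point theorem guarantees the iterates converge to the unique equilibrium, so the deep stack and the directly solved self-referential layer agree.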