{"title":"Is there a general structure for grammars?","authors":"D. Mumford","doi":"10.1109/CVPRW.2009.5204334","DOIUrl":null,"url":null,"abstract":"Summary form only given. Linguists have proposed dozens of formalisms for grammars and now vision is weighing in with its versions based on its needs. Ulf Grenander has proposed general pattern theory, and has used grammar-like graphical parses of \"thoughts\" in the style of AI. One wants a natural, simple formalism treating all these cases. I want to pose this as a central problem in modeling intelligence. Pattern theory started in the 70's with the ideas of Ulf Grenander and his school at Brown. The aim is to analyze from a statistical point of view the patterns in all \"signals\" generated by the world, whether they be images, sounds, written text, DNA or protein strings, spike trains in neurons, time series of prices or weather, etc. Pattern theory proposes that the types of patterns-and the hidden variables needed to describe these patterns - found in one class of signals will often be found in the others and that their characteristic variability will be similar. The underlying idea is to find classes of stochastic models which can capture all the patterns that we see in nature, so that random samples from these models have the same \"look and feel\" as the samples from the world itself. Then the detection of patterns in noisy and ambiguous samples can be achieved by the use of Bayes' rule, a method that can be described as \"analysis by synthesis\".","PeriodicalId":431981,"journal":{"name":"2009 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops","volume":"119 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2009-06-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2009 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CVPRW.2009.5204334","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Summary form only given. Linguists have proposed dozens of formalisms for grammars, and computer vision is now weighing in with its own versions, shaped by its needs. Ulf Grenander has proposed general pattern theory and has used grammar-like graphical parses of "thoughts" in the style of AI. One wants a natural, simple formalism treating all these cases. I want to pose this as a central problem in modeling intelligence. Pattern theory started in the 1970s with the ideas of Ulf Grenander and his school at Brown. The aim is to analyze, from a statistical point of view, the patterns in all "signals" generated by the world, whether they be images, sounds, written text, DNA or protein strings, spike trains in neurons, time series of prices or weather, etc. Pattern theory proposes that the types of patterns, and the hidden variables needed to describe these patterns, found in one class of signals will often be found in the others, and that their characteristic variability will be similar. The underlying idea is to find classes of stochastic models which can capture all the patterns that we see in nature, so that random samples from these models have the same "look and feel" as samples from the world itself. Then the detection of patterns in noisy and ambiguous samples can be achieved by the use of Bayes' rule, a method that can be described as "analysis by synthesis".
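As a minimal illustration of the "analysis by synthesis" idea mentioned above (not taken from the talk itself), the sketch below assumes a toy generative model: a hidden class selects one of two prototype patterns, a noisy signal is synthesized from it, and Bayes' rule then recovers a posterior over the hidden class from the observation. All names here (prototypes, noise_sigma, the Gaussian noise model) are illustrative assumptions, not part of Grenander's or Mumford's formalism.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy generative model: a hidden class picks one of two prototype patterns.
prototypes = np.array([[1.0, 1.0, 0.0, 0.0],
                       [0.0, 0.0, 1.0, 1.0]])
prior = np.array([0.5, 0.5])   # P(hidden class)
noise_sigma = 0.5              # standard deviation of observation noise

# Synthesis: sample a hidden class, then a noisy signal from its prototype.
h_true = rng.choice(2, p=prior)
signal = prototypes[h_true] + noise_sigma * rng.normal(size=4)

# Analysis: Bayes' rule, P(h | signal) proportional to P(signal | h) * P(h),
# with a Gaussian likelihood matching the synthesis model above.
log_lik = -0.5 * np.sum((signal - prototypes) ** 2, axis=1) / noise_sigma**2
log_post = log_lik + np.log(prior)
posterior = np.exp(log_post - log_post.max())
posterior /= posterior.sum()

print("true hidden class:", h_true)
print("posterior over hidden classes:", posterior)
```

The point of the sketch is only that the same stochastic model serves both directions: sampling from it produces signals with the right "look and feel", and inverting it with Bayes' rule detects the hidden pattern in a noisy, ambiguous sample.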