{"title":"Function estimation via wavelets for data with long-range dependence","authors":"Y. Wang","doi":"10.1109/WITS.1994.513927","DOIUrl":"https://doi.org/10.1109/WITS.1994.513927","url":null,"abstract":"Traditionally, processes with long-range dependence have been mathematically awkward to manipulate. This has made the solution of many of the classical signal processing problems involving these processes rather difficult. For a fractional Gaussian noise model, we derive asymptotics for minimax risks and show that wavelet estimates can achieve minimax over a wide range of spaces. This article also establishes a wavelet-vaguelette decomposition (WVD) to decorrelate fractional Gaussian noise.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"96 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127143779","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Bayes risk-weighted vector quantization","authors":"R. Gray","doi":"10.1109/WITS.1994.513847","DOIUrl":"https://doi.org/10.1109/WITS.1994.513847","url":null,"abstract":"Lossy compression and classification algorithms both attempt to reduce a large collection of possible observations into a few representative categories so as to preserve essential information. A framework for combining classification and compression into one or two quantizers is described along with some examples and related to other quantizer-based classification schemes.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"27 4","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120983536","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"When is the weak rate equal to the strong rate?","authors":"P. Shields","doi":"10.1109/WITS.1994.513858","DOIUrl":"https://doi.org/10.1109/WITS.1994.513858","url":null,"abstract":"A condition on a class of processes guaranteeing that the weak redundancy rate has the same asymptotic order of magnitude as the strong redundancy rate will be discussed.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"146 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126337338","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Coding for distributed computation","authors":"L. Schulman","doi":"10.1109/WITS.1994.513866","DOIUrl":"https://doi.org/10.1109/WITS.1994.513866","url":null,"abstract":"Summary form only given. The author describes analogous coding theorems for the more general, interactive, communications required in computation. In this case the bits transmitted in the protocol are not known to the processors in advance but are determined dynamically. First he shows that any interactive protocol of length T between two processors connected by a noiseless channel can be simulated, if the channel is noisy (a binary symmetric channel of capacity C), in time proportional to T 1/C, and with error probability exponentially small in T. He then shows that this result can be extended to arbitrary distributed network protocols. He shows that any distributed protocol which runs in time T on a network of degree d having noiseless communication channels, can, if the channels are in fact noisy, be simulated on that network in time proportional to T 1/C log d. The probability of failure of the protocol is exponentially small in T.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121939799","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Asymptotically optimal model selection and neural nets","authors":"A. Barron","doi":"10.1109/WITS.1994.513871","DOIUrl":"https://doi.org/10.1109/WITS.1994.513871","url":null,"abstract":"A minimum description length criterion for inference of functions in both parametric and nonparametric settings is determined. By adapting the parameter precision, a description length criterion can take on the form log(likelihood)+const/spl middot/m instead of the familiar -log(likelihood)+(m/2)log n where m is the number of parameters and n is the sample size. For certain regular models the criterion yields asymptotically optimal rates for coding redundancy and statistical risk. Moreover, the convergence is adaptive in the sense that the rates are simultaneously minimax optimal in various parametric and nonparametric function classes without prior knowledge of which function class contains the true function. This one criterion combines positive benefits of information-theoretic criteria proposed by Rissanen, Akaike, and Schwarz. A reviewed is also includes of how the minimum description length principle provides accurate estimates in irregular models such as neural nets.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"93 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122314647","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Neural networks for error correction of Hamming codes","authors":"O. Mayora-Ibarra, A. González-Gutiérrez, J. Ruiz-Suárez","doi":"10.1109/WITS.1994.513921","DOIUrl":"https://doi.org/10.1109/WITS.1994.513921","url":null,"abstract":"A comparative analysis of three neural network models: backpropagation (BPP), bidirectional associative memory (BAM) and holographic associative memory (HAM); and a classical method for error-correction is presented. Each method is briefly described, results are reported and finally some advantages are concluded.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"47 9","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120922710","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An asymptotic property of model selection criteria","authors":"Yuhong Yang, A. Barron","doi":"10.1109/WITS.1994.513930","DOIUrl":"https://doi.org/10.1109/WITS.1994.513930","url":null,"abstract":"Probability models are estimated by use of penalized likelihood criteria related to the Akaike (1972) information criteria (AIC) and the minimum description length (MDL). The asymptotic risk of the density estimator is determined, under conditions on the penalty term, and is shown to be minimax optimal. As an application, we show that the optimal rate of convergence is achieved for the density in certain smooth nonparametric families without knowing the smooth parameters in advance.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130747050","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Sample path description of Gauss Markov random fields","authors":"S. Goswami, José M. F. Moura","doi":"10.1109/WITS.1994.513893","DOIUrl":"https://doi.org/10.1109/WITS.1994.513893","url":null,"abstract":"We provide a characterization of Gauss Markov random fields in terms of partial differential equations with random forcing term. Our method consists of obtaining a concrete representation of an abstract stochastic partial differential equation using some results from the theory of vector measures.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"739 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115131825","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Selection of best bases for classification and regression","authors":"R. Coifman, N. Saito","doi":"10.1109/WITS.1994.513882","DOIUrl":"https://doi.org/10.1109/WITS.1994.513882","url":null,"abstract":"We describe extensions to the \"best-basis\" method to select orthonormal bases suitable for signal classification (or regression) problems from a collection of orthonormal bases using the relative entropy (or regression errors). Once these bases are selected, the most significant coordinates are fed into a traditional classifier (or regression method) such as linear discriminant analysis (LDA) or a classification and regression tree (CART). The performance of these statistical methods is enhanced since the proposed methods reduce the dimensionality of the problems by using the basis functions which are well-localized in the time-frequency plane as feature extractors.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129186285","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Multiresolution models for random fields and their use in statistical image processing","authors":"H. Krim, A. Willsky, W. Karl","doi":"10.1109/WITS.1994.513887","DOIUrl":"https://doi.org/10.1109/WITS.1994.513887","url":null,"abstract":"We describe a probabilistic framework for optimal multiresolution processing and analysis of spatial phenomena. Our developed multiresolution (MR) models are useful in describing random processes and fields. The scale recursive nature of the resulting models, leads to extremely efficient algorithms for optimal estimation and likelihood calculation. These models, which are described, have also provided a framework for data fusion, and produced new solutions to problems in computer vision (optical flow estimation), remote sensing (oceanography where dimensional complexity is in thousands), and various inverse problems of mathematical physics.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127729074","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}