The effect of distortion on the MDL model
Yoram Gronich, R. Zamir
Proceedings 2001 IEEE Information Theory Workshop (Cat. No.01EX494), September 2, 2001
DOI: 10.1109/ITW.2001.955169
We investigate the consequences of lossy compression, i.e., description with distortion, on model selection under the minimum description length (MDL) criterion. Our basic observation is that for a finite data sequence and sufficiently large distortion, a two-stage universal lossy encoder tends to underestimate the model order of the source. We demonstrate this property by examining the behavior of a two-stage universal lossy encoder, based on pre/post-filtered entropy-coded dithered quantization, over some parametric classes of stationary Gaussian sources.
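The flavor of this effect can be sketched numerically. The toy experiment below is an assumption-laden illustration, not the paper's construction: it uses plain least-squares AR fits with the two-stage MDL score (data cost plus a (1/2) log n penalty per parameter), and it models distortion as additive white noise rather than the pre/post-filtered entropy-coded dithered quantizer studied in the paper. The source parameters, sample size, and noise level are all arbitrary choices for illustration.

```python
import numpy as np

def fit_ar_rss(x, k):
    """Least-squares AR(k) fit; return the residual sum of squares."""
    n = len(x)
    if k == 0:
        return float(np.sum((x - x.mean()) ** 2))
    # Design matrix: column j holds x[t-1-j], aligned with target y[t] = x[t].
    X = np.column_stack([x[k - 1 - j : n - 1 - j] for j in range(k)])
    y = x[k:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return float(np.sum(resid ** 2))

def mdl_order(x, kmax=6):
    """Pick the AR order minimizing the two-stage MDL score:
    (n/2) log(RSS/n) for the data plus (k/2) log n for the parameters."""
    n = len(x)
    scores = [0.5 * n * np.log(fit_ar_rss(x, k) / n) + 0.5 * k * np.log(n)
              for k in range(kmax + 1)]
    return int(np.argmin(scores))

# Stationary Gaussian AR(2) source (coefficients chosen arbitrarily).
rng = np.random.default_rng(0)
n, burn = 2000, 100
e = rng.standard_normal(n + burn)
x = np.zeros(n + burn)
for t in range(2, n + burn):
    x[t] = 1.2 * x[t - 1] - 0.6 * x[t - 2] + e[t]
x = x[burn:]

# Heavy additive "distortion": noise variance 100x the source variance.
noisy = x + 10.0 * x.std() * rng.standard_normal(n)

order_clean = mdl_order(x)
order_noisy = mdl_order(noisy)
print(f"MDL order, clean data:    {order_clean}")
print(f"MDL order, heavy distortion: {order_noisy}")
```

With clean data MDL recovers an order near the true value of 2; once the distortion swamps the source's correlation structure, the achievable drop in residual cost no longer pays for the (1/2) log n per-parameter penalty, and the selected order collapses toward white noise, consistent with the under-estimation the abstract describes.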