M. Tok, Rolf Jongebloed, Lieven Lange, Erik Bochinski, T. Sikora
2018 Picture Coding Symposium (PCS), June 2018. DOI: 10.1109/PCS.2018.8456250
An MSE Approach For Training And Coding Steered Mixtures Of Experts
Previous research has shown the interesting properties and potential of Steered Mixtures-of-Experts (SMoE) for image representation, approximation, and compression based on EM optimization. In this paper we introduce an MSE optimization method based on gradient descent for training SMoEs. This allows improved optimization towards PSNR and SSIM and decoupling of experts and gates. In consequence, we can now generate very high-quality SMoE models with significantly reduced model complexity compared to previous work and much improved edge representations. Based on this strategy, a block-based image coder was developed using Mixtures-of-Experts with very simple experts and very few model parameters. Experimental evaluations show that a significant compression gain can be achieved compared to JPEG at low bit rates.
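To make the training idea concrete, the sketch below fits a tiny one-dimensional steered mixture of experts to a step edge by gradient descent on the MSE: linear experts are softly blended by Gaussian gating kernels, and all parameters (expert coefficients, kernel centers, bandwidths) are optimized jointly against the squared error. Everything here is an illustrative assumption, not the paper's implementation: the toy signal, the parameterization, and the finite-difference gradients (used only to keep the sketch dependency-free; an actual system would use analytic gradients or autodiff over 2D image blocks).

```python
import numpy as np

# Toy 1D "image": a step edge, the kind of structure SMoE represents well.
x = np.linspace(0.0, 1.0, 64)
y = np.where(x < 0.5, 0.2, 0.8)

K = 2  # number of experts (illustrative choice)

def unpack(theta):
    # Per expert: intercept a, slope b, kernel center mu, log-bandwidth ls
    a, b, mu, ls = theta.reshape(4, K)
    return a, b, mu, ls

def predict(theta, x):
    a, b, mu, ls = unpack(theta)
    # Gaussian gating kernels, normalized over experts at each position
    logits = -((x[:, None] - mu[None, :]) ** 2) / (2.0 * np.exp(ls)[None, :] ** 2)
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    w = np.exp(logits)
    g = w / w.sum(axis=1, keepdims=True)
    # Linear experts, softly blended by the gates
    m = a[None, :] + b[None, :] * x[:, None]
    return (g * m).sum(axis=1)

def mse(theta):
    return np.mean((predict(theta, x) - y) ** 2)

def num_grad(f, theta, eps=1e-5):
    # Central finite differences, purely for a self-contained sketch.
    g = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = eps
        g[i] = (f(theta + e) - f(theta - e)) / (2.0 * eps)
    return g

# Rough initialization: one expert per side of the edge
theta = np.concatenate([
    np.array([0.3, 0.7]),        # intercepts
    np.zeros(K),                 # slopes
    np.array([0.25, 0.75]),      # kernel centers
    np.full(K, np.log(0.2)),     # log-bandwidths
])

mse0 = mse(theta)
lr = 0.2
for _ in range(400):
    theta -= lr * num_grad(mse, theta)

print(f"MSE before: {mse0:.5f}, after: {mse(theta):.5f}")
```

Because the gates and experts are optimized jointly against the reconstruction error rather than a likelihood, the kernel bandwidths are free to shrink around the discontinuity, which is the mechanism behind the improved edge representations the abstract reports.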