{"title":"贪婪深度字典学习用于高光谱图像分类","authors":"Snigdha Tariyal, H. Aggarwal, A. Majumdar","doi":"10.1109/WHISPERS.2016.8071740","DOIUrl":null,"url":null,"abstract":"In this work we propose a new deep learning tool — deep dictionary learning. We give an alternate neural network type interpretation to dictionary learning. Based on this, we build a deep architecture by cascading one dictionary after the other. The learning proceeds in a greedy fashion, therefore for each level we only need to learn a single layer of dictionary — time tested tools are there to solve this problem. We compare our approach to the deep belief network (DBN) and stacked autoencoder (SAE) based techniques for hyperspectral image classification. We find that in the practical scenario, when the training data is limited, our method outperforms the more established tools like SAE and DBN.","PeriodicalId":369281,"journal":{"name":"2016 8th Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS)","volume":"11 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":"{\"title\":\"Greedy deep dictionary learning for hyperspectral image classification\",\"authors\":\"Snigdha Tariyal, H. Aggarwal, A. Majumdar\",\"doi\":\"10.1109/WHISPERS.2016.8071740\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this work we propose a new deep learning tool — deep dictionary learning. We give an alternate neural network type interpretation to dictionary learning. Based on this, we build a deep architecture by cascading one dictionary after the other. The learning proceeds in a greedy fashion, therefore for each level we only need to learn a single layer of dictionary — time tested tools are there to solve this problem. We compare our approach to the deep belief network (DBN) and stacked autoencoder (SAE) based techniques for hyperspectral image classification. We find that in the practical scenario, when the training data is limited, our method outperforms the more established tools like SAE and DBN.\",\"PeriodicalId\":369281,\"journal\":{\"name\":\"2016 8th Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS)\",\"volume\":\"11 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-08-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"8\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2016 8th Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/WHISPERS.2016.8071740\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 8th Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/WHISPERS.2016.8071740","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Greedy deep dictionary learning for hyperspectral image classification
In this work we propose a new deep learning tool: deep dictionary learning. We give an alternate neural-network-style interpretation of dictionary learning and, based on it, build a deep architecture by cascading one dictionary after another. Learning proceeds in a greedy fashion, so at each level we only need to learn a single dictionary layer, a problem for which time-tested tools already exist. We compare our approach against deep belief network (DBN) and stacked autoencoder (SAE) based techniques for hyperspectral image classification, and we find that in the practical scenario, when training data is limited, our method outperforms these more established tools.
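The greedy cascade described in the abstract can be illustrated with a short sketch. This is an assumption-laden illustration, not the authors' implementation: it uses scikit-learn's DictionaryLearning as the single-layer solver (the paper's exact per-layer formulation and sparsity settings may differ), and the layer sizes, the alpha value, the synthetic data, and the SVM classifier are all illustrative choices.

```python
# Minimal sketch of greedy layer-wise deep dictionary learning.
# Layer 1 factors X ~ Z1 @ D1; the codes Z1 become the "input data"
# for layer 2, and so on. Only a single-layer dictionary-learning
# problem is solved at each level (the greedy step).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import DictionaryLearning
from sklearn.svm import SVC

def greedy_deep_dictionary_features(X, layer_sizes, alpha=0.5):
    """Greedily learn one dictionary per layer; return deepest codes."""
    codes = X
    dictionaries = []
    for n_atoms in layer_sizes:
        layer = DictionaryLearning(
            n_components=n_atoms,
            alpha=alpha,                     # sparsity weight (assumed value)
            max_iter=50,
            transform_algorithm="lasso_lars",
            random_state=0,
        )
        codes = layer.fit_transform(codes)   # coefficients of this layer
        dictionaries.append(layer)
    return codes, dictionaries

# Toy stand-in for per-pixel spectral vectors with class labels;
# a real experiment would use hyperspectral pixels (e.g., Indian Pines).
X, y = make_classification(n_samples=300, n_features=100, n_informative=30,
                           n_classes=3, random_state=0)

# Shrinking layer sizes (100 -> 60 -> 30) mimic a progressively deeper,
# more abstract representation; the exact sizes here are arbitrary.
features, _ = greedy_deep_dictionary_features(X, layer_sizes=[60, 30])

# The deepest-layer codes serve as features for a standard classifier.
clf = SVC().fit(features, y)
print("training accuracy:", clf.score(features, y))
```

Because each level reduces to an ordinary single-layer dictionary-learning problem on the previous layer's codes, no joint backpropagation through the cascade is needed, which is what makes the approach attractive when labeled training data is scarce.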