High Resolution Satellite Image Processing Using Hadoop Framework
Roshan Rajak, Deepu Raveendran, Maruthi Chandrasekhar Bh, S. Medasani
2015 IEEE International Conference on Cloud Computing in Emerging Markets (CCEM), November 2015
DOI: 10.1109/CCEM.2015.16
Complex image processing algorithms that require high computational power on large-scale inputs can be executed efficiently using the parallel, distributed processing of the Hadoop MapReduce framework. Hadoop MapReduce is a scalable model capable of processing petabytes (on the order of 10^15 bytes) of data with improved fault tolerance and data parallelism. In this paper we present a MapReduce framework for parallel processing of remote sensing satellite data using Hadoop, storing the output in HBase. The speedup and performance results show that, by utilizing Hadoop, we can distribute our workload across clusters and exploit the combined processing power of commodity hardware.
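The abstract does not include source code; the following is a minimal Java sketch of the kind of pipeline it describes, assuming satellite imagery is split into tiles stored as a SequenceFile of (tile ID, raw bytes) pairs, processed independently in mappers, and written to HBase through the standard TableReducer API. All names here (SatelliteImageJob, the "tiles" table, the "proc" column family, processTile) are illustrative assumptions, not taken from the paper.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableReducer;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat;

// Hypothetical sketch of the Hadoop + HBase pipeline the abstract describes.
public class SatelliteImageJob {

  // Mapper: each map task runs the image-processing step on one tile
  // independently, which is where the data parallelism comes from.
  public static class TileMapper
      extends Mapper<Text, BytesWritable, Text, BytesWritable> {
    @Override
    protected void map(Text tileId, BytesWritable rawTile, Context ctx)
        throws IOException, InterruptedException {
      byte[] processed = processTile(rawTile.copyBytes());
      ctx.write(tileId, new BytesWritable(processed));
    }

    // Stand-in for the paper's actual algorithm (e.g., filtering or
    // feature extraction); here it simply passes the pixels through.
    private byte[] processTile(byte[] pixels) {
      return pixels;
    }
  }

  // Reducer: stores each processed tile as one HBase row, keyed by tile ID.
  public static class TileReducer
      extends TableReducer<Text, BytesWritable, ImmutableBytesWritable> {
    @Override
    protected void reduce(Text tileId, Iterable<BytesWritable> tiles, Context ctx)
        throws IOException, InterruptedException {
      byte[] rowKey = Bytes.toBytes(tileId.toString());
      for (BytesWritable tile : tiles) {
        Put put = new Put(rowKey);
        // "proc"/"tile" is an assumed column family/qualifier.
        put.addColumn(Bytes.toBytes("proc"), Bytes.toBytes("tile"), tile.copyBytes());
        ctx.write(new ImmutableBytesWritable(rowKey), put);
      }
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    Job job = Job.getInstance(conf, "satellite-image-processing");
    job.setJarByClass(SatelliteImageJob.class);

    // Input: SequenceFile of (tileId, rawTileBytes) on HDFS.
    job.setInputFormatClass(SequenceFileInputFormat.class);
    SequenceFileInputFormat.addInputPath(job, new Path(args[0]));

    job.setMapperClass(TileMapper.class);
    job.setMapOutputKeyClass(Text.class);
    job.setMapOutputValueClass(BytesWritable.class);

    // Wires the reducer's output to the HBase table ("tiles" is illustrative).
    TableMapReduceUtil.initTableReducerJob("tiles", TileReducer.class, job);
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Because each tile is processed in its own map task, adding nodes to the cluster increases throughput roughly linearly for this embarrassingly parallel stage, which is consistent with the speedup the abstract reports.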