{"title":"在Hadoop中使用spit进行大数据压缩:多导联心电信号的案例研究","authors":"G. Jati, Ilham Kusuma, M. Hilman, W. Jatmiko","doi":"10.1109/IWBIS.2016.7872902","DOIUrl":null,"url":null,"abstract":"Compression still become main concern in big data framework. The performance of big data depend on speed of data transfer. Compressed data can speed up transfer data between network. It also save more space for storage. Several compression method is provide by Hadoop as a most common big data framework. That method mostly for general purpose. But the performance still have to optimize especially for Biomedical record like ECG data. We propose Set Partitioning in Hierarchical Tree (SPIHT) for big data compression with study case ECG signal data. In this paper compression will run in Hadoop Framework. The proposed method has stages such as input signal, map input signal, spiht coding, and reduce bit-stream. The compression produce compressed data for intermediate (Map) output and final (reduce) output. The experiment using ECG data to measure compression performance. The proposed method gets Percentage Root-mean-square difference (PRD) is about 1.0. Compare to existing method, the proposed method get better Compression Ratio (CR) with competitive longer compression time. So proposed method gets better performance compare to other method especially for ECG dataset.","PeriodicalId":193821,"journal":{"name":"2016 International Workshop on Big Data and Information Security (IWBIS)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Big data compression using spiht in Hadoop: A case study in multi-lead ECG signals\",\"authors\":\"G. Jati, Ilham Kusuma, M. Hilman, W. Jatmiko\",\"doi\":\"10.1109/IWBIS.2016.7872902\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Compression still become main concern in big data framework. The performance of big data depend on speed of data transfer. Compressed data can speed up transfer data between network. It also save more space for storage. Several compression method is provide by Hadoop as a most common big data framework. That method mostly for general purpose. But the performance still have to optimize especially for Biomedical record like ECG data. We propose Set Partitioning in Hierarchical Tree (SPIHT) for big data compression with study case ECG signal data. In this paper compression will run in Hadoop Framework. The proposed method has stages such as input signal, map input signal, spiht coding, and reduce bit-stream. The compression produce compressed data for intermediate (Map) output and final (reduce) output. The experiment using ECG data to measure compression performance. The proposed method gets Percentage Root-mean-square difference (PRD) is about 1.0. Compare to existing method, the proposed method get better Compression Ratio (CR) with competitive longer compression time. 
So proposed method gets better performance compare to other method especially for ECG dataset.\",\"PeriodicalId\":193821,\"journal\":{\"name\":\"2016 International Workshop on Big Data and Information Security (IWBIS)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2016 International Workshop on Big Data and Information Security (IWBIS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IWBIS.2016.7872902\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 International Workshop on Big Data and Information Security (IWBIS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IWBIS.2016.7872902","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Big data compression using spiht in Hadoop: A case study in multi-lead ECG signals
Compression remains a central concern in big data frameworks: big data performance depends on the speed of data transfer, and compressed data both moves across the network faster and occupies less storage. Hadoop, the most common big data framework, provides several compression methods, but they are general-purpose and leave room for optimization on biomedical records such as ECG data. We propose Set Partitioning in Hierarchical Trees (SPIHT) for big data compression, with multi-lead ECG signal data as a case study. In this paper, compression runs in the Hadoop framework. The proposed method proceeds in stages: input the signal, map the input signal, perform SPIHT coding, and reduce the bit-streams. Compression thus produces compressed data at both the intermediate (Map) output and the final (Reduce) output. Experiments use ECG data to measure compression performance. The proposed method achieves a Percentage Root-mean-square Difference (PRD) of about 1.0, and compared to existing methods it achieves a better Compression Ratio (CR) at the cost of a somewhat longer, but still competitive, compression time. Overall, the proposed method outperforms the other methods, especially on the ECG dataset.
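To make the four-stage pipeline concrete, the following is a minimal Hadoop MapReduce sketch of it in Java. The paper does not publish its implementation, so everything beyond the Hadoop API is an assumption: SpihtEncoder is a hypothetical stand-in for the actual SPIHT wavelet coder, and the input record layout (a tab-separated lead id followed by comma-separated samples) is likewise illustrative.

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

public class SpihtEcgCompression {

    // Placeholder for the SPIHT coder (wavelet transform plus
    // set-partitioning sorting/refinement passes). The paper does not
    // publish this component, so only the interface is sketched.
    static final class SpihtEncoder {
        static byte[] encode(double[] samples) {
            throw new UnsupportedOperationException("SPIHT coder not shown");
        }
    }

    // Map stage: each input line is assumed to hold one ECG lead segment as
    // "<leadId>\t<comma-separated samples>". The mapper runs SPIHT coding on
    // the segment and emits the bit-stream keyed by lead id; this is the
    // intermediate (Map) compressed output.
    public static class SpihtMapper
            extends Mapper<LongWritable, Text, Text, BytesWritable> {
        @Override
        protected void map(LongWritable offset, Text line, Context ctx)
                throws IOException, InterruptedException {
            String[] parts = line.toString().split("\t", 2);
            String[] tokens = parts[1].split(",");
            double[] samples = new double[tokens.length];
            for (int i = 0; i < tokens.length; i++) {
                samples[i] = Double.parseDouble(tokens[i].trim());
            }
            ctx.write(new Text(parts[0]),
                      new BytesWritable(SpihtEncoder.encode(samples)));
        }
    }

    // Reduce stage: concatenates the per-segment bit-streams of one lead
    // into the final (Reduce) compressed output for that lead.
    public static class BitStreamReducer
            extends Reducer<Text, BytesWritable, Text, BytesWritable> {
        @Override
        protected void reduce(Text leadId, Iterable<BytesWritable> streams,
                              Context ctx)
                throws IOException, InterruptedException {
            ByteArrayOutputStream merged = new ByteArrayOutputStream();
            for (BytesWritable bw : streams) {
                merged.write(bw.getBytes(), 0, bw.getLength());
            }
            ctx.write(leadId, new BytesWritable(merged.toByteArray()));
        }
    }
}

Keying the intermediate output by lead id lets the reducer concatenate per-segment bit-streams into one final compressed stream per lead, mirroring the intermediate (Map) and final (Reduce) compressed outputs described in the abstract.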