A prediction-based neural network scheme for lossless data compression

R. Logeswaran
{"title":"A prediction-based neural network scheme for lossless data compression","authors":"R. Logeswaran","doi":"10.1109/TSMCC.2002.806744","DOIUrl":null,"url":null,"abstract":"This paper proposes a modified block-adaptive prediction-based neural network scheme for lossless data compression. A variety of neural network models from a selection of different network types, including feedforward, recurrent, and radial basis configurations are implemented with the scheme. The scheme is further expanded with combinations of popular lossless encoding algorithms. Simulation results are presented, taking characteristic features of the models, transmission issues, and practical considerations into account to determine optimized configuration, suitable training strategies, and implementation schemes. Estimations are used for comparisons of these characteristics with the existing schemes. It is also shown that the adaptations of the improvised scheme increases performance of even the classical predictors evaluated. In addition, the results obtained support that the total processing time of the two-stage scheme can, in certain cases, be faster than just using lossless encoders. Findings of the paper may be beneficial for future work, such as, in the hardware implementations of dedicated neural chips for lossless compression.","PeriodicalId":55005,"journal":{"name":"IEEE Transactions on Systems Man and Cybernetics Part C-Applications and Re","volume":"7 1","pages":"358-365"},"PeriodicalIF":0.0000,"publicationDate":"2002-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Systems Man and Cybernetics Part C-Applications and Re","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/TSMCC.2002.806744","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 4

Abstract

This paper proposes a modified block-adaptive prediction-based neural network scheme for lossless data compression. A variety of neural network models of different types, including feedforward, recurrent, and radial basis configurations, are implemented with the scheme. The scheme is further extended by combining it with popular lossless encoding algorithms. Simulation results are presented, taking characteristic features of the models, transmission issues, and practical considerations into account, to determine optimized configurations, suitable training strategies, and implementation schemes. Estimates are used to compare these characteristics with those of existing schemes. It is also shown that the adaptations of the improved scheme increase the performance of even the classical predictors evaluated. In addition, the results obtained show that the total processing time of the two-stage scheme can, in certain cases, be shorter than that of using lossless encoders alone. The findings of the paper may be beneficial for future work, such as hardware implementations of dedicated neural chips for lossless compression.
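The abstract describes a two-stage pipeline: a predictor is adapted per block, only the prediction residuals are passed to a conventional lossless encoder, and the decoder re-runs the same predictor and adds the residuals back. The following is a minimal sketch of that idea, not the paper's method: a per-block least-squares linear predictor stands in for the neural network predictors, zlib stands in for the lossless back-end, and the names compress_block and decompress_block are illustrative.

```python
import zlib

import numpy as np


def compress_block(block: np.ndarray, order: int = 4) -> bytes:
    """Compress one integer block: block-adaptive prediction + lossless coding.

    A least-squares linear predictor (standing in for the paper's neural
    predictors) is fitted to this block only; the integer prediction
    residuals are then handed to a generic lossless encoder (zlib).
    """
    x = block.astype(np.float64)
    # Each sample is predicted from the `order` samples preceding it.
    X = np.column_stack([x[i:len(x) - order + i] for i in range(order)])
    coeffs, *_ = np.linalg.lstsq(X, x[order:], rcond=None)

    # Per-sample dot products so the decoder can reproduce the exact same
    # rounded predictions (bit-identical arithmetic in both directions).
    pred = np.array([int(np.rint(row @ coeffs)) for row in X], dtype=np.int64)
    residuals = block[order:].astype(np.int64) - pred

    # Header: the first `order` seed samples plus the predictor coefficients.
    header = block[:order].astype(np.int64).tobytes() + coeffs.tobytes()
    return header + zlib.compress(residuals.tobytes())


def decompress_block(data: bytes, n: int, order: int = 4) -> np.ndarray:
    """Invert compress_block: re-run the predictor and add the residuals."""
    seed = np.frombuffer(data[:8 * order], dtype=np.int64)
    coeffs = np.frombuffer(data[8 * order:16 * order], dtype=np.float64)
    residuals = np.frombuffer(zlib.decompress(data[16 * order:]), dtype=np.int64)

    out = np.empty(n, dtype=np.int64)
    out[:order] = seed
    for i in range(order, n):
        pred = int(np.rint(out[i - order:i].astype(np.float64) @ coeffs))
        out[i] = pred + residuals[i - order]
    return out


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # A smooth, slightly noisy integer signal stands in for one data block.
    t = np.linspace(0.0, 8.0, 4096)
    signal = np.rint(1000.0 * np.sin(t) + rng.normal(0.0, 2.0, t.size)).astype(np.int64)

    packed = compress_block(signal)
    restored = decompress_block(packed, signal.size)

    assert np.array_equal(signal, restored)  # reconstruction is exactly lossless
    print(f"{signal.nbytes} raw bytes -> {len(packed)} compressed bytes")
```

Rounding the prediction before taking the residual is what keeps the pipeline lossless: the decoder reproduces the identical rounded prediction from already-decoded samples, so adding the stored residual recovers each sample exactly. Swapping the linear fit for one of the paper's neural predictors would change only the prediction stage.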