A novel approach for high-velocity big geo-data handling using iterative and feature learning algorithms

Sana Rekik, Sami Faïz
{"title":"A novel approach for high-velocity big geo-data handling using iterative and feature learning algorithms","authors":"Sana Rekik, Sami Faïz","doi":"10.1109/FMEC49853.2020.9144893","DOIUrl":null,"url":null,"abstract":"Geospatial data were exclusively generated by official agencies. However, following the technological revolution in data collection and production, various sources have emerged for the massive production of geospatial data, resulting the phenomenon of big geo-data. Therefore, dealing with large amounts of these data sets, results in a high velocity as they change very quickly, is a challenging task. Hence, analysis become more complex and computation become prohibitively expensive. As a result, spatial computing technologies become limited in front of these complex data and operations. Accordingly, we aimed to refine complexity with simplicity by replacing traditional geospatial models with referring to the simplest intelligent and minimum resource requirement algorithms that can be applied against these constraints, while ensuring the criteria of performance and scalability. In this work, we focus on the high-velocity of this big geo-data through the use of an iterative approach applied to a feature learning algorithms to decrease the memory consumption and the time complexity of traditional machine learning algorithms. According to our knowledge, although they were widely applied in the 19th century as a solution to overcome the problems of limitation of memory and computing resources. Iterative methods were still not used for the big geo-data analytics and generally for the big data domain. Thus, this approach could be beneficial especially for real time applications such as the anomaly monitoring and detection.","PeriodicalId":110283,"journal":{"name":"2020 Fifth International Conference on Fog and Mobile Edge Computing (FMEC)","volume":"8 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 Fifth International Conference on Fog and Mobile Edge Computing (FMEC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/FMEC49853.2020.9144893","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Geospatial data were once generated exclusively by official agencies. However, following the technological revolution in data collection and production, numerous sources have emerged that massively produce geospatial data, resulting in the phenomenon of big geo-data. Dealing with these large data sets, which arrive at high velocity and change very quickly, is therefore a challenging task: analysis becomes more complex and computation becomes prohibitively expensive. As a result, spatial computing technologies are limited when faced with such complex data and operations. Accordingly, we aim to replace traditional geospatial models with the simplest intelligent, minimum-resource algorithms that can operate under these constraints while preserving performance and scalability. In this work, we address the high velocity of big geo-data by applying an iterative approach to a feature learning algorithm, reducing the memory consumption and time complexity of traditional machine learning algorithms. To the best of our knowledge, although iterative methods were widely applied in the 19th century as a way to overcome limited memory and computing resources, they have not yet been used for big geo-data analytics, or more generally in the big data domain. This approach could therefore be especially beneficial for real-time applications such as anomaly monitoring and detection.
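The abstract does not name the specific feature learning algorithm used. As a hedged illustration of the general idea, the sketch below uses scikit-learn's IncrementalPCA as a stand-in for an iterative, memory-bounded feature learning step that consumes a high-velocity data stream one chunk at a time; the chunk sizes, dimensions, and function names are hypothetical, not the authors' implementation.

```python
# A minimal sketch (assumption: incremental PCA as the iterative feature
# learning step). Each chunk of the geo-data stream is processed separately,
# so only one chunk needs to reside in memory at a time.
import numpy as np
from sklearn.decomposition import IncrementalPCA

def iterative_feature_learning(chunks, n_components=8):
    """Fit a feature extractor one chunk at a time (chunks: iterable of 2-D arrays)."""
    ipca = IncrementalPCA(n_components=n_components)
    for chunk in chunks:
        # partial_fit updates the learned components using only the current
        # chunk, keeping memory usage bounded by the chunk size.
        ipca.partial_fit(chunk)
    return ipca

if __name__ == "__main__":
    # Simulated high-velocity geospatial stream: 10 chunks of 1,000 records
    # with 20 raw attributes each (hypothetical dimensions).
    rng = np.random.default_rng(0)
    stream = (rng.normal(size=(1_000, 20)) for _ in range(10))
    model = iterative_feature_learning(stream)

    # Incoming records can then be mapped into the learned low-dimensional
    # feature space, e.g. as input to a real-time anomaly detector.
    new_batch = rng.normal(size=(100, 20))
    features = model.transform(new_batch)
    print(features.shape)  # (100, 8)
```

The design point this illustrates is the one claimed in the abstract: by iterating over chunks instead of loading the whole data set, memory consumption stays proportional to the chunk size rather than to the full volume of the stream.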