A Scalable, Cloud-Based Workflow for Spectrally-Attributed ICESat-2 Bathymetry With Application to Benthic Habitat Mapping Using Deep Learning

IF 2.9 · CAS Tier 3 (Earth Science) · JCR Q2 (Astronomy & Astrophysics)
Forrest Corcoran, Christopher E. Parrish, Lori A. Magruder, J. P. Swinski
Journal: Earth and Space Science
DOI: 10.1029/2024EA003735
Published: 2024-10-29 (Journal Article)
Open access PDF: https://onlinelibrary.wiley.com/doi/epdf/10.1029/2024EA003735
Citations: 0

Abstract

Since the 2018 launch of NASA's ICESat-2 satellite, numerous studies have documented the bathymetric measurement capabilities of the space-based laser altimeter. However, a commonly identified limitation of ICESat-2 bathymetric point clouds is that they lack accompanying spectral reflectance attributes, or even intensity values, which have been found useful for benthic habitat mapping with airborne bathymetric lidar. We present a novel method for extracting bathymetry from ICESat-2 data and automatically adding spectral reflectance values from Sentinel-2 imagery to each detected bathymetric point. This method, which leverages the cloud computing systems Google Earth Engine and NASA's SlideRule Earth, is ideally suited for “big data” projects with ICESat-2 data products. To demonstrate the scalability of our workflow, we collected 3,500 ICESat-2 segments containing approximately 1.4 million spectrally-attributed bathymetric points. We then used this data set to facilitate training of a deep recurrent neural network for classifying benthic habitats at the ICESat-2 photon level. We trained two identical models, one with and one without the spectral attributes, to investigate the benefits of fusing ICESat-2 photons with Sentinel-2. The results show an improvement in model performance of 18 percentage points, based on F1 score. The procedures and source code are publicly available and will enhance the value of the new ICESat-2 bathymetry data product, ATL24, which is scheduled for release in Fall 2024. These procedures may also be applicable to data from NASA's upcoming CASALS mission.
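The core of the workflow is attaching, to each bathymetric photon, the reflectance of the Sentinel-2 pixel it falls in. The paper does this in the cloud (photons via NASA's SlideRule Earth, imagery via Google Earth Engine); the sketch below stands in for both services with small hypothetical arrays so the nearest-pixel attribution logic itself can be run locally. All names, grid values, and the pixel size are illustrative, not the authors' implementation.

```python
def attribute_photons(photons, grid, origin, pixel_size):
    """Attach the reflectance of the enclosing imagery pixel to each photon.

    photons    : list of dicts with 'lon', 'lat', 'depth'
    grid       : 2D list [row][col] of per-pixel band tuples, row 0 at origin latitude
    origin     : (lon0, lat0) of the grid's upper-left corner
    pixel_size : degrees per pixel (Sentinel-2 is ~10 m; the value here is illustrative)
    """
    lon0, lat0 = origin
    attributed = []
    for p in photons:
        col = int((p["lon"] - lon0) / pixel_size)
        row = int((lat0 - p["lat"]) / pixel_size)   # latitude decreases with row index
        if 0 <= row < len(grid) and 0 <= col < len(grid[0]):
            bands = grid[row][col]
        else:
            bands = None  # photon falls outside the imagery footprint
        attributed.append({**p, "bands": bands})
    return attributed

# Hypothetical 2x2 grid of (blue, green, red) surface reflectance values.
grid = [[(0.02, 0.05, 0.03), (0.03, 0.06, 0.04)],
        [(0.01, 0.04, 0.02), (0.02, 0.05, 0.03)]]
photons = [{"lon": -80.0005, "lat": 24.9995, "depth": -3.2}]
out = attribute_photons(photons, grid, origin=(-80.001, 25.0), pixel_size=0.001)
print(out[0]["bands"])  # reflectance of the pixel containing the photon
```

In the actual workflow the photon coordinates come from a SlideRule Earth query and the band values from a Google Earth Engine sampling call, but the per-photon lookup reduces to this kind of coordinate-to-pixel indexing.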

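The abstract's classifier reads attributed photons as a sequence and labels each one with a habitat class. The toy sketch below shows the shape of that idea only: a single recurrent layer stepping through a segment's photons in along-track order, with a per-photon feature vector of depth plus three reflectance bands. The layer sizes, class names, and random weights are assumptions for illustration; the paper's network is trained on roughly 1.4 million attributed photons.

```python
import numpy as np

# Illustrative recurrent classifier, not the authors' architecture.
rng = np.random.default_rng(0)
n_feat, n_hidden, n_classes = 4, 8, 3   # features: depth + 3 bands; classes hypothetical
Wx = rng.normal(0, 0.1, (n_hidden, n_feat))    # input-to-hidden weights
Wh = rng.normal(0, 0.1, (n_hidden, n_hidden))  # hidden-to-hidden (recurrent) weights
Wo = rng.normal(0, 0.1, (n_classes, n_hidden)) # hidden-to-output weights

def classify_segment(seq):
    """seq: (T, n_feat) array of photons; returns a (T,) array of class indices."""
    h = np.zeros(n_hidden)
    labels = []
    for x in seq:
        h = np.tanh(Wx @ x + Wh @ h)   # hidden state carries along-track context
        logits = Wo @ h
        labels.append(int(np.argmax(logits)))
    return np.array(labels)

segment = rng.normal(0, 1, (5, n_feat))  # five photons with depth + 3 bands
print(classify_segment(segment))
```

The recurrence is what makes the classification photon-level yet context-aware: each label depends not only on that photon's depth and reflectance but on the photons preceding it along the track.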

Source journal: Earth and Space Science (Earth and Planetary Sciences — General)
CiteScore: 5.50
Self-citation rate: 3.20%
Articles per year: 285
Review time: 19 weeks
Journal description: As AGU's second new open access journal in the last 12 months, Earth and Space Science is the only journal that reflects the expansive range of science represented by AGU's 62,000 members, including all of the Earth, planetary, and space sciences, and related fields in environmental science, geoengineering, space engineering, and biogeochemistry.