Multi-modal DEM super-resolution using relative depth: A new benchmark and beyond

Impact Factor: 8.6 · Q1, Remote Sensing
Wenjun Huang, Qun Sun, Wenyue Guo, Qing Xu, Bowei Wen, Tian Gao, Anzhu Yu
DOI: 10.1016/j.jag.2025.104865
Journal: International journal of applied earth observation and geoinformation : ITC journal, vol. 144, Article 104865
Published: 2025-09-23 (Journal Article)
Citations: 0

Abstract

Learning-based Digital Elevation Model (DEM) super-resolution (SR) remains a challenge due to the complexity of real-world terrains. Existing approaches typically treat DEMs as digital grids or triangulated irregular networks, solving numerical fitting problems to densify points through learning models. However, these methods often overlook the spatial context and structural textures inherent in the terrain. To address this limitation, we propose utilizing relative depth maps derived from open-source remote sensing images by a foundational Depth Anything Model (DAM), which provide complementary structural information about the terrain and enhance the elevation details in DEMs. A novel DEMSR dataset, DEM-OPT-Depth SR, is constructed, pairing open-source remote sensing images, DEMs, and their corresponding relative depth maps. Additionally, we present a benchmark method, the Multi-modal Fusion Super-Resolution (MFSR) network, which extracts features through multi-branch pseudo-Siamese networks and performs multi-scale feature fusion. Extensive experiments on the DEM-OPT-Depth SR dataset demonstrate a 24.63% improvement in RMSE-Elevation, a 22.05% improvement in RMSE-Slope, and an 11.44% improvement in RMSE-Aspect, showing the superiority and generalization capabilities of the MFSR model over previously proposed state-of-the-art baselines in DEMSR tasks. The code and dataset can be accessed at https://github.com/hwj0711/MFSR.
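The three reported metrics (RMSE-Elevation, RMSE-Slope, RMSE-Aspect) can be illustrated with a short NumPy sketch. This is not the paper's evaluation code: the 30 m cell size, the toy tilted-plane "terrain", the finite-difference slope/aspect formulas, and the circular wrap for aspect differences are all assumptions made for the example.

```python
import numpy as np

def slope_aspect(dem, cellsize=30.0):
    """Derive slope and aspect (both in degrees) from a DEM grid via
    finite differences; cellsize is the assumed grid spacing in metres."""
    dz_dy, dz_dx = np.gradient(dem, cellsize)          # gradients along rows, cols
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    aspect = np.degrees(np.arctan2(dz_dy, -dz_dx)) % 360.0
    return slope, aspect

def rmse(a, b):
    return float(np.sqrt(np.mean(np.square(a - b))))

# Hypothetical comparison of a super-resolved DEM against ground truth:
# a tilted plane stands in for real terrain, Gaussian noise for SR error.
rng = np.random.default_rng(0)
gt = np.outer(np.linspace(0.0, 100.0, 64), np.ones(64))
pred = gt + rng.normal(0.0, 1.0, gt.shape)

rmse_elev = rmse(pred, gt)                             # RMSE-Elevation (metres)
s_gt, a_gt = slope_aspect(gt)
s_pr, a_pr = slope_aspect(pred)
rmse_slope = rmse(s_pr, s_gt)                          # RMSE-Slope (degrees)
# Aspect is circular: wrap differences into (-180, 180] before averaging.
a_diff = (a_pr - a_gt + 180.0) % 360.0 - 180.0
rmse_aspect = float(np.sqrt(np.mean(np.square(a_diff))))
```

Slope and aspect penalize errors in the DEM's derivatives rather than its raw values, which is why the paper reports them alongside plain elevation RMSE.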
Source journal

International journal of applied earth observation and geoinformation : ITC journal
Subject areas: Global and Planetary Change; Management, Monitoring, Policy and Law; Earth-Surface Processes; Computers in Earth Sciences
CiteScore: 12.00
Self-citation rate: 0.00%
Articles published: 0
Review time: 77 days
Journal description: The International Journal of Applied Earth Observation and Geoinformation publishes original papers that utilize earth observation data for natural resource and environmental inventory and management. These data primarily originate from remote sensing platforms, including satellites and aircraft, supplemented by surface and subsurface measurements. Addressing natural resources such as forests, agricultural land, soils, and water, as well as environmental concerns like biodiversity, land degradation, and hazards, the journal explores conceptual and data-driven approaches. It covers geoinformation themes like capturing, databasing, visualization, interpretation, data quality, and spatial uncertainty.