T. Quaife, E. Pinnington, P. Marzahn, T. Kaminski, M. Vossbeck, J. Timmermans, C. Isola, B. Rommen, A. Loew
Synergistic retrievals of leaf area index and soil moisture from Sentinel-1 and Sentinel-2
International Journal of Image and Data Fusion, vol. 14, no. 1, pp. 225-242 (Journal Article, published 2022-12-01; JCR Q3, Remote Sensing; Impact Factor 1.8)
DOI: 10.1080/19479832.2022.2149629 (https://doi.org/10.1080/19479832.2022.2149629)
Cited by: 1
Abstract
Joint retrieval of vegetation status from synthetic aperture radar (SAR) and optical data holds much promise due to the complementary nature of the information in the two wavelength domains. SAR penetrates the canopy and carries information about the water status of the soil and vegetation, whereas optical data contains information about the amount and health of leaves. However, due to the inherent complexities of combining these data sources, there has been relatively little progress in the joint retrieval of information over vegetation canopies. In this study, data from Sentinel-1 and Sentinel-2 were used to invert coupled radiative transfer models to provide synergistic retrievals of leaf area index and soil moisture. Results for leaf area index are excellent and are enhanced by the use of both data sources (RMSE is always less than, and correlation better than, when using both together), but results for soil moisture are mixed, with joint retrievals generally showing the lowest RMSE but underestimating the variability of the field data. Examples of such synergistic retrieval of plant properties from optical and SAR data using physically based radiative transfer models are uncommon in the literature, but these results highlight the potential of this approach.
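The abstract describes inverting coupled forward models so that a single state vector (leaf area index, soil moisture) simultaneously explains both the optical and the SAR observations. A minimal sketch of that joint-inversion idea is shown below; note that the analytic forward models here are hypothetical stand-ins invented for illustration, not the physically based radiative transfer models coupled in the study:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical toy forward models -- NOT the paper's coupled radiative
# transfer models; simple analytic stand-ins to illustrate joint inversion.

def optical_model(lai):
    # Reflectance-like signal that saturates with leaf area index
    return 0.5 * (1.0 - np.exp(-0.6 * lai))

def sar_model(lai, sm):
    # Backscatter-like signal: soil-moisture term attenuated by the
    # canopy, plus a vegetation contribution
    canopy_att = np.exp(-0.4 * lai)
    return canopy_att * (0.8 * sm) + 0.2 * (1.0 - canopy_att)

def joint_cost(x, obs_opt, obs_sar, sigma_opt=0.01, sigma_sar=0.02):
    # Weighted least-squares misfit over BOTH sensors: the optical
    # residual constrains LAI, the SAR residual then constrains SM
    lai, sm = x
    r_opt = (optical_model(lai) - obs_opt) / sigma_opt
    r_sar = (sar_model(lai, sm) - obs_sar) / sigma_sar
    return r_opt**2 + r_sar**2

# Synthetic "truth" and the observations it generates
true_lai, true_sm = 2.5, 0.3
obs_opt = optical_model(true_lai)
obs_sar = sar_model(true_lai, true_sm)

# Invert both observations jointly for the shared state (LAI, SM)
res = minimize(joint_cost, x0=[1.0, 0.5], args=(obs_opt, obs_sar),
               bounds=[(0.0, 8.0), (0.0, 1.0)])
lai_hat, sm_hat = res.x
print(f"retrieved LAI={lai_hat:.2f}, SM={sm_hat:.2f}")
```

In this noise-free sketch the optimiser recovers the state used to generate the observations; the abstract's point is that with real data the optical channel strongly constrains leaf area index while the soil-moisture estimate, seen only through the attenuated SAR signal, is harder to retrieve.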
Journal overview:
International Journal of Image and Data Fusion provides a single source of information for all aspects of image and data fusion methodologies, developments, techniques and applications. Image and data fusion techniques are important for combining the many sources of satellite, airborne and ground-based imaging systems, and for integrating these with other related data sets for enhanced information extraction and decision making. Image and data fusion aims at the integration of multi-sensor, multi-temporal, multi-resolution and multi-platform image data, together with geospatial data, GIS, in-situ and other statistical data sets, for improved information extraction and increased reliability of the information. This leads to more accurate information that provides for robust operational performance, i.e. increased confidence, reduced ambiguity and improved classification enabling evidence-based management.

The journal welcomes original research papers, review papers, shorter letters, technical articles, book reviews and conference reports in all areas of image and data fusion including, but not limited to, the following aspects and topics:
• Automatic registration/geometric aspects of fusing images with different spatial, spectral or temporal resolutions; phase information; or acquired in different modes
• Pixel, feature and decision level fusion algorithms and methodologies
• Data assimilation: fusing data with models
• Multi-source classification and information extraction
• Integration of satellite, airborne and terrestrial sensor systems
• Fusing temporal data sets for change detection studies (e.g. for Land Cover/Land Use Change studies)
• Image and data mining from multi-platform, multi-source, multi-scale, multi-temporal data sets (e.g. geometric information, topological information, statistical information, etc.)