{"title":"Multi-Resolution-Based Deep Learning Approach for Rice Field Monitoring","authors":"Yasir Afaq, Ankush Manocha","doi":"10.1080/07038992.2021.2010036","DOIUrl":"https://doi.org/10.1080/07038992.2021.2010036","url":null,"abstract":"Abstract In India, agribusiness depends directly on precise monitoring of paddy areas so that timely supportive actions can be taken toward food security. Satellite-based data is considered one of the most effective solutions for this purpose. The goal of this study is to design an intelligent framework that determines crop area using easily available satellite data. In this article, a Multi-resolution Deep Neural Network (MR-DNN) is proposed to identify rice fields through multi-stream classification. Prediction is performed on high-spatial-resolution Landsat 8 satellite images. The prediction performance of the proposed model is validated by comparison against several selected methods. The proposed model achieved the highest prediction performance in terms of F1 score, with accuracies of 95.40% and 95.12% on the Punjab and West Bengal datasets, outperforming the selected models: DeepLabV3+, Convolutional Neural Network (CNN), Support Vector Machine (SVM), Random Forest (RF), Light-Gradient Boosting Method (LGBM), eXtreme Gradient Boosting (XGBoost), Spectral, and Threshold. 
The empirical evaluation further characterizes the prediction performance of the proposed model through visual interpretation of the output maps and analysis of seasonal impacts.","PeriodicalId":48843,"journal":{"name":"Canadian Journal of Remote Sensing","volume":null,"pages":null},"PeriodicalIF":2.6,"publicationDate":"2022-01-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48075865","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
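The F1 score used above to rank the models combines precision and recall into a single number. A minimal illustrative sketch (not the authors' evaluation code; the confusion-matrix counts are invented):

```python
# Compute precision, recall, and F1 from confusion-matrix counts.
# tp = true positives, fp = false positives, fn = false negatives.

def precision_recall_f1(tp, fp, fn):
    """Return (precision, recall, F1); each is 0.0 when undefined."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return precision, recall, f1

# Invented counts for a rice/non-rice classification:
p, r, f1 = precision_recall_f1(tp=954, fp=46, fn=46)
print(f"precision={p:.4f} recall={r:.4f} F1={f1:.4f}")
```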
{"title":"Pix2Pix Network to Estimate Agricultural Near Infrared Images from RGB Data","authors":"Daniel Caio de Lima, D. Saqui, S. A. T. Mpinda, J. H. Saito","doi":"10.1080/07038992.2021.2016056","DOIUrl":"https://doi.org/10.1080/07038992.2021.2016056","url":null,"abstract":"Abstract Remote sensing has been applied to agriculture, making it possible to acquire large amounts of crop data from a distance and providing information for producers' decision making that can impact production costs and crop quality. One way of obtaining this information is through vegetation indices: arithmetic operations on spectral bands, especially the Near Infrared (NIR). However, sensors that capture this spectral information are too expensive for small producers to afford. In a previous article, a pixel-to-pixel image synthesis model that estimates NIR images from RGB data using hyperspectral endmembers (pure hyperspectral signatures) was described. In this work, an image-to-image synthesis model, known as Pix2Pix, is used to estimate NIR images from low-cost RGB camera images. Pix2Pix is a kind of Generative Adversarial Network (GAN), composed of two competing neural networks: a generator (G) and a discriminator (D). G learns to create images from random noise inputs, and D learns to judge whether these images are real or fake. 
The results showed that the presented method generated NIR images quite similar to the real ones, reaching 0.912 on the M3SIM similarity metric and outperforming the previous endmember-based method (0.775).","PeriodicalId":48843,"journal":{"name":"Canadian Journal of Remote Sensing","volume":null,"pages":null},"PeriodicalIF":2.6,"publicationDate":"2022-01-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42034202","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Regional Crop Characterization Using Multi-Temporal Optical and Synthetic Aperture Radar Earth Observations Data","authors":"Hazhir Bahrami, Saeid Homayouni, H. Mcnairn, M. Hosseini, M. Mahdianpari","doi":"10.1080/07038992.2021.2011180","DOIUrl":"https://doi.org/10.1080/07038992.2021.2011180","url":null,"abstract":"Abstract Crop biophysical parameters, such as Leaf Area Index (LAI) and biomass, are essential for estimating crop productivity, yield modeling, and agronomic management. This study used several features extracted from multi-temporal Sentinel-1 Synthetic Aperture Radar (SAR) data and spectral vegetation indices extracted from Sentinel-2 optical data to estimate crop LAI and wet and dry biomass. Various machine learning algorithms, including Random Forest Regression (RFR), Support Vector Regression (SVR), and Artificial Neural Network (ANN), were trained and assessed for three major crops (wheat, soybean, and canola). ANN provided the best accuracy for all wheat parameters, for soybean LAI, and for canola LAI and wet biomass. RFR led to higher accuracy for soybean wet and dry biomass, whereas SVR accurately estimated only canola dry biomass. All data were then pooled to investigate whether a single algorithm could estimate biophysical parameters for all crops; in this scenario, the RFR model accurately estimated wet and dry biomass and LAI across all crop types. 
This generic model is fast and accurate and can be easily applied for crop mapping and monitoring over large geographies using cloud computing platforms, such as Google Earth Engine.","PeriodicalId":48843,"journal":{"name":"Canadian Journal of Remote Sensing","volume":null,"pages":null},"PeriodicalIF":2.6,"publicationDate":"2022-01-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"43144627","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Application of Sentinel-1 SAR Data for Detecting a Nuclear Test Location in North Korea","authors":"Asset Akhmadiya, K. Moldamurat, M. Jamshidi, S. Brimzhanova, N. Nabiyev, Aigerim Kismanova","doi":"10.1080/07038992.2021.2025348","DOIUrl":"https://doi.org/10.1080/07038992.2021.2025348","url":null,"abstract":"Abstract Sentinel-1 C-band radar data were applied for the first time to determine the location of a nuclear test, the underground H-bomb explosion carried out in North Korea on September 3, 2017, and its affected zone. The test location was identified from line-of-sight displacement images as the point of maximum displacement. In this research, three scenes of Sentinel-1B data acquired in descending orbits, one after and two before the event (the nuclear test date), were used. The test location was found 8 km northeast of the Punggye-ri nuclear test site, at geographic coordinates 41°11′1.85″N, 129°13′28.86″E. It was revealed that only the pair of Sentinel-1B radar scenes from August 17 and September 10 could be successfully applied to detect the nuclear test zone.","PeriodicalId":48843,"journal":{"name":"Canadian Journal of Remote Sensing","volume":null,"pages":null},"PeriodicalIF":2.6,"publicationDate":"2022-01-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45523059","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Detecting White Pine Needle Damage through Satellite Remote Sensing","authors":"Aaron Meneghini, Parinaz Rahimzadeh-Bajgiran, W. Livingston, A. Weiskittel","doi":"10.1080/07038992.2021.2023317","DOIUrl":"https://doi.org/10.1080/07038992.2021.2023317","url":null,"abstract":"Abstract Eastern white pines (Pinus strobus L.) of New England forests have recently been impacted by a fungal disease known as White Pine Needle Damage (WPND), which has caused widespread damage. To complement current WPND monitoring methods based on field and aerial detection surveys, we evaluated the potential of satellite remote sensing technology to detect WPND outbreaks. Using Sentinel-2 spectral vegetation indices (SVIs), we directly visualized change overlapping WPND outbreaks and ran Random Forest machine-learning classifiers for feature selection, WPND detection, and severity classification. Direct visualization of WPND-associated change was most effective through the Normalized Difference Infrared Index (NDII), which captured decreases in vegetation health coinciding with peak WPND symptoms. We obtained good accuracies in binary (WPND vs. non-WPND) detection (70%) and in two-class severity modeling of WPND (75%). The highest accuracies were achieved using imagery from early to late summer. The SVIs most frequently selected for modeling were the Carotenoid Reflectance Index 1 (CRI1), the Sentinel-2 Red-Edge Position (S2REP), and the Normalized Difference Vegetation Index (NDVI). Our results suggest that detecting severe WPND through fine-resolution remote sensing is feasible. 
However, more work is needed to determine the effects of spatial, spectral, and temporal resolution of remote sensing data for detecting WPND severity levels.","PeriodicalId":48843,"journal":{"name":"Canadian Journal of Remote Sensing","volume":null,"pages":null},"PeriodicalIF":2.6,"publicationDate":"2022-01-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"43666820","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
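Two of the indices named in the abstract above have simple closed forms. A minimal sketch using the standard Sentinel-2 band assignments (B4 = red, B8 = NIR, B11 = SWIR); the reflectance values are invented for illustration:

```python
# Normalized-difference vegetation indices from Sentinel-2 reflectances.

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red)."""
    return (nir - red) / (nir + red)

def ndii(nir, swir):
    """Normalized Difference Infrared Index: (NIR - SWIR) / (NIR + SWIR).
    Sensitive to canopy water content, hence its use for needle damage."""
    return (nir - swir) / (nir + swir)

# Invented surface reflectances for a healthy and a stressed pixel:
healthy = {"B4": 0.05, "B8": 0.45, "B11": 0.20}
stressed = {"B4": 0.10, "B8": 0.30, "B11": 0.25}

for label, px in (("healthy", healthy), ("stressed", stressed)):
    print(label,
          f"NDVI={ndvi(px['B8'], px['B4']):.3f}",
          f"NDII={ndii(px['B8'], px['B11']):.3f}")
```

Both indices drop for the stressed pixel, which is the kind of decrease the NDII visualization in the study captures.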
{"title":"UAV and High Resolution Satellite Mapping of Forage Lichen (Cladonia spp.) in a Rocky Canadian Shield Landscape","authors":"R. H. Fraser, D. Pouliot, Jurjen van der Sluijs","doi":"10.1080/07038992.2021.1908118","DOIUrl":"https://doi.org/10.1080/07038992.2021.1908118","url":null,"abstract":"Abstract Reindeer lichens (Cladonia spp.) are an important food source for woodland and barren ground caribou herds. In this study, we assessed Cladonia classification accuracy in a rocky, Canadian Shield landscape near Yellowknife, Northwest Territories using both Unmanned Aerial Vehicle (UAV) sensors and high-resolution satellite sensors. At the UAV scale, random forest classifications derived from a multispectral, visible-near infrared sensor (Micasense Altum) had an average 5% higher accuracy for mapping Cladonia (i.e., 95.5%) than when using a conventional color RGB camera (DJI Phantom 4 RTK). We aggregated Altum lichen classifications from three 5 ha study sites to train random forest regression models of fractional lichen cover using predictor features from WorldView-3 and Planet CubeSat satellite imagery. WorldView models at 6 m resolution had an average 6.8% RMSE (R² = 0.61) when tested at independent study sites and outperformed the 6 m Planet models, which had a 9.9% RMSE (R² = 0.34). 
These satellite results are comparable to previous lichen-mapping studies focused on woodlands, but the small cover of Cladonia in our study area (11.6% overall, or 16.8% within the barren portions) results in a high relative RMSE (62.2%) when expressed as a proportion of mean lichen cover.","PeriodicalId":48843,"journal":{"name":"Canadian Journal of Remote Sensing","volume":null,"pages":null},"PeriodicalIF":2.6,"publicationDate":"2022-01-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1080/07038992.2021.1908118","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"41697287","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
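The error metrics quoted in this abstract (RMSE, R², and relative RMSE as a percentage of the mean observed cover) can be sketched as follows; the cover values are invented for illustration, not the study's data:

```python
import math

def rmse(obs, pred):
    """Root-mean-square error between observed and predicted values."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def r_squared(obs, pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

obs = [5.0, 10.0, 15.0, 20.0, 30.0]   # invented observed % lichen cover
pred = [7.0, 9.0, 18.0, 17.0, 27.0]   # invented modeled % lichen cover

err = rmse(obs, pred)
rel = 100.0 * err / (sum(obs) / len(obs))  # relative RMSE, % of mean cover
print(f"RMSE={err:.2f}%  R2={r_squared(obs, pred):.2f}  relative RMSE={rel:.1f}%")
```

A low mean cover inflates the relative RMSE even when the absolute RMSE is modest, which is exactly the effect the abstract notes.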
{"title":"41st Canadian Symposium on Remote Sensing Special Issue: A Virtual Conference","authors":"C. Hopkinson, C. Coburn, L. Chasmer","doi":"10.1080/07038992.2022.2024683","DOIUrl":"https://doi.org/10.1080/07038992.2022.2024683","url":null,"abstract":"Dedication This Special Issue is dedicated to the memory of our friend and colleague, Dr. Martin Isenburg. Martin made valuable and colorful contributions to our symposium by hosting a workshop and giving a video presentation from his home and 'laser' chicken farm in Costa Rica. The creator of the widely popular LAStools software, and an avid traveler and trainer in the international lidar community, he fell victim to the global pandemic in 2021. He will be sadly missed by all whose lives he touched.","PeriodicalId":48843,"journal":{"name":"Canadian Journal of Remote Sensing","volume":null,"pages":null},"PeriodicalIF":2.6,"publicationDate":"2022-01-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46107737","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Spatially Explicit Abundance Modeling of a Highly Specialized Wetland Bird Using Sentinel-1 and Sentinel-2 Modélisation spatialement explicite de l’abondance d’un oiseau très spécifique aux zones humides à l’aide de Sentinel-1 et de Sentinel-2","authors":"L. McLeod, Evan R. DeLancey, Erin M. Bayne","doi":"10.1080/07038992.2021.2014797","DOIUrl":"https://doi.org/10.1080/07038992.2021.2014797","url":null,"abstract":"Abstract The Yellow Rail (Coturnicops noveboracensis) is a highly specialized, wetland-obligate bird. It is a species at risk in Canada, and very little is known about its abundance in the wetlands of the western boreal forest. Emerging technologies have enabled us to effectively survey for Yellow Rail and other wetland birds in remote areas by using ground-based remote sensors (autonomous recording units; ARUs) to conduct passive acoustic monitoring. We analyzed bird data from the first four years (2013–2016) of an ongoing monitoring program led by the Bioacoustic Unit at the Alberta Biodiversity Monitoring Institute. We developed species abundance models using satellite data from Sentinel-1 and Sentinel-2 processed in Google Earth Engine. We identified covariates from both synthetic aperture radar and optical remote sensing that had strong predictive capacity for this wetland bird (AUC = 0.96). 
Approximately 1.5% of available wetland habitat in our northeast Alberta study area was predicted to be highly suitable for Yellow Rail.","PeriodicalId":48843,"journal":{"name":"Canadian Journal of Remote Sensing","volume":null,"pages":null},"PeriodicalIF":2.6,"publicationDate":"2022-01-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45286735","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
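The AUC reported above can be illustrated with its rank-based (Mann-Whitney) interpretation: the probability that a randomly chosen positive site receives a higher model score than a randomly chosen negative one. A dependency-free sketch with invented labels and scores (not the authors' modeling code):

```python
# ROC AUC via pairwise comparison of positive and negative scores.

def roc_auc(labels, scores):
    """labels: 1 = species detected, 0 = not detected; scores: model output.
    Ties between a positive and a negative count as half a win."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Invented survey sites: 3 detections, 4 non-detections.
labels = [1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.2, 0.1]
print(f"AUC = {roc_auc(labels, scores):.3f}")
```

This O(n·m) pairwise form is fine for a sketch; production code would use a sorted-rank formulation or a library routine.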
{"title":"Testing ASTER and Sentinel-2 MSI Images to Discriminate Igneous and Metamorphic Rock Units in the Chadormalu Paleocrater, Central Iran","authors":"A. Moghtaderi, F. Moore, Hojatollah Ranjbar","doi":"10.1080/07038992.2021.1997347","DOIUrl":"https://doi.org/10.1080/07038992.2021.1997347","url":null,"abstract":"Abstract In the last fifty years, satellite images have been used to map the Earth’s surface at a variety of scales. Two satellite multispectral sensors (Sentinel-2 MSI and ASTER) have great utility for lithological discrimination in areas of good rock exposure. This study was conducted to test the ability of these sensors to discriminate igneous and metamorphic lithologies in the Chadormalu paleocrater and to evaluate image types and processing methodologies. The Minimum Noise Fraction (MNF) transform, Mathematical Evaluation Method (MEM), Spectral Angle Mapper (SAM), Mixture Tuned Matched Filter (MTMF), and band ratios were applied to the near-infrared and shortwave-infrared bands of ASTER and Sentinel-2. Comparison of the results from these methods demonstrates that MEM can detect lithological units with very low false-detection rates and better agreement with ground-truth data. Moreover, this study indicates that results produced by the MEM algorithm on Sentinel-2 MSI data are more accurate than those produced with ASTER data over the same area. 
Therefore, the MEM algorithm seems well suited for image classification of multispectral datasets such as ASTER and Sentinel-2 imagery.","PeriodicalId":48843,"journal":{"name":"Canadian Journal of Remote Sensing","volume":null,"pages":null},"PeriodicalIF":2.6,"publicationDate":"2021-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"43563285","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
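Of the methods compared in this abstract, the Spectral Angle Mapper (SAM) has a particularly compact formulation: a pixel spectrum is assigned to the reference spectrum that makes the smallest angle with it, which makes the classification insensitive to overall brightness. A sketch with invented four-band spectra (illustrating SAM generally, not the study's implementation):

```python
import math

def spectral_angle(x, y):
    """Angle in radians between two spectra, treated as vectors.
    Scaling either spectrum by a constant leaves the angle unchanged."""
    dot = sum(a * b for a, b in zip(x, y))
    nx = math.sqrt(sum(a * a for a in x))
    ny = math.sqrt(sum(b * b for b in y))
    # Clamp to [-1, 1] to guard against floating-point drift.
    return math.acos(max(-1.0, min(1.0, dot / (nx * ny))))

# Invented reference spectra (4 bands each) for two rock units:
references = {
    "granite": [0.30, 0.35, 0.40, 0.42],
    "schist":  [0.15, 0.18, 0.30, 0.38],
}
pixel = [0.60, 0.70, 0.80, 0.84]  # granite-like shape, but twice as bright

best = min(references, key=lambda k: spectral_angle(pixel, references[k]))
print("classified as:", best)
```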
{"title":"Observations from C-Band SAR Fully Polarimetric Parameters of Mobile Sea Ice Based on Radar Scattering Mechanisms to Support Operational Sea Ice Monitoring","authors":"M. Shokr, M. Dabboor, Mélanie Lacelle, Tom Zagon, B. Deschamps","doi":"10.1080/07038992.2021.2003701","DOIUrl":"https://doi.org/10.1080/07038992.2021.2003701","url":null,"abstract":"Abstract Fully polarimetric (FP) SAR systems offer parameters that describe and quantify the scattering mechanisms of the surface cover. These are usually obtained by decomposing matrices derived from the scattering matrix observed at each pixel. The power of each scattering mechanism has potential for retrieving sea-ice information that cannot be derived from the traditional backscatter (magnitude or phase) measured by single- or dual-polarization SAR systems. This study investigates the potential of selected FP parameters representing the power of three scattering mechanisms, in addition to the total power, for identifying ice types and surface features for operational use. Parameters were obtained from a set of 62 RADARSAT-2 quad-pol scenes over Resolute Passage, central Arctic, acquired during September–December 2017. A scattering-based color-composite scheme was developed, and analysis of the resulting color images was supported by regional ice charts and SAR image interpretations from the Canadian Ice Service. Case studies demonstrate the potential of the proposed color-composite tool: open water, new ice, multi-year ice, and several surface features, including rafted, ridged, and smooth/rough surfaces, can be identified more readily in the color images. 
Physical interpretation of the relative power from the given scattering mechanisms is explained for the relevant ice types and surfaces.","PeriodicalId":48843,"journal":{"name":"Canadian Journal of Remote Sensing","volume":null,"pages":null},"PeriodicalIF":2.6,"publicationDate":"2021-12-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42077336","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
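Scattering-mechanism powers of the kind described above are commonly obtained from a Pauli-type decomposition of the quad-pol scattering matrix. The abstract does not name the specific decomposition used, so the sketch below is illustrative only, with invented complex scattering values:

```python
# Pauli decomposition of a single-look quad-pol scattering matrix:
# splits the total power (span) into surface-like (odd-bounce),
# double-bounce-like (even-bounce), and volume-like (cross-pol) parts.

def pauli_powers(s_hh, s_hv, s_vv):
    """Return (surface, double_bounce, volume) powers; their sum is the span."""
    p_surface = abs(s_hh + s_vv) ** 2 / 2.0   # odd-bounce scattering
    p_double  = abs(s_hh - s_vv) ** 2 / 2.0   # even-bounce scattering
    p_volume  = 2.0 * abs(s_hv) ** 2          # cross-pol (volume) scattering
    return p_surface, p_double, p_volume

# Invented values for a pixel dominated by surface scattering
# (e.g., smooth new ice):
s_hh, s_hv, s_vv = 0.4 + 0.1j, 0.02 + 0.01j, 0.38 + 0.12j
ps, pd, pv = pauli_powers(s_hh, s_hv, s_vv)
span = abs(s_hh) ** 2 + 2 * abs(s_hv) ** 2 + abs(s_vv) ** 2
print(f"surface={ps:.4f} double={pd:.4f} volume={pv:.4f} span={span:.4f}")
```

Assigning the three powers to red, green, and blue channels yields a scattering-based color composite of the kind the study develops.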