Remote Sensing Application in Water Quality of Lake Burdur, Türkiye
Aylin Tuzcu Kokal, Meltem Kaçıkoç, N. Musaoğlu, Aysegul Tanik
Photogrammetric Engineering & Remote Sensing, 2024-02-01. DOI: 10.14358/pers.23-00040r2

Abstract: The advancements in space technology have facilitated water quality (WQ) monitoring of lake conditions at a spatial resolution of 10 m by freely accessible Sentinel-2 images. The main aim of this article was to elucidate the necessity of spatiotemporal WQ monitoring of the shrinking Lake Burdur in Türkiye by examining the relation between field and satellite data with a state-of-the-art machine learning-based regression algorithm. This study focuses on the detection of algal blooms and the WQ parameters chlorophyll-a (Chl-a) and suspended solids (SS). Furthermore, this study leverages the geographic position of Lake Burdur, located at the overlap of two Sentinel-2 frames, which enables the acquisition of satellite images at a temporal resolution of 2–3 days. The findings enrich the understanding of the lake's dynamic structure by rapidly monitoring the occurrence of algal blooms. High accuracies were achieved for Chl-a (R-squared: 0.93) and SS (R-squared: 0.94) detection.
The Sight-Aesthetic Value of the Underwater Landscapes of Lakes in the Context of Exploration Tourism
P. Dynowski, A. Źróbek-Sokolnik, Marta Czaplicka, Adam Senetra
Photogrammetric Engineering & Remote Sensing, 2024-02-01. DOI: 10.14358/pers.23-00054r2

Abstract: The aim of the study is to identify factors affecting the sight-aesthetic value of the underwater landscapes of lakes for the purposes of exploration tourism. The reason for undertaking this topic is the lack of such studies for inland water bodies. The results will contribute to expanding and supplementing the knowledge on the assessment of the sight-aesthetic attractiveness of landscapes and fill gaps in knowledge about the underwater landscapes of lakes. The questionnaire survey implemented the direct comparison method described by Kendall (Kendall, M. G. 1970. Rank Correlation Methods. Charles Griffin and Co: Glasgow, Scotland). According to respondents, animals and submerged anthropogenic elements are the most visually attractive features of an aquatic environment. The results obtained motivate further research and the development of a methodology for assessing the sight-aesthetic value of inland bodies of water based on the experience of terrestrial landscape researchers.
{"title":"Introduction to Pointcloudmetry by Mathias Lemmens","authors":"Toby M. Terpstra","doi":"10.14358/pers.90.2.81","DOIUrl":"https://doi.org/10.14358/pers.90.2.81","url":null,"abstract":"","PeriodicalId":211256,"journal":{"name":"Photogrammetric Engineering & Remote Sensing","volume":"42 12","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139872408","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Dual-branch Branch Networks Based on Contrastive Learning for Long-Tailed Remote Sensing","authors":"Lei Zhang, Lijia Peng, Pengfei Xia, Chuyuan Wei, Chengwei Yang, Yanyan Zhang","doi":"10.14358/pers.23-00055r2","DOIUrl":"https://doi.org/10.14358/pers.23-00055r2","url":null,"abstract":"Deep learning has been widely used in remote sensing image classification and achieves many excellent results. These methods are all based on relatively balanced data sets. However, in real-world scenarios, many data sets belong to the long-tailed distribution, resulting in poor performance. In view of the good performance of contrastive learning in long-tailed image classification, a new dual-branch fusion learning classification model is proposed to fuse the discriminative features of remote sensing images with spatial data, making full use of valuable image representation information in imbalance data. This paper also presents a hybrid loss, which solves the problem of poor discrimination of extracted features caused by large intra-class variation and inter-class ambiguity. Extended experiments on three long-tailed remote sensing image classification data sets demonstrate the advantages of the proposed dual-branch model based on contrastive learning in long-tailed image classification.","PeriodicalId":211256,"journal":{"name":"Photogrammetric Engineering & Remote Sensing","volume":"37 5","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139125679","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"GIS Tips & Tricks Slivers be Gone!","authors":"Savannah Carter, Al Karlin","doi":"10.14358/pers.90.1.5","DOIUrl":"https://doi.org/10.14358/pers.90.1.5","url":null,"abstract":"One of the most annoying aspects of building large polygon datasets by heads-up digitizing occurs when there are small overlaps and/or gaps where the polygons meet. Edge-matching to eliminate slivers between digitized polygons can be a laborious and tedious task. These \"slivers\", especially voids, can be very difficult to detect by visual means, so the GIS workflow to resolve these issues generally involves building topology, constructing a ruleset, and running advanced GIS tools; a heady operation for a beginning GIS analyst and particularly cumbersome when tracking a few slivers. This month's GIS Tip demonstrates a quick and effective workflow to avoid the build topology route.","PeriodicalId":211256,"journal":{"name":"Photogrammetric Engineering & Remote Sensing","volume":"14 9","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139128213","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Terrain Complexity and Maximal Poisson-Disk Sampling-Based Digital Elevation Model Simplification","authors":"Jingxian Dong, Fan Ming, Twaha Kabika, Jiayao Jiang, Siyuan Zhang, Aliaksandr Chervan, Zhukouskaya Natallia, Wenguang Hou","doi":"10.14358/pers.23-00023r2","DOIUrl":"https://doi.org/10.14358/pers.23-00023r2","url":null,"abstract":"With the rapid development of lidar, the accuracy and density of the Digital Elevation Model (DEM) point clouds have been continuously improved. However, in some applications, dense point cloud has no practical meaning. How to effectively sample from the dense points and maximize the preservation of terrain features is extremely important. This paper will propose a DEM sampling algorithm that utilizes terrain complexity and maximal Poisson-disk sampling to extract key feature points for adaptive DEM sampling. The algorithm estimates terrain complexity based on local terrain variation and prioritizes points with high complexity for sampling. The sampling radius is inversely proportional to terrain complexity, while ensuring that points within the radius of accepted samples are not considered new samples. This way makes more points of concern in the rugged regions. The results show that the proposed algorithm has higher global accuracy than the classic six sampling methods.","PeriodicalId":211256,"journal":{"name":"Photogrammetric Engineering & Remote Sensing","volume":"45 8","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139125246","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Development of Soil-Suppressed Impervious Surface Area Index for Automatic Urban Mapping","authors":"Akib Javed, Zhenfeng Shao, Iffat Ara, Muhammad Nasar Ahmad, Enamul Huq, Nayyer Saleem, Fazlul Karim","doi":"10.14358/pers.23-00043r2","DOIUrl":"https://doi.org/10.14358/pers.23-00043r2","url":null,"abstract":"Expanding urban impervious surface area (ISA) mapping is crucial to sustainable development, urban planning, and environmental studies. Multispectral ISA mapping is challenging because of the mixed-pixel problems with bare soil. This study presents a novel approach using spectral and temporal information to develop a Soil-Suppressed Impervious Surface Area Index (SISAI) using the Landsat Operational Land Imager (OLI) data set, which reduces the soil but enhances the ISA signature. This study mapped the top 12 populated megacities using SISAI and achieved an over-all accuracy of 0.87 with an F1-score of 0.85. It also achieved a higher Spatial Dissimilarity Index between the ISA and bare soil. However, it is limited by bare gray soil and shadows of clouds and hills. SISAI encourages urban dynamics and inter-urban compari- son studies owing to its automatic and unsupervised methodology.","PeriodicalId":211256,"journal":{"name":"Photogrammetric Engineering & Remote Sensing","volume":"14 3","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139125784","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"I2-FaçadeNet: An Illumination-invariant Façade Recognition Network Leveraging Sparsely Gated Mixture of Multi-color Space Experts for Aerial Oblique Imagery","authors":"Shengzhi Huang, Han Hu, Qing Zhu","doi":"10.14358/pers.23-00033r2","DOIUrl":"https://doi.org/10.14358/pers.23-00033r2","url":null,"abstract":"Façade image recognition under complex illumination conditions is crucial for various applications, including urban three-dimensional modeling and building identification. Existing methods relying solely on Red-Green-Blue (RGB) images are prone to texture ambiguity in complex illumination environments. Furthermore, façades display varying orientations and camera viewing angles, resulting in performance issues within the RGB color space. In this study, we introduce an illumination-invariant façade recognition network (I2-FaçadeNet) that leverages sparsely gated multi-color space experts for enhanced façade image recognition in challenging illumination environments. First, RGB façade images are converted into multi-color spaces to eliminate the ambiguous texture in complex illumination. Second, we train expert networks using separate channels of multi-color spaces. Finally, a sparsely gated mechanism is introduced to manage the expert networks, enabling dynamic activation of expert networks and the merging of results. Experimental evaluations leveraging both the International Society for Photogrammetry and Remote Sensing benchmark data sets and the Shenzhen data sets reveal that our proposed I2 -FaçadeNet surpasses various depths of ResNet in façade recognition under complex illumination conditions. Specifically, the classification accuracy for poorly illuminated façades in Zurich improves by nearly 8%, while the accuracy for over-illuminated areas in Shenzhen increases by approximately 3%. Moreover, ablation studies conducted on façade images with complex illumination indicate that compared to traditional RGB-based ResNet, the proposed network achieves an accuracy improvement of 3% to 4% up to 100% for overexposed images and an accuracy improvement of 3% to 10% for underexposed images.","PeriodicalId":211256,"journal":{"name":"Photogrammetric Engineering & Remote Sensing","volume":"4 3","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139127550","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Comparison of 3D Point Cloud Completion Networks for High Altitude Lidar Scans of Buildings","authors":"M. Kulawiak","doi":"10.14358/pers.23-00056r2","DOIUrl":"https://doi.org/10.14358/pers.23-00056r2","url":null,"abstract":"High altitude lidar scans allow for rapid acquisition of big spatial data representing entire city blocks. Unfortunately, the raw point clouds acquired by this method are largely incomplete due to object occlusions and restrictions in scanning angles and sensor resolution, which can negatively affect the obtained results. In recent years, many new solutions for 3D point cloud completion have been created and tested on various objects; however, the application of these methods to high-altitude lidar point clouds of buildings has not been properly investigated yet. In the above context, this paper presents the results of applying several state-of-the-art point cloud completion networks to various building exteriors acquired by simulated airborne laser scanning. Moreover, the output point clouds generated from partial data are compared with complete ground-truth point clouds. The performed tests show that the SeedFormer network trained on the ShapeNet-55 data set provides promising shape completion results.","PeriodicalId":211256,"journal":{"name":"Photogrammetric Engineering & Remote Sensing","volume":"18 19","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139126359","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Rice Identification Under Complex Surface Conditions with CNN and Integrated Remote Sensing Spectral-Temporal-Spatial Features
Tianjiao Liu, Sibo Duan, Jiankui Chen, Li Zhang, Dong Li, Xuqing Li
Photogrammetric Engineering & Remote Sensing, 2023-12-01. DOI: 10.14358/pers.23-00036r2

Abstract: Accurate and effective rice identification is of great significance for sustainable agricultural management and food security. This paper proposes an accurate rice identification method that addresses the confusion between fragmented rice fields and their surroundings in complex surface areas. The spectral, temporal, and spatial features extracted from the constructed Sentinel-2 time series were integrated and jointly displayed as visual images, and a convolutional neural network model embedded with the integrated information was established to further mine the key information that distinguishes rice from other types. The results showed that the overall accuracy, precision, recall, and F1-score of the proposed method for rice identification reached 99.4%, 99.5%, 99.5%, and 99.5%, respectively, achieving a better performance than the support vector machine classifier. Therefore, the proposed method can effectively reduce the confusion between rice and other types and accurately extract rice distribution information under complex surface conditions.