{"title":"Measurement of Beijing’s economic development by nighttime light using Suomi-NPP VIIRS DNB data","authors":"Yu Zhang, Qiu Shi, C. Gao, Qian Yonggang, Chuanrong Li","doi":"10.1117/12.2599921","DOIUrl":"https://doi.org/10.1117/12.2599921","url":null,"abstract":"Satellite images of the Earth at night, acquired through continuous remote sensing, provide the most complete view of contemporary global human settlement, especially cities. Beijing, the capital of China, is one of its most important and representative large cities. This study estimated Beijing’s economic activity using corrected nighttime light data from the Suomi-NPP VIIRS Day/Night Band (DNB), with a focus on the relationship between economic indices and city lights. The corrections aim to eliminate the influence of cloud, moonlight, and the atmosphere on nighttime artificial light sources, in order to achieve more accurate retrieval of ground-level artificial lighting. The results showed a strong linear relationship between the corrected DNB nighttime data and GDP, with a fitting coefficient of 0.79405, higher than the coefficient of 0.2817 obtained between the average-radiance composite images and GDP. Likewise, the fitting coefficient between the tertiary industry and the corrected DNB nighttime data is 0.76102, higher than the 0.1836 obtained with the average-radiance composite images. The approach therefore supports dynamic evaluation of social and economic data, and the developed urban light fusion product lays a foundation for downstream derivative applications and for the inversion and application of nighttime light data in other locations.","PeriodicalId":103787,"journal":{"name":"Remote Sensing Technologies and Applications in Urban Environments VI","volume":"96 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134444718","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
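The GDP regression described in the abstract above (a linear fit between summed nighttime radiance and an economic index, scored by a fitting coefficient) can be sketched with an ordinary least-squares fit. The data values below are hypothetical placeholders, not the paper's measurements, and the function name is illustrative.

```python
import numpy as np

def fit_radiance_gdp(radiance_sum, gdp):
    """OLS fit GDP ~ a * radiance + b; returns slope, intercept, and R^2."""
    x = np.asarray(radiance_sum, dtype=float)
    y = np.asarray(gdp, dtype=float)
    a, b = np.polyfit(x, y, 1)          # degree-1 polynomial = linear fit
    pred = a * x + b
    ss_res = np.sum((y - pred) ** 2)    # residual sum of squares
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot          # coefficient of determination
    return a, b, r2

# Hypothetical yearly totals: summed corrected-DNB radiance vs. a GDP index.
radiance = [120.0, 135.0, 150.0, 170.0, 185.0]
gdp = [2.10, 2.35, 2.55, 2.90, 3.10]
a, b, r2 = fit_radiance_gdp(radiance, gdp)
```

A higher R^2 for corrected DNB data than for the raw average-radiance composites is what the abstract reports; the sketch only shows the fitting step itself.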
{"title":"Applying super resolution to low resolution images for monitoring transmission lines","authors":"Tomonori Yamamoto, Yu Zhao, Sonoko Kimura, Taminori Tomita, Shinji Matsuda, Norihiko Moriwaki","doi":"10.1117/12.2599724","DOIUrl":"https://doi.org/10.1117/12.2599724","url":null,"abstract":"Monitoring transmission lines is important for the electricity transmission and distribution (T&D) business, and low-cost satellite imagery has the potential to replace helicopter surveillance. Sentinel-2 imagery is one of the best-known satellite image sources and is completely free of charge; however, its spatial resolution is lower than that of high-cost satellite imagery such as PlanetScope or WorldView-3. In this research, we explored the effectiveness of super resolution. Refining the spatial resolution from 10 m/pixel to 3.3 m/pixel (x3 SR) appeared extremely useful for trigonometric risk assessment, which leverages the number of pixels between the transmission line and vegetation together with tree height information at the vegetation pixels. We employed the deep-learning-based super-resolution model RDN (Residual Dense Network) to upsample the Sentinel-2 images. The training data were generated from PlanetScope imagery, whose resolution is 3.7 m/pixel. Deep-learning-based super resolution is generally effective for obtaining 2-4 times finer resolution; therefore, PlanetScope imagery is suitable for training the RDN model for x3 super resolution. We evaluated vegetation segmentation performance with and without super resolution in the areas along the transmission line. The experimental results showed that imagery with super resolution outperformed imagery without super resolution by 9.3% in weighted F1-score.","PeriodicalId":103787,"journal":{"name":"Remote Sensing Technologies and Applications in Urban Environments VI","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133104257","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
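The weighted F1-score comparison described in the abstract above (segmentation with vs. without super resolution) can be sketched with a small scoring routine: per-class F1 averaged with class-support weights. The toy vegetation masks below are illustrative assumptions, not the paper's data.

```python
import numpy as np

def weighted_f1(y_true, y_pred):
    """Weighted F1: per-class F1 averaged with class-support weights."""
    y_true = np.asarray(y_true).ravel()
    y_pred = np.asarray(y_pred).ravel()
    classes, support = np.unique(y_true, return_counts=True)
    score = 0.0
    for c, n in zip(classes, support):
        tp = np.sum((y_pred == c) & (y_true == c))
        fp = np.sum((y_pred == c) & (y_true != c))
        fn = np.sum((y_pred != c) & (y_true == c))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        score += (n / y_true.size) * f1   # weight by class support
    return score

# Toy vegetation masks (1 = vegetation, 0 = background).
truth   = np.array([[1, 1, 0, 0], [1, 0, 0, 0]])
with_sr = np.array([[1, 1, 0, 0], [1, 0, 0, 1]])  # one false positive
no_sr   = np.array([[1, 0, 0, 0], [0, 0, 0, 1]])  # misses two vegetation pixels
```

On these toy masks the with-SR prediction scores higher, mirroring the direction of the paper's reported 9.3% gain; the actual evaluation in the paper uses real segmentation outputs along the transmission line.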
{"title":"Optical remote sensing for urban flood applications: Canadian case studies","authors":"Ying Zhang","doi":"10.1117/12.2599630","DOIUrl":"https://doi.org/10.1117/12.2599630","url":null,"abstract":"","PeriodicalId":103787,"journal":{"name":"Remote Sensing Technologies and Applications in Urban Environments VI","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133675348","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}