BSANet: A Bilateral Segregation and Aggregation Network for Real-time Cloud Segmentation

Yijie Li, Hewei Wang, Shaofan Wang, Jinfeng Xu, Yee Hui Lee, Soumyabrata Dev

Remote Sensing Applications: Society and Environment, Volume 38, April 2025, Article 101536. DOI: 10.1016/j.rsase.2025.101536. Available at: https://www.sciencedirect.com/science/article/pii/S2352938525000898
Segmenting clouds from intensity images is an essential research topic at the intersection of atmospheric science and computer vision, and it plays a vital role in weather forecasting and climate evolution analysis. Ground-based sky/cloud image segmentation extracts the cloud regions from the original image so that their shape and additional features can be analyzed. With the development of deep learning, neural-network-based cloud segmentation models achieve increasingly better performance. In this paper, we introduce a novel sky/cloud segmentation network named the Bilateral Segregation and Aggregation Network (BSANet). Its large configuration has 4.29 million parameters and, benefiting from our designed BSAM, achieves almost the same performance as the state-of-the-art method. BSAM uses the rough segmentation map from the previous stage to produce two new weighted feature maps representing the sky and cloud features, which are then processed separately by two network branches. After deployment via TensorRT, BSANet-large reaches 392 fps in FP16, while BSANet-lite, with only 90K parameters, reaches 1390 fps; both exceed real-time requirements. Additionally, we propose a novel and efficient pre-training strategy for sky/cloud segmentation that improves segmentation performance when ImageNet pre-training is not available. In the spirit of reproducible research, the model code, dataset, and experimental results are available at: https://github.com/Att100/BSANet-cloudseg.
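The segregate-then-aggregate mechanism described in the abstract can be illustrated with a minimal PyTorch sketch. Everything below is an assumption based only on the abstract's wording (a coarse segmentation map weights a shared feature map into separate cloud and sky streams, which two branches refine before fusion); the module name, layer choices, and channel sizes are hypothetical and do not reproduce the authors' released implementation, which is available in the linked repository.

```python
import torch
import torch.nn as nn


class BilateralSegregationAggregation(nn.Module):
    """Illustrative sketch (not the authors' code) of segregate-then-aggregate.

    A coarse sky/cloud probability map from the previous stage weights a shared
    feature map into a "cloud" stream and a "sky" stream; each stream is refined
    by its own lightweight branch, and the two results are fused into a refined
    prediction.
    """

    def __init__(self, channels: int = 64):
        super().__init__()
        # Hypothetical per-stream refinement branches.
        self.cloud_branch = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        self.sky_branch = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        # Aggregation: fuse the two refined streams into one logit map.
        self.fuse = nn.Conv2d(2 * channels, 1, kernel_size=1)

    def forward(self, feats: torch.Tensor, coarse_logits: torch.Tensor) -> torch.Tensor:
        # feats: (N, C, H, W) shared features; coarse_logits: (N, 1, H, W)
        # rough segmentation map from the previous stage.
        cloud_prob = torch.sigmoid(coarse_logits)
        # Segregation: weight the shared features into cloud / sky streams.
        cloud_feats = self.cloud_branch(feats * cloud_prob)
        sky_feats = self.sky_branch(feats * (1.0 - cloud_prob))
        # Aggregation: concatenate both streams and fuse to a refined map.
        return self.fuse(torch.cat([cloud_feats, sky_feats], dim=1))
```

Weighting the shared features by the coarse cloud probability and its complement is one straightforward reading of "bilateral segregation"; the 1x1 fusion convolution here simply stands in for whatever aggregation step the full BSAM performs.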
Journal introduction:
The journal "Remote Sensing Applications: Society and Environment" (RSASE) focuses on remote sensing studies that address specific topics with an emphasis on environmental and societal issues: regional/local studies with global significance. Subjects are encouraged to take an interdisciplinary approach and include, but are not limited to:
- Global and climate change studies addressing the impact of increasing concentrations of greenhouse gases, CO2 emissions, carbon balance and carbon mitigation, and energy systems on social and environmental systems
- Ecological and environmental issues including biodiversity, ecosystem dynamics, land degradation, atmospheric and water pollution, urban footprint, ecosystem management and natural hazards (e.g. earthquakes, typhoons, floods, landslides)
- Natural resource studies including land use in general, biomass estimation, forests, agricultural land, plantations, soils, coral reefs, wetlands and water resources
- Agriculture, food production systems and food security outcomes
- Socio-economic issues including urban systems, urban growth, public health, epidemics, land-use transition and land-use conflicts
- Oceanography and coastal zone studies, including sea level rise projections, coastline changes and the ocean-land interface
- Regional challenges for remote sensing application techniques, monitoring and analysis, such as cloud screening and atmospheric correction for tropical regions
- Interdisciplinary studies combining remote sensing, household survey data, field measurements and models to address environmental, societal and sustainability issues
- Quantitative and qualitative analysis that documents the impact of using remote sensing studies in social, political, environmental or economic systems