Optical and Polarimetric SAR Data Fusion Terrain Classification Using Probabilistic Feature Fusion

R. West, D. Yocky, Brian J. Redman, J. D. Laan, Dylan Z. Anderson

IGARSS 2020 - 2020 IEEE International Geoscience and Remote Sensing Symposium, published 2020-09-26. DOI: 10.1109/IGARSS39084.2020.9324022
Deciding on an imaging modality for terrain classification can be a challenging problem. A given sensing modality may discriminate some terrain classes well, yet perform poorly on other classes that a different sensor can easily separate. The most effective terrain classification will therefore exploit the abilities of multiple sensing modalities. The challenge in using multiple sensing modalities is determining how to combine their information in a meaningful and useful way. In this paper, we introduce a framework for effectively combining data from optical and polarimetric synthetic aperture radar sensing modalities. We demonstrate the fusion framework on two vegetation classes and two ground classes, and show that fusing data from both imaging modalities has the potential to improve terrain classification over either modality alone.
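The abstract does not spell out the fusion rule itself. As an illustrative sketch only, the idea of combining per-modality class evidence can be shown with a common probabilistic baseline, the normalized product (naive-Bayes) rule, applied to hypothetical per-pixel posteriors from an optical and a PolSAR classifier; the numbers and the rule here are assumptions for illustration, not the paper's method or data.

```python
import numpy as np

# Hypothetical per-pixel class posteriors from two single-modality
# classifiers over four classes (e.g., two vegetation, two ground,
# mirroring the paper's experiment). Values are illustrative only.
p_optical = np.array([0.50, 0.30, 0.10, 0.10])  # optical weakly favors class 0
p_polsar  = np.array([0.45, 0.05, 0.45, 0.05])  # PolSAR rules out classes 1 and 3

def product_fusion(p_a, p_b):
    """Fuse two posterior vectors with the product rule, then
    renormalize so the result is a valid probability distribution."""
    fused = p_a * p_b
    return fused / fused.sum()

p_fused = product_fusion(p_optical, p_polsar)
print(p_fused.argmax())  # fused decision: class 0
```

Either sensor alone is ambivalent between two classes; the product rule lets each modality veto the classes it confidently rejects, which is one way fusion can outperform a single modality.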