{"title":"Polarization-driven camouflaged object detection: a multimodal fusion network with iterative polarimetric feature enhancement.","authors":"Xiangyue Zhang, Jingyu Ru, Yihang Wang, Chengdong Wu","doi":"10.1364/AO.570214","DOIUrl":null,"url":null,"abstract":"<p><p>The performance degradation of camouflaged object detection (COD) under complex backgrounds and dynamic illumination conditions has become a challenging issue in optical imaging and detection. To address the limitation of traditional visible-light imaging methods, which easily fail due to their inability to differentiate material and surface optical properties, a polarization-driven multimodal fusion network (PMFNet) is proposed in this paper. High-precision COD is achieved through iterative enhancement of polarization features. First, a feature rectification module is designed based on polarization differences induced by the surface scattering properties of objects. Second, a polarization-guided iterative refinement mechanism is developed, dynamically correcting texture degradation in RGB modality by employing high-resolution polarization features. Finally, a polarization adaptive fusion module is introduced to achieve context-aware complementary enhancement of RGB features through refined polarization information, thus deeply fusing complementary features of the two modalities. The proposed PMFNet demonstrates robust detection performance under adverse illumination and complex background conditions. Experimental results on public datasets demonstrate that the proposed PMFNet outperforms state-of-the-art COD methods.</p>","PeriodicalId":101299,"journal":{"name":"Applied optics","volume":"64 27","pages":"7899-7913"},"PeriodicalIF":0.0000,"publicationDate":"2025-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied optics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1364/AO.570214","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0
Abstract
The performance degradation of camouflaged object detection (COD) under complex backgrounds and dynamic illumination has become a challenging issue in optical imaging and detection. Traditional visible-light imaging methods are prone to failure because intensity alone cannot differentiate the material and surface optical properties of objects. To address this limitation, a polarization-driven multimodal fusion network (PMFNet) is proposed in this paper, in which high-precision COD is achieved through iterative enhancement of polarization features. First, a feature rectification module is designed based on the polarization differences induced by the surface scattering properties of objects. Second, a polarization-guided iterative refinement mechanism is developed, which dynamically corrects texture degradation in the RGB modality by employing high-resolution polarization features. Finally, a polarization adaptive fusion module is introduced to achieve context-aware complementary enhancement of RGB features through the refined polarization information, thereby deeply fusing the complementary features of the two modalities. The proposed PMFNet demonstrates robust detection performance under adverse illumination and complex background conditions. Experimental results on public datasets demonstrate that PMFNet outperforms state-of-the-art COD methods.
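
The abstract does not give implementation details, but the polarization-guided iterative refinement idea can be illustrated with a minimal PyTorch-style sketch. The module name, the gating design, and the choice of a fixed number of refinement iterations below are assumptions for illustration only; they are not taken from the paper.

```python
# Hypothetical sketch of polarization-guided iterative refinement:
# RGB features are repeatedly corrected with polarization features
# (e.g., derived from DoLP/AoLP maps) of matching spatial size.
import torch
import torch.nn as nn


class PolarizationGuidedRefinement(nn.Module):
    """Iteratively injects polarization cues into degraded RGB features (illustrative)."""

    def __init__(self, channels: int, num_iters: int = 3):
        super().__init__()
        self.num_iters = num_iters
        # Predicts a per-pixel gate from the concatenated modalities,
        # indicating where the RGB texture is likely degraded.
        self.gate = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1),
            nn.Sigmoid(),
        )
        # Projects polarization features into a residual correction term.
        self.correct = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, rgb_feat: torch.Tensor, pol_feat: torch.Tensor) -> torch.Tensor:
        x = rgb_feat
        for _ in range(self.num_iters):
            g = self.gate(torch.cat([x, pol_feat], dim=1))  # gate in [0, 1]
            x = x + g * self.correct(pol_feat)               # gated residual correction
        return x


if __name__ == "__main__":
    rgb = torch.randn(1, 64, 56, 56)   # dummy RGB feature map
    pol = torch.randn(1, 64, 56, 56)   # dummy polarization feature map
    refined = PolarizationGuidedRefinement(channels=64)(rgb, pol)
    print(refined.shape)  # torch.Size([1, 64, 56, 56])
```

A gated residual update of this kind is one common way to fuse a guiding modality into a primary feature stream; the paper's actual rectification and adaptive fusion modules may differ substantially.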