{"title":"基于跨尺度信息融合的定向目标检测","authors":"Chen Li, Tongzhou Zhao, Chengbo Mao, Wei Hu","doi":"10.1109/AICIT55386.2022.9930256","DOIUrl":null,"url":null,"abstract":"Due to the enormous size discrepancies between classes and within classes, as well as the high degree of resemblance across classes of oriented objects, traditional remote sensing object detection was made difficult. Although coping with huge scale differences and high inter-class similarity was made possible by multi-scale information fusion, the multiscale weight fusion technique neglected the impact of cross-scale on picture semantic feature extraction, leading to subpar detection performance. The performance of the delayed inference was caused by the rotating region proposal network, which produced high-quality ideas while expanding the network’s capacity. In this study, a cross-scale shift oriented object detection method was suggested. First, by creating a feature pyramid network, the multi-layer feature maps were successfully fused. First, the multi-layer feature maps were effectively fused by reconstructing a feature pyramid network. A cross-scale shift module was simultaneously introduced to FPN to enhance the correlation between multi-scale properties. Finally, to raise the quality of the bounding boxes produced, an oriented region proposal network (ORPN) was used. On remote sensing datasets from DOTA-V1.5, the proposed method fared better than the control group in terms of detection accuracy.","PeriodicalId":231070,"journal":{"name":"2022 International Conference on Artificial Intelligence and Computer Information Technology (AICIT)","volume":"94 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Oriented object detection based on cross-scale information fusion\",\"authors\":\"Chen Li, Tongzhou Zhao, Chengbo Mao, Wei Hu\",\"doi\":\"10.1109/AICIT55386.2022.9930256\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Due to the enormous size discrepancies between classes and within classes, as well as the high degree of resemblance across classes of oriented objects, traditional remote sensing object detection was made difficult. Although coping with huge scale differences and high inter-class similarity was made possible by multi-scale information fusion, the multiscale weight fusion technique neglected the impact of cross-scale on picture semantic feature extraction, leading to subpar detection performance. The performance of the delayed inference was caused by the rotating region proposal network, which produced high-quality ideas while expanding the network’s capacity. In this study, a cross-scale shift oriented object detection method was suggested. First, by creating a feature pyramid network, the multi-layer feature maps were successfully fused. First, the multi-layer feature maps were effectively fused by reconstructing a feature pyramid network. A cross-scale shift module was simultaneously introduced to FPN to enhance the correlation between multi-scale properties. Finally, to raise the quality of the bounding boxes produced, an oriented region proposal network (ORPN) was used. 
Oriented object detection based on cross-scale information fusion
Oriented objects in remote sensing images show enormous scale differences both between and within classes, together with a high degree of inter-class similarity, which makes traditional remote sensing object detection difficult. Multi-scale information fusion can cope with large scale differences and high inter-class similarity, but existing multi-scale weight-fusion techniques neglect the influence of cross-scale interaction on image semantic feature extraction, leading to subpar detection performance. In addition, the rotated region proposal network produces high-quality proposals but enlarges the network, which delays inference. This study proposes a cross-scale shift oriented object detection method. First, the multi-layer feature maps are effectively fused by reconstructing a feature pyramid network (FPN). A cross-scale shift module is then introduced into the FPN to strengthen the correlation between multi-scale features. Finally, an oriented region proposal network (ORPN) is used to raise the quality of the generated bounding boxes. On the DOTA-V1.5 remote sensing dataset, the proposed method achieves higher detection accuracy than the comparison methods.
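The abstract leaves the internals of the cross-scale shift module unspecified. As a rough illustration of the idea, the PyTorch sketch below exchanges a fraction of channels between adjacent pyramid levels and re-mixes them with a 1x1 convolution; the class name CrossScaleShift, the shift ratio, and the nearest-neighbour resize are illustrative assumptions rather than the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CrossScaleShift(nn.Module):
    """Exchange a fraction of channels between adjacent FPN levels.

    Hypothetical sketch only: the shift ratio and the resize-then-swap
    scheme are assumptions, not the authors' published design.
    """

    def __init__(self, channels: int, shift_ratio: float = 0.25):
        super().__init__()
        self.k = int(channels * shift_ratio)      # channels exchanged per level
        self.fuse = nn.Conv2d(channels, channels, kernel_size=1)  # re-mix after the swap

    def forward(self, feats):
        # feats: FPN maps [P3, P4, P5, ...], fine to coarse, equal channel counts.
        out = []
        for i, f in enumerate(feats):
            own = f[:, self.k:]                   # keep this level's remaining channels
            if i + 1 < len(feats):
                # borrow k channels from the next coarser level, upsampled to match
                borrowed = F.interpolate(feats[i + 1][:, :self.k],
                                         size=f.shape[-2:], mode="nearest")
            else:
                borrowed = f[:, :self.k]          # top level has no coarser neighbour
            out.append(self.fuse(torch.cat([own, borrowed], dim=1)))
        return out


if __name__ == "__main__":
    pyramid = [torch.randn(1, 256, s, s) for s in (64, 32, 16)]
    fused = CrossScaleShift(channels=256)(pyramid)
    print([t.shape for t in fused])
```

In a full detector, a module of this kind would sit between the reconstructed FPN outputs and the oriented region proposal network described in the abstract.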