Hao Zhai, Xin Pan, You Yang, Jinyuan Jiang, Qing Li
{"title":"Two-Stage Focus Measurement Network with Joint Boundary Refinement for Multifocus Image Fusion","authors":"Hao Zhai, Xin Pan, You Yang, Jinyuan Jiang, Qing Li","doi":"10.1155/2023/4155948","DOIUrl":null,"url":null,"abstract":"Focus measurement, one of the key tasks in multifocus image fusion (MFIF) frameworks, identifies the clearer parts of multifocus images pairs. Most of the existing methods aim to achieve disposable pixel-level focus measurement. However, the lack of sufficient accuracy often gives rise to misjudgments in the results. To this end, a novel two-stage focus measurement with joint boundary refinement network is proposed for MFIF. In this work, we adopt a coarse-to-fine strategy to gradually achieve block-level and pixel-level focus measurement for producing more fine-grained focus probability maps, instead of directly predicting at the pixel level. In addition, the joint boundary refinement optimizes the performance on the focused/defocused boundary component (FDB) during the focus measurement. To improve feature extraction capability, both CNN and transformer are employed to, respectively, encode local patterns and capture long-range dependencies. Then, the features from two input branches are legitimately aggregated by modeling the spatial complementary relationship in each pair of multifocus images. 
Extensive experiments demonstrate that the proposed model achieves state-of-the-art performance in both subjective perception and objective assessment.","PeriodicalId":14089,"journal":{"name":"International Journal of Intelligent Systems","volume":"2023 1","pages":"1-16"},"PeriodicalIF":5.0000,"publicationDate":"2023-08-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Intelligent Systems","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1155/2023/4155948","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Focus measurement, one of the key tasks in multifocus image fusion (MFIF) frameworks, identifies the clearer regions in pairs of multifocus images. Most existing methods attempt to perform focus measurement at the pixel level in a single pass, but insufficient accuracy often leads to misjudgments in the results. To this end, a novel two-stage focus measurement network with joint boundary refinement is proposed for MFIF. In this work, we adopt a coarse-to-fine strategy that performs block-level and then pixel-level focus measurement to produce finer-grained focus probability maps, instead of predicting directly at the pixel level. In addition, the joint boundary refinement optimizes performance on the focused/defocused boundary (FDB) region during focus measurement. To improve feature extraction capability, a CNN and a transformer are employed to encode local patterns and capture long-range dependencies, respectively. The features from the two input branches are then aggregated by modeling the spatial complementary relationship within each pair of multifocus images. Extensive experiments demonstrate that the proposed model achieves state-of-the-art performance in both subjective perception and objective assessment.
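The abstract does not reproduce the network's architecture or weights, so the learned two-stage pipeline cannot be shown here. As a purely conceptual, non-learned illustration of the coarse-to-fine idea (a block-level focus decision first, with pixel-level refinement only on ambiguous blocks near the focused/defocused boundary), a classical fusion can be sketched in NumPy. The function names and the Laplacian-style focus measure below are illustrative assumptions, not the paper's method:

```python
import numpy as np

def focus_measure(img):
    """Laplacian-style sharpness proxy: sum of absolute second
    differences along rows and columns (a classical hand-crafted
    focus measure; the paper instead learns this with a network)."""
    fm = np.zeros(img.shape, dtype=float)
    fm[1:-1, :] += np.abs(2 * img[1:-1, :] - img[:-2, :] - img[2:, :])
    fm[:, 1:-1] += np.abs(2 * img[:, 1:-1] - img[:, :-2] - img[:, 2:])
    return fm

def coarse_to_fine_fuse(a, b, block=8):
    """Stage 1: block-level focus decision. Stage 2: pixel-level
    refinement, applied only to blocks where the per-pixel votes
    are mixed, i.e. blocks straddling the focused/defocused boundary."""
    fa, fb = focus_measure(a.astype(float)), focus_measure(b.astype(float))
    h, w = a.shape
    mask = np.zeros((h, w), dtype=bool)  # True -> take pixel from `a`
    for i in range(0, h, block):
        for j in range(0, w, block):
            vote = fa[i:i+block, j:j+block] > fb[i:i+block, j:j+block]
            frac = vote.mean()
            if frac > 0.9:        # block confidently focused in `a`
                mask[i:i+block, j:j+block] = True
            elif frac < 0.1:      # block confidently focused in `b`
                mask[i:i+block, j:j+block] = False
            else:                 # boundary block: per-pixel decision
                mask[i:i+block, j:j+block] = vote
    return np.where(mask, a, b), mask
```

The 0.9/0.1 thresholds deciding which blocks fall back to the pixel level are arbitrary choices for this sketch; the paper's second stage and boundary refinement are learned rather than thresholded.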
About the Journal:
The International Journal of Intelligent Systems serves as a forum for researchers interested in the theory and construction of intelligent systems. With its peer-reviewed format, the journal publishes articles by today's experts in the field. Because new developments are introduced every day, there is much to be learned: examination, analysis, creation, information retrieval, human–computer interaction, and more. The International Journal of Intelligent Systems uses charts and illustrations to present these ground-breaking topics and encourages readers to share their thoughts and experiences.