{"title":"Reference-Based Iterative Interaction With P2-Matching for Stereo Image Super-Resolution","authors":"Runmin Cong;Rongxin Liao;Feng Li;Ronghui Sheng;Huihui Bai;Renjie Wan;Sam Kwong;Wei Zhang","doi":"10.1109/TIP.2025.3577538","DOIUrl":null,"url":null,"abstract":"Stereo Image Super-Resolution (SSR) holds great promise in improving the quality of stereo images by exploiting the complementary information between left and right views. Most SSR methods primarily focus on the inter-view correspondences in low-resolution (LR) space. The potential of referencing a high-quality SR image of one view benefits the SR for the other is often overlooked, while those with abundant textures contribute to accurate correspondences. Therefore, we propose Reference-based Iterative Interaction (RIISSR), which utilizes reference-based iterative pixel-wise and patch-wise matching, dubbed <inline-formula> <tex-math>$P^{2}$ </tex-math></inline-formula>-Matching, to establish cross-view and cross-resolution correspondences for SSR. Specifically, we first design the information perception block (IPB) cascaded in parallel to extract hierarchical contextualized features for different views. Pixel-wise matching is embedded between two parallel IPBs to exploit cross-view interaction in LR space. Iterative patch-wise matching is then executed by utilizing the SR stereo pair as another mutual reference, capitalizing on the cross-scale patch recurrence property to learn high-resolution (HR) correspondences for SSR performance. Moreover, we introduce the supervised side-out modulator (SSOM) to re-weight local intra-view features and produce intermediate SR images, which seamlessly bridge two matching mechanisms. Experimental results demonstrate the superiority of RIISSR against existing state-of-the-art methods.","PeriodicalId":94032,"journal":{"name":"IEEE transactions on image processing : a publication of the IEEE Signal Processing Society","volume":"34 ","pages":"3779-3789"},"PeriodicalIF":0.0000,"publicationDate":"2025-06-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on image processing : a publication of the IEEE Signal Processing Society","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/11033676/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Stereo Image Super-Resolution (SSR) holds great promise for improving the quality of stereo images by exploiting the complementary information between the left and right views. Most SSR methods focus primarily on inter-view correspondences in the low-resolution (LR) space. The potential of referencing a high-quality SR image of one view to benefit the SR of the other is often overlooked, even though such reference images, with their abundant textures, contribute to accurate correspondences. Therefore, we propose Reference-based Iterative Interaction (RIISSR), which uses reference-based iterative pixel-wise and patch-wise matching, dubbed $P^{2}$-Matching, to establish cross-view and cross-resolution correspondences for SSR. Specifically, we first design the information perception block (IPB), cascaded in parallel, to extract hierarchical contextualized features for the different views. Pixel-wise matching is embedded between two parallel IPBs to exploit cross-view interaction in the LR space. Iterative patch-wise matching is then executed using the SR stereo pair as a mutual reference, capitalizing on the cross-scale patch-recurrence property to learn high-resolution (HR) correspondences that improve SSR performance. Moreover, we introduce the supervised side-out modulator (SSOM), which re-weights local intra-view features and produces intermediate SR images, seamlessly bridging the two matching mechanisms. Experimental results demonstrate the superiority of RIISSR over existing state-of-the-art methods.
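To make the two matching mechanisms concrete, here is a minimal PyTorch-style sketch of pixel-wise cross-view matching for rectified stereo features, in the spirit of parallax attention: each left-view pixel attends over all right-view pixels in the same row (its epipolar line). The paper's code is not reproduced here; the class name, projections, and residual fusion below are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class PixelWiseMatching(nn.Module):
    """Cross-view pixel-wise matching along horizontal epipolar lines.

    Illustrative parallax-attention-style sketch; assumptions only,
    not the published RIISSR implementation.
    """

    def __init__(self, channels: int):
        super().__init__()
        self.query = nn.Conv2d(channels, channels, kernel_size=1)
        self.key = nn.Conv2d(channels, channels, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, feat_left: torch.Tensor, feat_right: torch.Tensor) -> torch.Tensor:
        b, c, h, w = feat_left.shape
        q = self.query(feat_left).permute(0, 2, 3, 1)     # (B, H, W, C)
        k = self.key(feat_right).permute(0, 2, 1, 3)      # (B, H, C, W)
        v = self.value(feat_right).permute(0, 2, 3, 1)    # (B, H, W, C)
        # Row-wise attention: for each left pixel, scores over right columns.
        attn = torch.softmax((q @ k) / c ** 0.5, dim=-1)  # (B, H, W, W)
        warped = (attn @ v).permute(0, 3, 1, 2)           # right features aligned to left
        return feat_left + warped                         # residual cross-view fusion
```

For the cross-resolution stage, the sketch below illustrates patch-wise matching driven by the cross-scale patch-recurrence property: each LR patch finds its most similar patch in a downsampled copy of the other view's SR output, then the corresponding HR patch is transferred back onto the query grid. Again a simplified assumption (single round, hard argmax retrieval, hypothetical names), not the paper's design.

```python
import torch
import torch.nn.functional as F

def patch_wise_matching(feat_lr, ref_sr, patch=3, scale=4):
    """Cross-scale patch-wise matching against an SR reference view (sketch)."""
    b, c, h, w = feat_lr.shape
    # Keys: reference patches brought down to the LR scale.
    ref_lr = F.interpolate(ref_sr, size=(h, w), mode='bicubic', align_corners=False)
    q = F.unfold(feat_lr, patch, padding=patch // 2)      # (B, C*p*p, H*W)
    k = F.unfold(ref_lr, patch, padding=patch // 2)       # (B, C*p*p, H*W)
    sim = torch.bmm(F.normalize(q, dim=1).transpose(1, 2),
                    F.normalize(k, dim=1))                # (B, H*W, H*W)
    idx = sim.argmax(dim=-1)                              # best reference patch per query
    # Values: the matched patches at HR scale, one per LR grid position.
    v = F.unfold(ref_sr, patch * scale, stride=scale,
                 padding=(patch // 2) * scale)            # (B, C*(p*s)^2, H*W)
    v = v.gather(2, idx.unsqueeze(1).expand(-1, v.size(1), -1))
    out = F.fold(v, (h * scale, w * scale), patch * scale,
                 stride=scale, padding=(patch // 2) * scale)
    # fold sums overlapping patches; divide by the overlap count.
    div = F.fold(F.unfold(torch.ones_like(out), patch * scale, stride=scale,
                          padding=(patch // 2) * scale),
                 (h * scale, w * scale), patch * scale,
                 stride=scale, padding=(patch // 2) * scale)
    return out / div
```

Per the abstract, RIISSR iterates this interaction: intermediate SR images produced by the SSOM serve as the references for the next patch-wise round, whereas each sketch above shows only a single round of one mechanism.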