BPGI: A Brain-Perception Guided Interactive Network for Stereoscopic Omnidirectional Image Quality Assessment

Yun Liu, Sifan Li, Zihan Liu, Haiyuan Wang, Daoxin Fan

IEEE Open Journal on Immersive Displays, vol. 2, pp. 81-88, published 2025-09-16. DOI: 10.1109/OJID.2025.3610449. Available at https://ieeexplore.ieee.org/document/11165215/
Abstract
Stereoscopic omnidirectional image quality assessment combines stereoscopic image quality assessment with omnidirectional image quality assessment, making it more challenging than assessing traditional three-dimensional images. Previous works fail to achieve satisfactory performance because they neglect the human brain's perception mechanism. To address this problem, we propose an effective brain-perception guided interactive network for stereoscopic omnidirectional image quality assessment (BPGI), built following three perception steps: visual information processing, feature fusion cognition, and quality evaluation. Considering the characteristics of stereoscopic perception, both binocular and monocular visual features are extracted. Following the brain's complex cognition mechanism, a Bi-LSTM module is introduced to mine the inherent relationship between the monocular and binocular visual features and to improve the feature representation ability of the proposed model. A visual feature fusion module is then built to achieve effective interactive fusion for quality prediction. Experimental results show that the proposed model outperforms many state-of-the-art models and can be effectively applied to predict the quality of stereoscopic omnidirectional images.
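To make the described pipeline concrete, below is a minimal, hypothetical PyTorch sketch of how the three perception steps (feature extraction, Bi-LSTM interaction, interactive fusion and quality regression) could be wired together. The abstract does not specify backbones, feature dimensions, or the fusion strategy, so every layer choice, size, and name here (BPGISketch, mono_encoder, bino_encoder, etc.) is an assumption for illustration, not the authors' implementation.

import torch
import torch.nn as nn

class BPGISketch(nn.Module):
    """Hypothetical sketch of the pipeline the abstract outlines:
    (1) monocular/binocular feature extraction, (2) a Bi-LSTM that models
    the relationship between the two feature types, (3) fused quality
    regression. All sizes and layer choices are assumptions."""

    def __init__(self, feat_dim=256, hidden_dim=128):
        super().__init__()
        # Step 1: placeholder per-view encoders (the paper's backbones are unspecified).
        self.mono_encoder = nn.Sequential(nn.LazyLinear(feat_dim), nn.ReLU())
        self.bino_encoder = nn.Sequential(nn.LazyLinear(feat_dim), nn.ReLU())
        # Step 2: a bidirectional LSTM over the length-2 [monocular, binocular]
        # feature sequence, so each feature type attends to the other in both directions.
        self.bilstm = nn.LSTM(feat_dim, hidden_dim, batch_first=True, bidirectional=True)
        # Step 3: interactive fusion followed by a scalar quality head.
        self.fusion = nn.Sequential(nn.Linear(2 * hidden_dim, hidden_dim), nn.ReLU())
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, mono_feat, bino_feat):
        # mono_feat, bino_feat: (batch, raw_dim) pre-pooled view features.
        m = self.mono_encoder(mono_feat)      # (B, feat_dim)
        b = self.bino_encoder(bino_feat)      # (B, feat_dim)
        seq = torch.stack([m, b], dim=1)      # (B, 2, feat_dim) length-2 sequence
        out, _ = self.bilstm(seq)             # (B, 2, 2*hidden_dim)
        fused = self.fusion(out.mean(dim=1))  # average the two interaction states
        return self.head(fused).squeeze(-1)   # (B,) predicted quality score

# Usage with random stand-in features:
model = BPGISketch()
score = model(torch.randn(4, 512), torch.randn(4, 512))
print(score.shape)  # torch.Size([4])

Treating the monocular and binocular features as a two-element sequence is one plausible way to let a Bi-LSTM "interact" the two streams; the actual BPGI module may order, pool, or fuse them differently.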