{"title":"基于球面谐波和邻域视图积分的潮滩环境视图合成","authors":"Huilin Ge;Zhiyu Zhu;Biao Wang;Runbang Liu;Denghao Yang;Zhiwen Qiu","doi":"10.23919/cje.2024.00.158","DOIUrl":null,"url":null,"abstract":"We present a novel view synthesis method that introduces radial field representation of density and tidal flat appearance in neural rendering. Our method aims to generate realistic images from new viewpoints by using continuous scene information generated from different sampling points along a set of identical rays. This approach significantly improves rendering quality and reduces blurring and aliasing artifacts compared to existing techniques such as Nerfacto. Our model employs the spherical harmonic function to efficiently encode viewpoint orientation information and integrates image features from neighboring viewpoints for enhanced fusion. This results in an accurate and detailed reconstruction of the scene's geometry and appearance. We evaluate our approach on publicly available datasets containing a variety of indoor and outdoor scenes, as well as on customized tidal flats datasets. The results show that our algorithm outperforms Nerfacto in terms of PSNR (peak signal-to-noise ratio), SSIM (structural similarity index measure), and LPIPS (learned perceptual image patch similarity) metrics, demonstrating superior performance in both complex and simple environments. This study emphasizes the potential of our approach in advancing view synthesis techniques and provides a powerful tool for environmental research and conservation efforts in dynamic ecosystems such as mudflats. Future work will focus on further optimizations and extensions to improve the efficiency and quality of the rendering process.","PeriodicalId":50701,"journal":{"name":"Chinese Journal of Electronics","volume":"34 3","pages":"861-870"},"PeriodicalIF":1.6000,"publicationDate":"2025-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=11060017","citationCount":"0","resultStr":"{\"title\":\"View Synthesis in Tidal Flat Environments with Spherical Harmonics and Neighboring Views Integration\",\"authors\":\"Huilin Ge;Zhiyu Zhu;Biao Wang;Runbang Liu;Denghao Yang;Zhiwen Qiu\",\"doi\":\"10.23919/cje.2024.00.158\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We present a novel view synthesis method that introduces radial field representation of density and tidal flat appearance in neural rendering. Our method aims to generate realistic images from new viewpoints by using continuous scene information generated from different sampling points along a set of identical rays. This approach significantly improves rendering quality and reduces blurring and aliasing artifacts compared to existing techniques such as Nerfacto. Our model employs the spherical harmonic function to efficiently encode viewpoint orientation information and integrates image features from neighboring viewpoints for enhanced fusion. This results in an accurate and detailed reconstruction of the scene's geometry and appearance. We evaluate our approach on publicly available datasets containing a variety of indoor and outdoor scenes, as well as on customized tidal flats datasets. The results show that our algorithm outperforms Nerfacto in terms of PSNR (peak signal-to-noise ratio), SSIM (structural similarity index measure), and LPIPS (learned perceptual image patch similarity) metrics, demonstrating superior performance in both complex and simple environments. 
This study emphasizes the potential of our approach in advancing view synthesis techniques and provides a powerful tool for environmental research and conservation efforts in dynamic ecosystems such as mudflats. Future work will focus on further optimizations and extensions to improve the efficiency and quality of the rendering process.\",\"PeriodicalId\":50701,\"journal\":{\"name\":\"Chinese Journal of Electronics\",\"volume\":\"34 3\",\"pages\":\"861-870\"},\"PeriodicalIF\":1.6000,\"publicationDate\":\"2025-03-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=11060017\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Chinese Journal of Electronics\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/11060017/\",\"RegionNum\":4,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Chinese Journal of Electronics","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/11060017/","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
View Synthesis in Tidal Flat Environments with Spherical Harmonics and Neighboring Views Integration
We present a novel view synthesis method that introduces a radiance field representation of density and appearance for tidal flat scenes in neural rendering. Our method generates realistic images from new viewpoints by aggregating continuous scene information sampled at different points along each ray. Compared with existing techniques such as Nerfacto, this approach significantly improves rendering quality and reduces blurring and aliasing artifacts. Our model uses spherical harmonics to efficiently encode view-direction information and integrates image features from neighboring viewpoints for enhanced fusion, yielding an accurate and detailed reconstruction of the scene's geometry and appearance. We evaluate our approach on publicly available datasets covering a variety of indoor and outdoor scenes, as well as on custom tidal flat datasets. The results show that our algorithm outperforms Nerfacto on the PSNR (peak signal-to-noise ratio), SSIM (structural similarity index measure), and LPIPS (learned perceptual image patch similarity) metrics, demonstrating superior performance in both complex and simple environments. This study highlights the potential of our approach for advancing view synthesis techniques and provides a powerful tool for environmental research and conservation efforts in dynamic ecosystems such as mudflats. Future work will focus on further optimizations and extensions to improve the efficiency and quality of the rendering process.
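The abstract rests on two standard neural-rendering ingredients: encoding the viewing direction with spherical harmonics, and compositing density and color samples along each ray into a pixel color. The sketch below illustrates both in plain NumPy under assumptions of ours (real spherical harmonics up to degree 2, and random per-sample densities and colors standing in for network outputs); it is a minimal illustration of the general technique, not the authors' implementation.

# Minimal NumPy sketch of two ingredients described in the abstract:
# (1) encoding a viewing direction with real spherical harmonics (degree <= 2), and
# (2) compositing per-sample densities and colors along a ray by volume rendering.
# Illustrative only; the network mapping positions/encodings to density and color is stubbed out.

import numpy as np

def sh_encode(direction: np.ndarray) -> np.ndarray:
    """Return the 9 real spherical-harmonic basis values (degrees 0-2)
    evaluated at a unit viewing direction (x, y, z)."""
    x, y, z = direction / np.linalg.norm(direction)
    return np.array([
        0.282095,                       # l=0
        0.488603 * y,                   # l=1, m=-1
        0.488603 * z,                   # l=1, m=0
        0.488603 * x,                   # l=1, m=+1
        1.092548 * x * y,               # l=2, m=-2
        1.092548 * y * z,               # l=2, m=-1
        0.315392 * (3.0 * z**2 - 1.0),  # l=2, m=0
        1.092548 * x * z,               # l=2, m=+1
        0.546274 * (x**2 - y**2),       # l=2, m=+2
    ])

def volume_render(sigmas: np.ndarray, colors: np.ndarray, deltas: np.ndarray) -> np.ndarray:
    """Alpha-composite per-sample densities and colors along one ray.

    sigmas: (N,) non-negative densities at the N samples
    colors: (N, 3) RGB values at the N samples
    deltas: (N,) distances between consecutive samples
    """
    alphas = 1.0 - np.exp(-sigmas * deltas)                          # opacity per sample
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas]))[:-1]   # transmittance T_i
    weights = trans * alphas
    return (weights[:, None] * colors).sum(axis=0)                   # rendered pixel color

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    view_dir = np.array([0.3, -0.2, 0.93])
    sh_feat = sh_encode(view_dir)            # 9-dim view-direction encoding

    # Stub: pretend a network predicted these from positions + sh_feat.
    n_samples = 64
    sigmas = rng.uniform(0.0, 2.0, n_samples)
    colors = rng.uniform(0.0, 1.0, (n_samples, 3))
    deltas = np.full(n_samples, 4.0 / n_samples)
    print("SH encoding:", sh_feat.round(3))
    print("Rendered RGB:", volume_render(sigmas, colors, deltas).round(3))

In a full pipeline, the spherical-harmonic features would be fed, together with positional encodings and image features gathered from neighboring views, into the network that predicts the per-sample densities and colors consumed by volume_render.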
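The evaluation metrics named in the abstract (PSNR, SSIM, LPIPS) are commonly computed with off-the-shelf libraries; one possible recipe is sketched below using scikit-image and the lpips package. The library choices and the helper evaluate_pair are assumptions made for illustration; the abstract does not state which tooling the authors used.

# Hedged sketch of computing PSNR / SSIM / LPIPS between a rendered view and
# the ground-truth photograph. Requires: pip install scikit-image lpips torch

import numpy as np
import torch
import lpips
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def evaluate_pair(pred: np.ndarray, gt: np.ndarray) -> dict:
    """pred, gt: float32 HxWx3 images with values in [0, 1]."""
    psnr = peak_signal_noise_ratio(gt, pred, data_range=1.0)
    # channel_axis=-1 treats the last axis as color channels (scikit-image >= 0.19).
    ssim = structural_similarity(gt, pred, data_range=1.0, channel_axis=-1)

    # LPIPS expects NCHW tensors scaled to [-1, 1].
    to_tensor = lambda im: torch.from_numpy(im).permute(2, 0, 1)[None] * 2.0 - 1.0
    lpips_model = lpips.LPIPS(net="alex")
    lp = lpips_model(to_tensor(pred), to_tensor(gt)).item()
    return {"psnr": psnr, "ssim": ssim, "lpips": lp}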
Journal introduction:
CJE focuses on emerging fields of electronics and publishes innovative and transformative research papers. Most papers published in CJE come from universities and research institutes and present original research results. Both theoretical and practical contributions are encouraged, and original research papers reporting novel solutions to hot topics in electronics are particularly welcome.