2D phase unwrapping by attention-guided deep neural network for optical interferometry

IF 4.6 · CAS Zone 2, Physics and Astronomy · Q1 OPTICS
Youxing Li, Lingzhi Meng, Donghui Wang, Jiahao Zhang, Libo Yuan
Journal: Optics and Laser Technology, Volume 191, Article 113358
DOI: 10.1016/j.optlastec.2025.113358
Published: 2025-06-09 (Journal Article)
Full text: https://www.sciencedirect.com/science/article/pii/S0030399225009491
Citations: 0

Abstract

Two-dimensional phase unwrapping is an essential process in interferometry applications since the unwrapped phase plays a guiding role in the morphological reconstruction of the object. However, the unwrapped phase at each position is affected by the wrap states for other positions. Therefore, it is essential to obtain global context information for better phase unwrapping. To this purpose, we propose a novel attention-guided deep neural network that introduces the self-attention mechanism, a popular deep learning technique, to tackle the problem in a highly efficient way. We design a dual attention phase unwrapping network (DAPUN), which uses two kinds of complementary self-attention structures to obtain global context information (such as wrapped states), allowing each positional feature to be directly compared with ones at any other positions. To evaluate the effectiveness of the proposed DAPUN, we simulate extensive data to train the networks and compare it with several existing phase unwrapping methods. The results show that DAPUN significantly outperforms the previous state-of-the-art. After that, we apply the trained DAPUN to the real cases to demonstrate its generalization capability.
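To make the problem the abstract describes concrete, here is a minimal sketch (using NumPy and a classical local unwrapping rule, not the paper's DAPUN model) of how a smooth phase is measured only modulo 2π and then recovered. The ramp signal is an illustrative assumption, not data from the paper.

```python
import numpy as np

# Simulated smooth "true" phase (a ramp), which an interferometer can
# only observe modulo 2*pi.
x = np.linspace(0, 6 * np.pi, 200)
true_phase = x                                # monotonically increasing
wrapped = np.angle(np.exp(1j * true_phase))   # wrapped into (-pi, pi]

# Classical 1D unwrapping: add a multiple of 2*pi wherever the jump
# between neighbouring samples exceeds pi. This local rule is what
# learned, globally-informed methods aim to improve on for noisy 2D data.
unwrapped = np.unwrap(wrapped)

# Up to a constant 2*pi*k offset, the ramp is recovered.
print(np.allclose(unwrapped - unwrapped[0], true_phase - true_phase[0]))
```

The local rule fails when noise or undersampling makes a genuine phase step exceed π, which is why the unwrapped value at one pixel depends on the wrap states of many others, the global-context problem the paper targets.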
Source journal: Optics and Laser Technology
CiteScore: 8.50
Self-citation rate: 10.00%
Articles per year: 1060
Review time: 3.4 months
Aims and scope: Optics & Laser Technology aims to provide a vehicle for the publication of a broad range of high-quality research and review papers in those fields of scientific and engineering research appertaining to the development and application of the technology of optics and lasers. Papers describing original work in these areas are submitted to rigorous refereeing prior to acceptance for publication. The scope of Optics & Laser Technology encompasses, but is not restricted to, the following areas:
•developments in all types of lasers
•developments in optoelectronic devices and photonics
•developments in new photonics and optical concepts
•developments in conventional optics, optical instruments and components
•techniques of optical metrology, including interferometry and optical fibre sensors
•LIDAR and other non-contact optical measurement techniques, including optical methods in heat and fluid flow
•applications of lasers to materials processing, optical NDT, display (including holography) and optical communication
•research and development in the field of laser safety, including studies of hazards resulting from the applications of lasers (laser safety, hazards of laser fume)
•developments in optical computing and optical information processing
•developments in new optical materials
•developments in new optical characterization methods and techniques
•developments in quantum optics
•developments in light-assisted micro- and nanofabrication methods and techniques
•developments in nanophotonics and biophotonics
•developments in image processing and systems