Dual-Output Mode Analysis of Multimode Laguerre-Gaussian Beams via Deep Learning

CAS Region 3 (Physics and Astrophysics) · JCR Q1 (Materials Science)
Xudong Yuan, Yaguang Xu, Ruizhi Zhao, Xuhao Hong, Ronger Lu, Xia Feng, Yongchuang Chen, Ji Zou, Chao Zhang, Yiqiang Qin, Yong Zhu
{"title":"基于深度学习的多模拉盖尔-高斯光束双输出模式分析","authors":"Xudong Yuan, Yaguang Xu, Ruizhi Zhao, Xuhao Hong, Ronger Lu, Xia Feng, Yongchuang Chen, Ji Zou, Chao Zhang, Yiqiang Qin, Yong Zhu","doi":"10.3390/OPT2020009","DOIUrl":null,"url":null,"abstract":"The Laguerre-Gaussian (LG) beam demonstrates great potential for optical communication due to its orthogonality between different eigenstates, and has gained increased research interest in recent years. Here, we propose a dual-output mode analysis method based on deep learning that can accurately obtain both the mode weight and phase information of multimode LG beams. We reconstruct the LG beams based on the result predicted by the convolutional neural network. It shows that the correlation coefficient values after reconstruction are above 0.9999, and the mean absolute error (MAE) of the mode weights and phases are about 1.4 × 10−3 and 2.9 × 10−3, respectively. The model still maintains relatively accurate prediction for the associated unknown data set and the noise-disturbed samples. In addition, the computation time of the model for a single test sample takes only 0.975 ms on average. These results show that our method has good abilities of generalization and robustness and allows for nearly real-time modal analysis.","PeriodicalId":54548,"journal":{"name":"Progress in Optics","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2021-05-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Dual-Output Mode Analysis of Multimode Laguerre-Gaussian Beams via Deep Learning\",\"authors\":\"Xudong Yuan, Yaguang Xu, Ruizhi Zhao, Xuhao Hong, Ronger Lu, Xia Feng, Yongchuang Chen, Ji Zou, Chao Zhang, Yiqiang Qin, Yong Zhu\",\"doi\":\"10.3390/OPT2020009\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The Laguerre-Gaussian (LG) beam demonstrates great potential for optical communication due to its orthogonality between different eigenstates, and has gained increased research interest in recent years. Here, we propose a dual-output mode analysis method based on deep learning that can accurately obtain both the mode weight and phase information of multimode LG beams. We reconstruct the LG beams based on the result predicted by the convolutional neural network. It shows that the correlation coefficient values after reconstruction are above 0.9999, and the mean absolute error (MAE) of the mode weights and phases are about 1.4 × 10−3 and 2.9 × 10−3, respectively. The model still maintains relatively accurate prediction for the associated unknown data set and the noise-disturbed samples. In addition, the computation time of the model for a single test sample takes only 0.975 ms on average. 
These results show that our method has good abilities of generalization and robustness and allows for nearly real-time modal analysis.\",\"PeriodicalId\":54548,\"journal\":{\"name\":\"Progress in Optics\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-05-24\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Progress in Optics\",\"FirstCategoryId\":\"101\",\"ListUrlMain\":\"https://doi.org/10.3390/OPT2020009\",\"RegionNum\":3,\"RegionCategory\":\"物理与天体物理\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"Materials Science\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Progress in Optics","FirstCategoryId":"101","ListUrlMain":"https://doi.org/10.3390/OPT2020009","RegionNum":3,"RegionCategory":"物理与天体物理","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Materials Science","Score":null,"Total":0}
Citations: 2

Abstract

The Laguerre-Gaussian (LG) beam demonstrates great potential for optical communication due to the orthogonality between its eigenstates, and has attracted increasing research interest in recent years. Here, we propose a dual-output mode analysis method based on deep learning that can accurately obtain both the mode weights and the phase information of multimode LG beams. We reconstruct the LG beams from the results predicted by the convolutional neural network. The correlation coefficients after reconstruction are above 0.9999, and the mean absolute errors (MAE) of the mode weights and phases are about 1.4 × 10⁻³ and 2.9 × 10⁻³, respectively. The model maintains relatively accurate predictions on a related unknown data set and on noise-disturbed samples. In addition, the computation time for a single test sample is only 0.975 ms on average. These results show that our method generalizes well, is robust, and allows for nearly real-time modal analysis.
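For context, a multimode LG beam is a coherent superposition of LG eigenmodes, each carrying a weight and a relative phase, and the network's input is the intensity pattern of that superposition. The sketch below synthesizes such a pattern at the beam waist; the mode set, grid size, and waist are illustrative assumptions, not the paper's settings.

```python
# Minimal sketch: synthesize a multimode LG intensity image from mode
# weights and relative phases (assumed 3-mode basis, 64x64 grid, waist w0=1).
import numpy as np
from math import factorial
from scipy.special import genlaguerre

def lg_mode(p, l, X, Y, w0=1.0):
    """Normalized LG_p^l field at the beam waist (z = 0)."""
    r2 = X**2 + Y**2
    phi = np.arctan2(Y, X)
    norm = np.sqrt(2.0 * factorial(p) / (np.pi * factorial(p + abs(l)))) / w0
    radial = (np.sqrt(2.0 * r2) / w0) ** abs(l) * genlaguerre(p, abs(l))(2.0 * r2 / w0**2)
    return norm * radial * np.exp(-r2 / w0**2) * np.exp(1j * l * phi)

def multimode_intensity(modes, weights, phases, n=64, extent=3.0):
    """Intensity |sum_k sqrt(w_k) exp(i th_k) LG_k|^2 on an n x n grid."""
    x = np.linspace(-extent, extent, n)
    X, Y = np.meshgrid(x, x)
    field = sum(np.sqrt(w) * np.exp(1j * th) * lg_mode(p, l, X, Y)
                for (p, l), w, th in zip(modes, weights, phases))
    return np.abs(field) ** 2

modes = [(0, 0), (0, 1), (0, 2)]      # (p, l) indices, assumed for illustration
weights = np.array([0.5, 0.3, 0.2])   # mode weights, normalized to sum to 1
phases = np.array([0.0, 0.8, 2.1])    # relative phases in radians
img = multimode_intensity(modes, weights, phases)
```

Training pairs for a supervised model then follow by sampling random weights and phases and rendering the corresponding intensity image.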
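The abstract identifies the predictor only as a convolutional neural network with two outputs. A minimal sketch of such a dual-output architecture, assuming a 3-mode basis and 64 × 64 intensity images (all layer sizes are hypothetical), could look like this:

```python
# Minimal sketch of a dual-output CNN in PyTorch: a shared convolutional
# trunk feeding two regression heads, one for mode weights, one for phases.
import torch
import torch.nn as nn

class DualOutputNet(nn.Module):
    def __init__(self, n_modes=3):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
            nn.Flatten(),
            nn.Linear(64 * 4 * 4, 128), nn.ReLU(),
        )
        # Weights head: softmax keeps predictions non-negative, summing to 1.
        self.weight_head = nn.Sequential(nn.Linear(128, n_modes), nn.Softmax(dim=1))
        # Phase head: one output per mode relative to the first, since a
        # global phase leaves the intensity image unchanged.
        self.phase_head = nn.Linear(128, n_modes - 1)

    def forward(self, x):
        h = self.trunk(x)
        return self.weight_head(h), self.phase_head(h)

net = DualOutputNet(n_modes=3)
w_pred, th_pred = net(torch.randn(8, 1, 64, 64))  # a batch of intensity images
# Training would minimize, e.g., the sum of MSE losses on the two heads.
```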
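The two reported figures of merit are simple to reproduce: the Pearson correlation coefficient between the reconstructed and reference intensity images, and the MAE over the raw weight and phase values. The self-contained sketch below uses placeholder numbers, not the paper's data; in practice the images would come from reconstructing the beam with the predicted weights and phases, as in the synthesis sketch above.

```python
# Minimal sketch of the evaluation metrics: 2-D correlation coefficient
# between intensity images and MAE over weights/phases (placeholder data).
import numpy as np

def correlation_coefficient(a, b):
    """Pearson correlation between two intensity images, flattened."""
    a = a.ravel() - a.mean()
    b = b.ravel() - b.mean()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

def mae(pred, true):
    return float(np.mean(np.abs(pred - true)))

# Hypothetical ground truth vs. prediction (illustrative values only).
w_true = np.array([0.5, 0.3, 0.2]);  w_pred = np.array([0.501, 0.299, 0.200])
th_true = np.array([0.0, 0.8, 2.1]); th_pred = np.array([0.002, 0.801, 2.097])
print("MAE(weights):", mae(w_pred, w_true))    # paper reports about 1.4e-3
print("MAE(phases): ", mae(th_pred, th_true))  # paper reports about 2.9e-3

# Two nearly identical images give a correlation near 1, as in the paper.
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
rec = ref + 1e-4 * rng.standard_normal((64, 64))
print("corr:", correlation_coefficient(ref, rec))  # paper reports > 0.9999
```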
Source journal: Progress in Optics (Physics - Optics)
CiteScore: 4.50
Self-citation rate: 0.00%
Publications: 8