Enhancing dynamic target reconstruction and tracking based on ghost imaging and deep convolutional neural networks

IF 2.2 · CAS Tier 3 (Physics & Astronomy) · JCR Q2 (Optics)

DOI: 10.1016/j.optcom.2024.131224
Journal: Optics Communications
Published: 2024-10-23 (Journal Article)
Source: https://www.sciencedirect.com/science/article/pii/S0030401824009611
Citations: 0

Abstract

Ghost imaging requires a large amount of sampling data, which limits its applications in the study of dynamic objects. Here, we propose an imaging technique based on deep convolutional neural networks (SaDunet) that can be used to examine the dynamics of target objects. By replacing the traditional correlation imaging reconstruction approach with SaDunet, the ability to recover high-quality images at low sampling rates is enhanced. The motion process of the target object is decomposed into multiple motion frames, and then each frame is imaged separately. Experiments show that the reconstructed image of the target object obtained by this scheme is of high quality, contains almost no noise, and accurately reflects the motion behavior of the target object.
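The "traditional correlation imaging reconstruction" that SaDunet replaces is typically the second-order intensity correlation between the speckle illumination patterns and the bucket-detector signal. The sketch below simulates that baseline in NumPy; the image size, pattern count, and binary square target are illustrative assumptions, not values from the paper, and the deep-network stage itself is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)
H, W, M = 32, 32, 4000  # image size and number of speckle patterns (assumed values)

# Hypothetical binary target: a bright square on a dark background
obj = np.zeros((H, W))
obj[10:22, 10:22] = 1.0

# Random speckle illumination patterns S_i and bucket-detector signals
# I_i = sum over pixels of S_i * object transmittance
patterns = rng.random((M, H, W))
bucket = patterns.reshape(M, -1) @ obj.ravel()

# Second-order correlation reconstruction: G = <I*S> - <I><S>
G = (bucket[:, None, None] * patterns).mean(axis=0) \
    - bucket.mean() * patterns.mean(axis=0)
```

At low sampling rates (small M) this estimate becomes noisy, which is the regime where the paper's learned reconstruction is claimed to help; per-frame reconstruction of a moving target would repeat this step for each motion frame.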
Journal

Optics Communications (Physics – Optics)
CiteScore: 5.10
Self-citation rate: 8.30%
Articles published: 681
Review time: 38 days
Journal description: Optics Communications invites original and timely contributions containing new results in various fields of optics and photonics. The journal considers theoretical and experimental research in areas ranging from the fundamental properties of light to technological applications. Topics covered include classical and quantum optics, optical physics and light-matter interactions, lasers, imaging, guided-wave optics and optical information processing. Manuscripts should offer clear evidence of novelty and significance. Papers concentrating on mathematical and computational issues, with limited connection to optics, are not suitable for publication in the journal. Similarly, small technical advances, or papers concerned only with engineering applications or issues of materials science, fall outside the journal's scope.