Relative Position Estimation Using Cross Filter for Rolling-Shutter Cameras

IF 2.4 · CAS Region 4 (Engineering & Technology) · JCR Q3 (Engineering, Electrical & Electronic)
Daisuke Sekimoto; Koji Kamakura; Masayuki Kinoshita; Takaya Yamazato
DOI: 10.1109/JPHOT.2025.3608630
Journal: IEEE Photonics Journal, vol. 17, no. 5, pp. 1-9
Published: 2025-09-10 (Journal Article)
URL: https://ieeexplore.ieee.org/document/11155154/
PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=11155154
Citations: 0

Abstract

We propose a method for estimating the relative position of a light-emitting diode (LED) from a rolling-shutter (RS) camera fitted with a cross filter (CF). The CF is placed in front of the camera lens so that incident light is scattered into two straight lines extending at about $\pm \pi /4$ from the horizontal rows of the CMOS RS image sensor (IS). Using the geometry of a right-angled isosceles triangle, the method estimates the coordinates of the LED in the IS plane whenever the scattered light is captured at two points in a scanline. With the CF, the estimated coordinates of the LED are updated continuously, as long as the current row captures scattered light at two points; this holds row by row even when multiple LEDs are used or when they are moving. Without the CF, in contrast, no estimate is available until the scanline reaches the row that captures the whole LED, and the estimated coordinates are not updated again within that frame. Experiments verify that the method achieves an estimation error of less than five pixels and extends the region of a frame in which estimated coordinates are output, even in a moving environment, whereas the conventional method (without the CF) outputs estimated coordinates only once per frame.
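The per-scanline estimation described above can be sketched as follows. Assuming the CF produces two streaks at $\pm\pi/4$ through the LED center $(x_0, y_0)$, a scanline at row $y$ intersects them at two columns; because the streaks have slopes $\pm 1$, the two intersection points and the LED center form a right-angled isosceles triangle, so $x_0$ is their midpoint and the LED lies half their separation away in rows. The function name and the top-to-bottom scan convention (scanline read out before it reaches the LED) are illustrative assumptions, not the paper's implementation.

```python
def estimate_led_center(y_row, x1, x2):
    """Estimate the LED center (x0, y0) from one scanline.

    y_row  : index of the scanline (row) currently read out
    x1, x2 : columns where the two +/-45-degree streaks cross this row,
             assumed to be captured before the scan reaches the LED
    """
    x_lo, x_hi = sorted((x1, x2))
    half_base = (x_hi - x_lo) / 2.0  # equal legs of the isosceles triangle
    x0 = (x_lo + x_hi) / 2.0         # LED column is the midpoint
    y0 = y_row + half_base           # LED lies half_base rows below
    return x0, y0                    # (top-to-bottom scan assumed)

# Example: streaks cross row 100 at columns 40 and 60
print(estimate_led_center(100, 40, 60))  # -> (50.0, 110.0)
```

Because each new row that captures both streaks yields a fresh estimate, the coordinates can be refined row by row within a single frame, which is the behavior the experiments compare against the once-per-frame conventional method.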
Source journal

IEEE Photonics Journal (Engineering, Electrical & Electronic; Optics)
CiteScore: 4.50
Self-citation rate: 8.30%
Articles published: 489
Review time: 1.4 months
Journal description: Breakthroughs in the generation of light and in its control and utilization have given rise to the field of Photonics, a rapidly expanding area of science and technology with major technological and economic impact. Photonics integrates quantum electronics and optics to accelerate progress in the generation of novel photon sources and in their utilization in emerging applications at the micro and nano scales, spanning from the far-infrared/THz to the X-ray region of the electromagnetic spectrum. IEEE Photonics Journal is an online-only journal dedicated to the rapid disclosure of top-quality peer-reviewed research at the forefront of all areas of photonics. Contributions addressing issues ranging from fundamental understanding to emerging technologies and applications are within the scope of the Journal. The Journal includes topics in: photon sources from far infrared to X-rays; photonics materials and engineered photonic structures; integrated optics and optoelectronics; ultrafast, attosecond, high-field and short-wavelength photonics; biophotonics, including DNA photonics; nanophotonics; magnetophotonics; fundamentals of light propagation and interaction; nonlinear effects; optical data storage; fiber optics and optical communications devices, systems, and technologies; Micro Opto Electro Mechanical Systems (MOEMS); microwave photonics; optical sensors.