Fast Registration Method for Large-Field-Of-View Nailfold Video Images Based on Improved Projection Analysis.

Impact Factor: 2.3
Peiqing Guo, Hao Yin, Yanxiong Wu, Bin Zhou, Jiaxiong Luo, Qianyao Ye, Shou Feng, Qirui Sun, Hongjun Zhou, Fanxin Zeng
{"title":"Fast Registration Method for Large-Field-Of-View Nailfold Video Images Based on Improved Projection Analysis.","authors":"Peiqing Guo, Hao Yin, Yanxiong Wu, Bin Zhou, Jiaxiong Luo, Qianyao Ye, Shou Feng, Qirui Sun, Hongjun Zhou, Fanxin Zeng","doi":"10.1002/jbio.70052","DOIUrl":null,"url":null,"abstract":"<p><p>In nailfold video recordings, the micro-shaking of the hand is amplified and interferes with physician observations and parameter measurement. We developed a fast and accurate registration method for large-field-of-view nailfold video images. Nailfold videos are first represented in the YCrCb color space, with the Cb spatial component replacing the original grayscale image to reduce sensitivity to illumination. The projection variance of each row/column is employed to improve registration accuracy and processing speed. The method was compared with Origin GrayDrop, feature point matching, unsupervised learning, and Adobe Premiere Pro in terms of the peak signal-to-noise ratio, structural similarity index, and mean squared error. The peak signal-to-noise ratio and structural similarity index are enhanced, and the mean squared error is reduced compared to the original projection method. Moreover, the proposed method is faster than the comparison methods and provides the best combination of registration accuracy and fast processing for nailfold video image registration.</p>","PeriodicalId":94068,"journal":{"name":"Journal of biophotonics","volume":" ","pages":"e70052"},"PeriodicalIF":2.3000,"publicationDate":"2025-04-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of biophotonics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1002/jbio.70052","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

In nailfold video recordings, the micro-shaking of the hand is amplified and interferes with physician observations and parameter measurement. We developed a fast and accurate registration method for large-field-of-view nailfold video images. Nailfold videos are first represented in the YCrCb color space, with the Cb spatial component replacing the original grayscale image to reduce sensitivity to illumination. The projection variance of each row/column is employed to improve registration accuracy and processing speed. The method was compared with Origin GrayDrop, feature point matching, unsupervised learning, and Adobe Premiere Pro in terms of the peak signal-to-noise ratio, structural similarity index, and mean squared error. The peak signal-to-noise ratio and structural similarity index are enhanced, and the mean squared error is reduced compared to the original projection method. Moreover, the proposed method is faster than the comparison methods and provides the best combination of registration accuracy and fast processing for nailfold video image registration.
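The paper does not include code, so the sketch below only illustrates the general idea described in the abstract: each frame is converted to YCrCb, the Cb channel stands in for the grayscale image, and row/column projection profiles are matched to estimate the inter-frame shift. The OpenCV/NumPy calls are real, but the function names (`cb_channel`, `projection_shift`, `register_to_reference`), the use of per-row/column variance as the projection profile, and the brute-force 1D shift search are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of projection-based frame registration on the Cb channel.
# Assumes OpenCV (cv2) and NumPy; function names and the variance-profile
# interpretation of "projection variance" are illustrative assumptions.
import cv2
import numpy as np

def cb_channel(frame_bgr):
    """Convert a BGR frame to YCrCb and return the Cb channel as float32."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    return ycrcb[:, :, 2].astype(np.float32)  # OpenCV channel order is Y, Cr, Cb

def projection_shift(ref_profile, cur_profile, max_shift=30):
    """Estimate the 1D shift that best aligns two projection profiles."""
    ref = ref_profile - ref_profile.mean()
    cur = cur_profile - cur_profile.mean()
    best_shift, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        score = float(np.dot(ref, np.roll(cur, s)))
        if score > best_score:
            best_score, best_shift = score, s
    return best_shift

def register_to_reference(ref_bgr, cur_bgr, max_shift=30):
    """Return (dy, dx) translating cur_bgr onto ref_bgr via row/column projections."""
    ref_cb, cur_cb = cb_channel(ref_bgr), cb_channel(cur_bgr)
    # Per-row and per-column variance profiles serve here as a simple stand-in
    # for the paper's projection-variance analysis.
    dy = projection_shift(ref_cb.var(axis=1), cur_cb.var(axis=1), max_shift)
    dx = projection_shift(ref_cb.var(axis=0), cur_cb.var(axis=0), max_shift)
    return dy, dx
```

In use, the estimated (dy, dx) for each frame would be applied (e.g., with np.roll or cv2.warpAffine) to stabilize the video against a reference frame; since the authors also target speed, a vectorized correlation (e.g., np.correlate with mode='full') would be preferable to the explicit shift loop shown here.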
