Natural user interface as a supplement of the holographic Raman tweezers

Z. Tomori, J. Kaňka, P. Keša, P. Jákl, M. Sery, S. Bernatová, M. Antalík, P. Zemánek
{"title":"Natural user interface as a supplement of the holographic Raman tweezers","authors":"Z. Tomori, J. Kaňka, P. Keša, P. Jákl, M. Sery, S. Bernatová, M. Antalík, P. Zemánek","doi":"10.1117/12.2061024","DOIUrl":null,"url":null,"abstract":"Holographic Raman tweezers (HRT) manipulates with microobjects by controlling the positions of multiple optical traps via the mouse or joystick. Several attempts have appeared recently to exploit touch tablets, 2D cameras or Kinect game console instead. We proposed a multimodal “Natural User Interface” (NUI) approach integrating hands tracking, gestures recognition, eye tracking and speech recognition. For this purpose we exploited “Leap Motion” and “MyGaze” low-cost sensors and a simple speech recognition program “Tazti”. We developed own NUI software which processes signals from the sensors and sends the control commands to HRT which subsequently controls the positions of trapping beams, micropositioning stage and the acquisition system of Raman spectra. System allows various modes of operation proper for specific tasks. Virtual tools (called “pin” and “tweezers”) serving for the manipulation with particles are displayed on the transparent “overlay” window above the live camera image. Eye tracker identifies the position of the observed particle and uses it for the autofocus. Laser trap manipulation navigated by the dominant hand can be combined with the gestures recognition of the secondary hand. Speech commands recognition is useful if both hands are busy. 
Proposed methods make manual control of HRT more efficient and they are also a good platform for its future semi-automated and fully automated work.","PeriodicalId":128143,"journal":{"name":"Optics & Photonics - NanoScience + Engineering","volume":"129 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Optics & Photonics - NanoScience + Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1117/12.2061024","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Holographic Raman tweezers (HRT) manipulate microobjects by controlling the positions of multiple optical traps, traditionally via a mouse or joystick. Several recent attempts have replaced these with touch tablets, 2D cameras, or the Kinect game console. We propose a multimodal “Natural User Interface” (NUI) approach that integrates hand tracking, gesture recognition, eye tracking, and speech recognition. For this purpose we use the low-cost “Leap Motion” and “MyGaze” sensors and a simple speech recognition program, “Tazti”. We developed our own NUI software, which processes the sensor signals and sends control commands to the HRT; the HRT in turn controls the positions of the trapping beams, the micropositioning stage, and the Raman spectra acquisition system. The system offers several operating modes suited to specific tasks. Virtual tools (called “pin” and “tweezers”) for manipulating particles are displayed in a transparent “overlay” window above the live camera image. The eye tracker identifies the position of the observed particle and uses it for autofocus. Laser trap manipulation guided by the dominant hand can be combined with gesture recognition of the secondary hand, and speech command recognition is useful when both hands are busy. The proposed methods make manual control of the HRT more efficient, and they also provide a good platform for its future semi-automated and fully automated operation.
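The pipeline the abstract describes (sensor signals in, trap control commands out) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the coordinate mapping, calibration, command names, and the `TrapCommand`/`build_command` helpers are all assumptions, and real sensor input (Leap Motion hand positions, Tazti speech tokens) is replaced by plain function arguments.

```python
# Hypothetical sketch of the NUI-to-HRT command pipeline: a hand position
# from the tracking sensor is mapped onto camera-pixel trap coordinates,
# and a recognized speech word selects the action. All names and values
# here are illustrative assumptions, not the paper's actual software.
from dataclasses import dataclass


@dataclass
class TrapCommand:
    x_px: float   # trap position in camera pixels
    y_px: float
    action: str   # e.g. "move", "grab", "release", "acquire_spectrum"


def hand_to_trap(hand_xy, image_px=(640, 480)):
    """Map a normalized hand position in [-1, 1] x [-1, 1] (assumed
    normalization of the sensor's working volume) onto camera-pixel
    coordinates, clamped to the frame."""
    x_px = (hand_xy[0] + 1.0) / 2.0 * image_px[0]
    y_px = (hand_xy[1] + 1.0) / 2.0 * image_px[1]
    x_px = min(max(x_px, 0.0), float(image_px[0]))
    y_px = min(max(y_px, 0.0), float(image_px[1]))
    return x_px, y_px


# Assumed vocabulary of recognized speech commands (illustrative only).
SPEECH_ACTIONS = {"grab": "grab", "release": "release",
                  "spectrum": "acquire_spectrum"}


def build_command(hand_xy, speech_word=None):
    """Combine the dominant-hand position with an optional speech command
    into one control command for the trap controller."""
    x, y = hand_to_trap(hand_xy)
    action = SPEECH_ACTIONS.get(speech_word, "move")
    return TrapCommand(x, y, action)
```

For example, a centered hand with no speech input yields a plain "move" command at the middle of the camera frame, while saying "spectrum" while holding position would trigger Raman acquisition at the current trap location.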