Deep Learning-Based Eye Gaze Controlled Robotic Car

Dipayan Saha, Munia Ferdoushi, Md. Tanvir Emrose, Subrata Das, S. Hasan, Asir Intisar Khan, C. Shahnaz
{"title":"基于深度学习的眼睛注视控制机器人汽车","authors":"Dipayan Saha, Munia Ferdoushi, Md. Tanvir Emrose, Subrata Das, S. Hasan, Asir Intisar Khan, C. Shahnaz","doi":"10.1109/R10-HTC.2018.8629836","DOIUrl":null,"url":null,"abstract":"In recent years Eye gaze tracking (EGT) has emerged as an attractive alternative to conventional communication modes. Gaze estimation can be effectively used in human-computer interaction, assistive devices for motor-disabled persons, autonomous robot control systems, safe car driving, diagnosis of diseases and even in human sentiment assessment. Implementation in any of these areas however mostly depends on the efficiency of detection algorithm along with usability and robustness of detection process. In this context we have proposed a Convolutional Neural Network (CNN) architecture to estimate the eye gaze direction from detected eyes which outperforms all other state of the art results for Eye-Chimera dataset. The overall accuracies are 90.21% and 99.19% for Eye-Chimera and HPEG datasets respectively. This paper also introduces a new dataset EGDC for which proposed algorithm finds 86.93% accuracy. We have developed a real-time eye gaze controlled robotic car as a prototype for possible implementations of our algorithm.","PeriodicalId":404432,"journal":{"name":"2018 IEEE Region 10 Humanitarian Technology Conference (R10-HTC)","volume":"107 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"Deep Learning-Based Eye Gaze Controlled Robotic Car\",\"authors\":\"Dipayan Saha, Munia Ferdoushi, Md. Tanvir Emrose, Subrata Das, S. Hasan, Asir Intisar Khan, C. Shahnaz\",\"doi\":\"10.1109/R10-HTC.2018.8629836\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In recent years Eye gaze tracking (EGT) has emerged as an attractive alternative to conventional communication modes. Gaze estimation can be effectively used in human-computer interaction, assistive devices for motor-disabled persons, autonomous robot control systems, safe car driving, diagnosis of diseases and even in human sentiment assessment. Implementation in any of these areas however mostly depends on the efficiency of detection algorithm along with usability and robustness of detection process. In this context we have proposed a Convolutional Neural Network (CNN) architecture to estimate the eye gaze direction from detected eyes which outperforms all other state of the art results for Eye-Chimera dataset. The overall accuracies are 90.21% and 99.19% for Eye-Chimera and HPEG datasets respectively. This paper also introduces a new dataset EGDC for which proposed algorithm finds 86.93% accuracy. 
We have developed a real-time eye gaze controlled robotic car as a prototype for possible implementations of our algorithm.\",\"PeriodicalId\":404432,\"journal\":{\"name\":\"2018 IEEE Region 10 Humanitarian Technology Conference (R10-HTC)\",\"volume\":\"107 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2018 IEEE Region 10 Humanitarian Technology Conference (R10-HTC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/R10-HTC.2018.8629836\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 IEEE Region 10 Humanitarian Technology Conference (R10-HTC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/R10-HTC.2018.8629836","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 6

Abstract

In recent years, eye gaze tracking (EGT) has emerged as an attractive alternative to conventional communication modes. Gaze estimation can be used effectively in human-computer interaction, assistive devices for motor-disabled persons, autonomous robot control systems, safe car driving, disease diagnosis, and even human sentiment assessment. Implementation in any of these areas, however, depends largely on the efficiency of the detection algorithm along with the usability and robustness of the detection process. In this context, we propose a Convolutional Neural Network (CNN) architecture to estimate the eye gaze direction from detected eyes, which outperforms other state-of-the-art results on the Eye-Chimera dataset. The overall accuracies are 90.21% and 99.19% on the Eye-Chimera and HPEG datasets, respectively. This paper also introduces a new dataset, EGDC, on which the proposed algorithm achieves 86.93% accuracy. We have developed a real-time eye-gaze-controlled robotic car as a prototype for a possible implementation of our algorithm.
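The paper's architecture details are not reproduced on this page, so the sketch below only illustrates the kind of pipeline the abstract describes: a small CNN that classifies a detected eye crop into a gaze-direction class, whose prediction is then mapped to a drive command for the robotic car. The layer sizes, the 64x64 input resolution, the labels in GAZE_CLASSES, and the COMMANDS mapping are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a gaze-direction pipeline like the one the abstract describes.
# The paper's exact CNN layers, input size, class set, and command mapping are not
# given here; everything below is an illustrative assumption.
import torch
import torch.nn as nn

# Assumed direction labels; the actual datasets may use a different class set.
GAZE_CLASSES = ["center", "up", "down", "left", "right", "up-left", "up-right"]

class GazeCNN(nn.Module):
    """Small CNN that classifies a cropped eye image into a gaze direction."""
    def __init__(self, num_classes: int = len(GAZE_CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Assumes 64x64 eye crops, which become 8x8 feature maps after three poolings.
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Hypothetical mapping from predicted gaze direction to a robot-car command.
COMMANDS = {"up": "forward", "down": "stop", "left": "turn_left", "right": "turn_right"}

def gaze_to_command(model: GazeCNN, eye_crop: torch.Tensor) -> str:
    """eye_crop: (1, 3, 64, 64) tensor holding a detected eye region."""
    with torch.no_grad():
        direction = GAZE_CLASSES[model(eye_crop).argmax(dim=1).item()]
    return COMMANDS.get(direction, "stop")

if __name__ == "__main__":
    model = GazeCNN().eval()
    dummy_eye = torch.rand(1, 3, 64, 64)  # stand-in for a real camera eye crop
    print(gaze_to_command(model, dummy_eye))
```

In a real-time setup, the loop would repeatedly detect the eye region from the camera feed, run this classifier, and send the resulting command to the car's motor controller; the stop-on-unknown default above is one conservative design choice for such a loop.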