Integration of Eye-Tracking and Object Detection in a Deep Learning System for Quality Inspection Analysis

IF 4.8 · CAS Zone 2 (Engineering & Technology) · JCR Q1, Computer Science, Interdisciplinary Applications
Seung-Wan Cho, Yeong-Hyun Lim, Kyung-Min Seo, Jungin Kim
Journal: Journal of Computational Design and Engineering
DOI: 10.1093/jcde/qwae042
Published: 2024-05-06 (Journal Article)
Citations: 0

Abstract

During quality inspection in manufacturing, a worker's gaze provides pivotal information for identifying surface defects of a product. However, digitizing workers' gaze information is challenging in a dynamic environment where the positions and postures of products and workers are not fixed. A robust, deep learning-based system, ISGOD (Integrated System with worker's Gaze and Object Detection), is proposed; it integrates object detection and eye-tracking information in dynamic environments to determine which part of an object is being observed. ISGOD employs a 6D pose-estimation algorithm for object detection, accounting for the object's location, orientation, and rotation. Eye-tracking data were obtained from Tobii Glasses, which enable real-time video transmission and eye-movement tracking. A latency-reduction method is proposed to overcome the time delays between object detection and eye-tracking information. Three evaluation indices (gaze score, accuracy score, and concentration index) are suggested for comprehensive analysis. Two experiments were conducted: a robustness test to confirm suitability for real-time object detection and eye-tracking, and a trend test to analyze differences in gaze movement between experts and novices. The proposed method and system can transfer experts' expertise to novices, significantly enhancing defect-detection efficiency.
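The core analysis step, deciding whether a latency-corrected gaze sample lands on the detected object, can be sketched as follows. This is a minimal illustration, not the paper's implementation: pairing each gaze sample with the detection frame of nearest timestamp, representing the projected object region as an axis-aligned box, and the `gaze_hits` ratio are all simplifying assumptions standing in for the paper's 6D-pose projection, latency-reduction method, and gaze score.

```python
from bisect import bisect_left
from dataclasses import dataclass


@dataclass
class Detection:
    t: float    # frame timestamp in seconds
    box: tuple  # (x_min, y_min, x_max, y_max): projected object region in image coords


def nearest_detection(detections, t_gaze):
    """Crude latency compensation: pair a gaze sample with the detection
    frame whose timestamp is closest to the gaze timestamp.
    Assumes `detections` is sorted by timestamp."""
    ts = [d.t for d in detections]
    i = bisect_left(ts, t_gaze)
    before = detections[max(i - 1, 0)]
    after = detections[min(i, len(detections) - 1)]
    return min((before, after), key=lambda d: abs(d.t - t_gaze))

def gaze_hits(gaze_samples, detections):
    """Fraction of gaze samples falling inside the tracked object region,
    a stand-in for a gaze score (the paper's exact definition is not given).
    Each gaze sample is a (timestamp, x, y) tuple."""
    hits = 0
    for t, gx, gy in gaze_samples:
        d = nearest_detection(detections, t)
        x0, y0, x1, y1 = d.box
        hits += (x0 <= gx <= x1) and (y0 <= gy <= y1)
    return hits / len(gaze_samples)
```

A real system would replace the nearest-timestamp pairing with the paper's latency-reduction method and project the 6D object pose into the glasses' scene-camera frame instead of using a static box.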
Source journal: Journal of Computational Design and Engineering (Computer Science: Human-Computer Interaction)
CiteScore: 7.70
Self-citation rate: 20.40%
Articles per year: 125
Journal description: Journal of Computational Design and Engineering is an international journal that aims to provide academia and industry with a venue for rapid publication of research papers reporting innovative computational methods and applications that achieve major breakthroughs, practical improvements, and bold new research directions within a wide range of design and engineering:
• Theory and its progress in computational advancement for design and engineering
• Development of computational frameworks to support large-scale design and engineering
• Interaction issues among humans, designed artifacts, and systems
• Knowledge-intensive technologies for intelligent and sustainable systems
• Emerging technologies and the convergence of technology fields, presented with convincing design examples
• Educational issues for academia, practitioners, and future generations
• Proposals for new research directions, as well as surveys and retrospectives on mature fields