Towards Scale and Position Invariant Task Classification using Normalised Visual Scanpaths in Clinical Fetal Ultrasound.

Clare Teng, Harshita Sharma, Lior Drukker, Aris T Papageorghiou, J Alison Noble
{"title":"Towards Scale and Position Invariant Task Classification using Normalised Visual Scanpaths in Clinical Fetal Ultrasound.","authors":"Clare Teng, Harshita Sharma, Lior Drukker, Aris T Papageorghiou, J Alison Noble","doi":"10.1007/978-3-030-87583-1_13","DOIUrl":null,"url":null,"abstract":"<p><p>We present a method for classifying tasks in fetal ultrasound scans using the eye-tracking data of sonographers. The visual attention of a sonographer captured by eye-tracking data over time is defined by a scanpath. In routine fetal ultrasound, the captured standard imaging planes are visually inconsistent due to fetal position, movements, and sonographer scanning experience. To address this challenge, we propose a scale and position invariant task classification method using normalised visual scanpaths. We describe a normalisation method that uses bounding boxes to provide the gaze with a reference to the position and scale of the imaging plane and use the normalised scanpath sequences to train machine learning models for discriminating between ultrasound tasks. We compare the proposed method to existing work considering raw eyetracking data. The best performing model achieves the F1-score of 84% and outperforms existing models.</p>","PeriodicalId":93620,"journal":{"name":"Simplifying medical ultrasound : second international workshop, ASMUS 2021 : held in conjunction with MICCAI 2021, Strasbourg, France, September 27, 2021 : proceedings. 
ASMUS (Workshop) (2nd : 2021 : Online)","volume":"10 1","pages":"129-138"},"PeriodicalIF":0.0000,"publicationDate":"2021-09-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7612565/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Simplifying medical ultrasound : second international workshop, ASMUS 2021 : held in conjunction with MICCAI 2021, Strasbourg, France, September 27, 2021 : proceedings. ASMUS (Workshop) (2nd : 2021 : Online)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1007/978-3-030-87583-1_13","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2021/9/21 0:00:00","PubModel":"Epub","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

We present a method for classifying tasks in fetal ultrasound scans using the eye-tracking data of sonographers. The visual attention of a sonographer captured by eye-tracking data over time is defined by a scanpath. In routine fetal ultrasound, the captured standard imaging planes are visually inconsistent due to fetal position, movements, and sonographer scanning experience. To address this challenge, we propose a scale and position invariant task classification method using normalised visual scanpaths. We describe a normalisation method that uses bounding boxes to provide the gaze with a reference to the position and scale of the imaging plane, and use the normalised scanpath sequences to train machine learning models for discriminating between ultrasound tasks. We compare the proposed method to existing work that uses raw eye-tracking data. The best-performing model achieves an F1-score of 84% and outperforms existing models.
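The bounding-box normalisation the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the box representation, and the coordinate convention are assumptions. Each raw on-screen gaze point is re-expressed relative to the imaging plane's bounding box, so the resulting coordinates are invariant to where the plane sits on screen and how large it is:

```python
from typing import List, Tuple

Point = Tuple[float, float]
# Bounding box of the imaging plane: (x0, y0, width, height) in screen pixels.
BBox = Tuple[float, float, float, float]

def normalise_scanpath(scanpath: List[Point], bbox: BBox) -> List[Point]:
    """Map raw gaze points into the bounding box's unit coordinate frame.

    A gaze point at the box's top-left corner maps to (0, 0) and one at
    its bottom-right corner to (1, 1), removing the effect of the plane's
    position and scale on screen.
    """
    x0, y0, w, h = bbox
    return [((x - x0) / w, (y - y0) / h) for x, y in scanpath]

# The same relative gaze pattern on two differently placed and scaled planes
# normalises to identical sequences.
path_a = [(110.0, 220.0), (150.0, 260.0)]
path_b = [(420.0, 140.0), (500.0, 220.0)]
norm_a = normalise_scanpath(path_a, (100.0, 200.0, 200.0, 200.0))
norm_b = normalise_scanpath(path_b, (400.0, 100.0, 400.0, 400.0))
assert norm_a == norm_b
```

Sequences normalised this way can then be fed to a sequence classifier in place of raw pixel-coordinate scanpaths.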
