Automated Pain Detection in Facial Videos of Children using Human-Assisted Transfer Learning.

CEUR Workshop Proceedings · Publication date: 2018-07-01
Xiaojing Xu, Kenneth D Craig, Damaris Diaz, Matthew S Goodwin, Murat Akcakaya, Büşra Tuğçe Susam, Jeannie S Huang, Virginia R de Sa

Abstract

Accurately determining pain levels in children is difficult, even for trained professionals and parents. Facial activity provides sensitive and specific information about pain, and computer vision algorithms have been developed to automatically detect Facial Action Units (AUs) defined by the Facial Action Coding System (FACS). Our prior work used information from computer vision, i.e., automatically detected facial AUs, to develop classifiers that distinguish between pain and no-pain conditions. However, applying pain/no-pain classifiers based on automated AU codings across different environmental domains diminishes performance. In contrast, classifiers based on manually coded AUs show less environment-dependent variability in performance. In this paper, we train a machine learning model to recognize pain using AUs coded by a computer vision system embedded in a software package called iMotions. We also study the relationship between iMotions (automatically) and human (manually) coded AUs. We find that automatically coded AUs differ from those coded by a human trained in the FACS system, and that the human coder is less sensitive to environmental changes. To improve classification performance in the current work, we applied transfer learning: we trained another machine learning model to map automated AU codings to a subspace of manual AU codings, enabling more robust pain recognition when only automatically coded AUs are available for the test data. With this transfer learning method, we improved the Area Under the ROC Curve (AUC) on independent data from new participants in our target domain from 0.67 to 0.72.
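The two-stage approach the abstract describes — learn a mapping from automated AU codings to (a subspace of) manual AU codings, then classify pain on the mapped features — can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's implementation: the dimensions, the ridge-regression mapping, the linear classifier, and all variable names are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 20 automated (iMotions-style) AU channels,
# 12 manual (FACS-style) AU channels, 200 training frames.
n_train, n_auto, n_manual = 200, 20, 12

# Synthetic stand-ins for paired automated and manual AU codings.
X_auto = rng.normal(size=(n_train, n_auto))
W_true = rng.normal(size=(n_auto, n_manual))
Y_manual = X_auto @ W_true + 0.1 * rng.normal(size=(n_train, n_manual))

# Stage 1: learn a mapping from automated to manual AU codings
# (here a closed-form ridge regression).
lam = 1.0
W = np.linalg.solve(X_auto.T @ X_auto + lam * np.eye(n_auto),
                    X_auto.T @ Y_manual)

# Stage 2: train a pain classifier on the mapped, manual-like features
# (here a toy least-squares linear classifier on +/-1 labels).
y = np.where(Y_manual[:, 0] + Y_manual[:, 1] > 0, 1.0, -1.0)  # synthetic labels
Z = X_auto @ W
beta = np.linalg.lstsq(Z, y, rcond=None)[0]

# At test time only automated codings are available: map, then classify.
X_test = rng.normal(size=(5, n_auto))
scores = X_test @ W @ beta
pred = np.where(scores > 0, "pain", "no-pain")
print(pred)
```

The key point the sketch captures is that manual codings are needed only to fit the mapping `W` at training time; at test time the classifier runs on automated codings alone, mapped into the manual-coding subspace.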
