Analyzing Interactions in Paired Egocentric Videos
Authors: A. Khatri, Zachary Butler, Ifeoma Nwogu
DOI: 10.1109/FG57933.2023.10042654
Venue: 2023 IEEE 17th International Conference on Automatic Face and Gesture Recognition (FG)
Published: 2023-01-05

Abstract: As wearable devices become more popular, egocentric information recorded with them can be used to better understand the behaviors of the wearer and of the people the wearer interacts with. Data obtained from such devices, such as voice, head movement, and galvanic skin response (GSR) as a measure of arousal, can provide a window into the underlying affect of both the wearer and their conversational partner. In this study, we examine the characteristics of two types of dyadic conversations: in one, the interlocutors discuss a topic on which they agree; in the other, they discuss a topic on which they disagree, even if they are friends. The topics are mostly political. The egocentric information is collected using a pair of wearable smart glasses for video data and a smart wristband for physiological data, including GSR. From these data, various features are extracted, including the facial expressions of the conversational partner and the 3D motion of the wearer's camera within the environment, termed egomotion. The goal of this work is to investigate whether the nature of a discussion can be better determined by evaluating the behavior of an individual in the conversation or by evaluating the pairing/coupling of the behaviors of the two people in the conversation. The pairing is accomplished using a modified formulation of the dynamic time warping (DTW) algorithm. A random forest classifier is used to evaluate the nature of the interaction (agreement versus disagreement) using individual and paired features separately. With the limited data used in this work, individual behaviors were slightly more indicative of the type of discussion (85.43% accuracy) than paired behaviors (83.33% accuracy).
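The abstract pairs the two speakers' behavior signals with a modified DTW formulation whose details are not given here. As a minimal sketch of the classic DTW algorithm it builds on, the following computes a cumulative alignment cost between two 1-D signals; the absolute-difference local cost is an assumption for illustration.

```python
import numpy as np

def dtw_distance(x, y):
    """Classic dynamic time warping distance between two 1-D signals.

    A minimal sketch of standard DTW; the paper uses a modified
    formulation not specified in the abstract.
    """
    n, m = len(x), len(y)
    # cost[i, j] = minimal cumulative cost of aligning x[:i] with y[:j]
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(x[i - 1] - y[j - 1])          # local distance (assumed absolute)
            cost[i, j] = d + min(cost[i - 1, j],      # stretch y
                                 cost[i, j - 1],      # stretch x
                                 cost[i - 1, j - 1])  # match step
    return cost[n, m]

# A time-shifted copy aligns at zero cost: the warping absorbs the shift.
a = np.array([0.0, 1.0, 2.0, 1.0, 0.0])
b = np.array([0.0, 0.0, 1.0, 2.0, 1.0, 0.0])
print(dtw_distance(a, b))  # 0.0
```

In the paired setting, such a cost computed between, say, the two speakers' arousal or expression time series could serve directly as a coupling feature.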
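The paper's comparison, training a random forest on individual features versus paired features and comparing accuracies, can be sketched as below. The data here is synthetic and the feature names are hypothetical; only the evaluation pattern follows the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-ins (hypothetical) for the paper's two feature sets:
# per-person statistics (e.g. expression, egomotion) vs. paired coupling
# scores (e.g. DTW costs between the two speakers' signals).
rng = np.random.default_rng(0)
n = 120
labels = rng.integers(0, 2, n)  # 0 = agreement, 1 = disagreement
X_individual = rng.normal(labels[:, None], 1.0, (n, 6))
X_paired = rng.normal(labels[:, None], 1.2, (n, 4))

# Evaluate each feature set separately, as in the paper.
for name, X in [("individual", X_individual), ("paired", X_paired)]:
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    acc = cross_val_score(clf, X, labels, cv=5).mean()
    print(f"{name}: {acc:.2%}")
```

With real features, the same loop reproduces the paper's comparison of 85.43% (individual) versus 83.33% (paired) accuracy.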