Broad-deep network-based fuzzy emotional inference model with personal information for intention understanding in human–robot interaction

Min Li, Luefeng Chen, Min Wu, Kaoru Hirota, Witold Pedrycz

Annual Reviews in Control, Volume 57, Article 100951 (2024). DOI: 10.1016/j.arcontrol.2024.100951
Citations: 0
Abstract
A broad-deep fusion network-based fuzzy emotional inference model with personal information (BDFEI) is proposed for emotional intention understanding in human–robot interaction. It aims to understand students’ intentions in university teaching scenarios. Initially, we employ convolution and maximum pooling for feature extraction. Subsequently, we apply the ridge regression algorithm for emotional behavior recognition, which effectively mitigates the impact of complex network structures and slow network updates often associated with deep learning. Moreover, we utilize multivariate analysis of variance to identify the key personal information factors influencing intentions and calculate their influence coefficients. Finally, a fuzzy inference method is employed to gain a comprehensive understanding of intentions. Our experimental results demonstrate the effectiveness of the BDFEI model. When compared to existing models, namely FDNNSA, ResNet-101+GFK, and HCFS, the BDFEI model achieved superior accuracy on the FABO database, surpassing them by 12.21%, 1.89%, and 0.78%, respectively. Furthermore, our self-built database experiments yielded an impressive 82.00% accuracy in intention understanding, confirming the efficacy of our emotional intention inference model.
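The abstract's core claim is that a closed-form ridge regression classifier on top of extracted features sidesteps the iterative weight updates of deep networks. A minimal sketch of that idea is shown below; the feature matrix, dimensions, and one-hot label encoding are illustrative assumptions, not the paper's actual data or architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for features produced by convolution + max pooling
# (the paper's actual feature extractor is not reproduced here).
n_samples, n_features, n_classes = 120, 16, 3
X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, n_classes, size=n_samples)
Y = np.eye(n_classes)[y]  # one-hot targets for multi-class ridge regression

# Closed-form ridge solution: W = (X^T X + lambda * I)^{-1} X^T Y
# No gradient descent, so "training" is a single linear solve.
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ Y)

# Predict the class whose regression score is largest.
pred = np.argmax(X @ W, axis=1)
accuracy = (pred == y).mean()
```

Because the weights come from one linear solve, retraining on new emotional-behavior samples is cheap compared with backpropagation, which is the advantage the abstract attributes to this stage.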
Journal Introduction:
The field of Control is now changing rapidly, driven by technology-led “societal grand challenges” and the deployment of new digital technologies. The aim of Annual Reviews in Control is to provide comprehensive and visionary views of the field of Control by publishing the following types of review articles:
Survey Article: Review papers on main methodologies or technical advances that add considerable technical value to the state of the art. Note that papers that rely purely on mechanistic searches and lack the comprehensive analysis needed to make a clear contribution to the field will be rejected.
Vision Article: Cutting-edge and emerging topics with a visionary perspective on the future of the field or on how it will bridge multiple disciplines, and
Tutorial Research Article: Fundamental guides for future studies.