{"title":"Pose invariant affect analysis using thin-plate splines","authors":"J. McCall, M. Trivedi","doi":"10.1109/ICPR.2004.1334688","DOIUrl":null,"url":null,"abstract":"This paper introduces a method for pose-invariant facial affect analysis and a real-time system for facial affect analysis using this method. The method is centered on developing a feature vector that is more robust to rigid body movements while retaining information important to facial affect analysis. This feature vector is produced using thin-plate splines to extract affine transformations independently from nonlinear transformations quickly and efficiently. The affine portion can be used to describe the rigid body motion because planar motions in a perspective projection can be approximated by an affine transformation. Removing the affine portion and using the nonlinear portion of the thin-plate spline warping provides information on the nonlinear motion caused by facial affects. The real-time system developed using this method is composed of three main components: facial landmark tracking, feature vector extraction, and affect classification. The system processes streaming video in real-time. Testing was performed to examine the invariance to rotation as well as subject independence of the system. Finally, its application in real-world environments is discussed.","PeriodicalId":335842,"journal":{"name":"Proceedings of the 17th International Conference on Pattern Recognition, 2004. ICPR 2004.","volume":"74 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2004-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"21","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 17th International Conference on Pattern Recognition, 2004. ICPR 2004.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICPR.2004.1334688","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 21
Abstract
This paper introduces a method for pose-invariant facial affect analysis and a real-time system built on it. The method centers on a feature vector that is robust to rigid-body head movements while retaining the information important to facial affect analysis. The feature vector is produced using thin-plate splines, which quickly and efficiently separate the affine component of a landmark warp from its nonlinear component. The affine portion describes the rigid-body motion, because planar motion under perspective projection can be approximated by an affine transformation. Removing the affine portion and keeping the nonlinear portion of the thin-plate spline warp isolates the non-rigid motion caused by facial affect. The real-time system built on this method has three main components: facial landmark tracking, feature vector extraction, and affect classification, and it processes streaming video in real time. Testing examined the system's invariance to rotation as well as its subject independence. Finally, its application in real-world environments is discussed.
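To make the affine/nonlinear split concrete, below is a minimal sketch, not the authors' implementation, of the standard thin-plate spline fit between a reference landmark configuration and an observed one. The function name tps_decompose, the example landmark coordinates, and the use of NumPy are assumptions for illustration. Solving the usual TPS linear system yields a 3x2 affine block, which absorbs the approximately planar, pose-induced motion, and per-landmark nonlinear coefficients, which carry the expression-driven deformation of the kind the paper uses for its affect feature vector.

```python
import numpy as np

def tps_decompose(src, dst):
    """Fit a thin-plate spline mapping src -> dst landmarks and return
    (A, W): A is the 3x2 affine part, W the n x 2 nonlinear coefficients.

    Hypothetical helper for illustration; not code from the paper.
    """
    n = src.shape[0]

    # Radial basis kernel U(r) = r^2 log(r^2), with U(0) defined as 0.
    d2 = np.sum((src[:, None, :] - src[None, :, :]) ** 2, axis=-1)
    with np.errstate(divide="ignore", invalid="ignore"):
        K = np.where(d2 > 0.0, d2 * np.log(d2), 0.0)

    # Affine terms [1, x, y] for each landmark.
    P = np.hstack([np.ones((n, 1)), src])

    # Assemble and solve the standard TPS linear system L @ params = Y.
    L = np.zeros((n + 3, n + 3))
    L[:n, :n] = K
    L[:n, n:] = P
    L[n:, :n] = P.T

    Y = np.zeros((n + 3, 2))
    Y[:n] = dst

    params = np.linalg.solve(L, Y)
    W = params[:n]   # nonlinear warping coefficients (non-rigid, expression-driven motion)
    A = params[n:]   # affine parameters (approximate rigid/pose-induced motion)
    return A, W

# Example with made-up landmark coordinates: a neutral reference configuration
# and an observed frame.  Only the nonlinear part W would feed an affect feature vector.
neutral = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0],
                    [1.0, 1.0], [0.5, 0.4], [0.5, 0.8]])
observed = np.array([[0.05, 0.0], [1.05, 0.05], [0.0, 1.0],
                     [1.0, 1.05], [0.55, 0.35], [0.5, 0.85]])
A, W = tps_decompose(neutral, observed)
print("affine part:\n", A)
print("nonlinear coefficients:\n", W)
```

In this decomposition, discarding the affine block before building features is what gives the representation its tolerance to moderate head motion, since that motion is approximately planar and therefore approximately affine under perspective projection.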