%0 Thesis %T Human Emotion Evaluation on Facial Image Sequences %A Francisco Javier Orozco %E Jordi Gonzalez %E Xavier Roca %D 2010 %I Ediciones Graficas Rey %@ 978-84-936529-3-7 %F Francisco Javier Orozco2010 %O exported from refbase (http://refbase.cvc.uab.es/show.php?record=1335), last updated on Fri, 17 Dec 2021 14:04:53 +0100 %X Psychological evidence has emphasized the importance of understanding affective behaviour because of its high impact on present-day interaction between humans and computers. All types of affective and behavioural patterns, such as gestures, emotions and mental states, are largely displayed through the face, head and body. This thesis is therefore focused on analysing affective behaviours of the head and face. To this end, head and facial movements are encoded using appearance-based tracking methods. Specifically, a careful combination of deformable models captures rigid and non-rigid movements of different kinematics; the 3D head pose, eyebrows, mouth, eyelids and irises are taken into account as the basis for extracting features from databases of video sequences. This approach combines the strengths of adaptive appearance models, optimization methods and backtracking techniques. For about thirty years, computer science has approached the investigation of human emotions through the automatic recognition of the six prototypic emotions suggested by Darwin and systematized by Paul Ekman in the seventies. The Facial Action Coding System (FACS) uses discrete movements of the face (called Action Units, or AUs) to code these six facial emotions: anger, disgust, fear, happiness/joy, sadness and surprise. However, human emotions are much more complex patterns that have not received the same attention from computer scientists. Simon Baron-Cohen proposed a new taxonomy of emotions and mental states without a coding system for the facial actions. These 426 affective behaviours are more challenging for the understanding of human emotions.
Beyond classically classifying the six basic facial expressions, more subtle gestures, facial actions and spontaneous emotions are considered here. By assessing confidence in the recognition results and exploring spatial and temporal relationships of the features, several methods are combined and enhanced to develop a new taxonomy of expressions and emotions. The objective of this dissertation is to develop a computer vision system covering facial feature extraction, expression recognition and emotion understanding, built as a bottom-up reasoning process. Building a detailed taxonomy of human affective behaviours is an interesting challenge for head- and face-based image analysis methods. In this thesis, we exploit the strengths of Canonical Correlation Analysis (CCA) to enhance an on-line head and face tracker. The relationship between head pose and local facial movements is studied according to their cognitive interpretation in affective expressions and emotions. Active Shape Models are synthesized for AAMs based on CCA regression. Head pose and facial actions are fused into a maximally correlated space in order to assess expressiveness, confidence and classification in a Case-Based Reasoning (CBR) system. The CBR solutions are also correlated to the cognitive features, which allows avoiding exhaustive search when recognizing new head and face features. Subsequently, Support Vector Machines (SVMs) and Bayesian Networks are applied to learn the spatial relationships of facial expressions. Similarly, the temporal evolution of facial expressions, emotions and mental states is analysed with Factorized Dynamic Bayesian Networks (FaDBN). As a result, the bottom-up system recognizes six facial expressions, six basic emotions and six mental states, and enhances this categorization with a confidence assessment at each level, the intensity of expressions and a complete taxonomy. %9 theses %9 Ph.D. thesis