%0 Journal Article
%T Head-gestures mirroring detection in dyadic social interactions with computer vision-based wearable devices
%A Juan Ramon Terven Salinas
%A Bogdan Raducanu
%A Maria Elena Meza de Luna
%A Joaquin Salas
%J Neurocomputing
%D 2016
%V 175
%N B
%F Juan Ramon Terven Salinas2016
%O OR; 600.072; 600.068; MV
%X During face-to-face human interaction, non-verbal communication plays a fundamental role. A relevant aspect of social interaction is mirroring, in which a person tends to mimic the non-verbal behavior (head and body gestures, vocal prosody, etc.) of the counterpart. In this paper, we introduce a computer vision-based system to detect mirroring in dyadic social interactions using a wearable platform. In our context, mirroring is inferred from simultaneous head nodding displayed by the interlocutors. Our approach consists of the following steps: (1) facial features extraction; (2) facial features stabilization; (3) head nodding recognition; and (4) mirroring detection. Our system achieves a mirroring detection accuracy of 72% on a custom mirroring dataset.
%K Head gestures recognition
%K Mirroring detection
%K Dyadic social interaction analysis
%K Wearable devices
%U http://refbase.cvc.uab.es/files/TRM2016.pdf
%U http://dx.doi.org/10.1016/j.neucom.2015.05.131
%P 866–876
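
Note: the abstract above summarizes a four-step pipeline whose final step infers mirroring from simultaneous head nodding. The following is a minimal illustrative sketch of that last step only, not the authors' implementation; the per-frame nod detections, the function name detect_mirroring, and the tolerance window are hypothetical assumptions made here for illustration.

    # Illustrative sketch (assumption, not the paper's code): mirroring as
    # simultaneous head nodding within a small temporal tolerance window.
    from typing import List

    def detect_mirroring(nods_a: List[bool], nods_b: List[bool], window: int = 15) -> List[int]:
        """Return frame indices where interlocutor A nods and B nods within `window` frames."""
        n = min(len(nods_a), len(nods_b))
        mirrored = []
        for t in range(n):
            if not nods_a[t]:
                continue
            lo, hi = max(0, t - window), min(n, t + window + 1)
            if any(nods_b[lo:hi]):  # counterpart nods close enough in time
                mirrored.append(t)
        return mirrored

    if __name__ == "__main__":
        # Toy example: A nods at frames 10-12, B at frames 14-16.
        a = [i in (10, 11, 12) for i in range(30)]
        b = [i in (14, 15, 16) for i in range(30)]
        print(detect_mirroring(a, b, window=5))  # -> [10, 11, 12]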