Time to Smile: How Onset Asynchronies Between Reciprocal Facial Expressions Influence the Experience of Responsiveness of a Virtual Agent

Abstract

Human social interactions are highly coordinated in time and involve the reciprocal exchange of facial emotional expressions. While timing has been identified as an important factor in social communication via gaze or speech, there has so far been no systematic investigation of how temporal delays in the exchange of facial emotional expressions between interaction partners influence the experience of responsiveness. In the present study, 40 participants were cued to direct a facial emotional expression (angry or happy) towards a virtual agent in front of them, and the virtual agent then reacted with a congruent facial expression. The temporal delay between the cue and the reaction was manipulated as an independent variable. Exact delays between facial expressions were determined from the onset latencies of participants' facial expressions, measured via facial EMG (M. zygomaticus and M. corrugator). Ratings of the virtual agent's experienced responsiveness were collected as the dependent measure. A growth curve analysis using a linear mixed-effects model revealed that experienced responsiveness peaked at delays of around 700 ms. Moreover, at shorter delays experienced responsiveness was higher when happy rather than angry facial expressions were exchanged, while the reverse pattern was found at longer delays. Our results demonstrate the crucial role of timing in non-verbal communication, suggesting that temporal relations between facial expressions are processed as social information. Furthermore, our results can inform the implementation of virtual social interactions.
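The growth curve idea behind the peak-at-700-ms result can be illustrated with a toy computation. The sketch below is not the authors' analysis (which used a linear mixed-effects growth curve model on real ratings): it simulates responsiveness ratings for 40 participants whose true curve peaks at 700 ms, fits a quadratic in delay by ordinary least squares, and reads the estimated peak off the vertex of the fitted parabola. The delay grid, rating scale, and noise levels are all invented for illustration.

```python
import random

random.seed(1)

# Hypothetical delay grid (ms) and simulated rating function; the true
# curve peaks at 700 ms, matching the delay reported in the abstract.
delays = [100, 300, 500, 700, 900, 1100, 1300]

def simulate_rating(delay, participant_offset):
    return 5.0 - ((delay - 700) / 600.0) ** 2 + participant_offset + random.gauss(0, 0.2)

data = []
for _ in range(40):                      # 40 participants, as in the study
    offset = random.gauss(0, 0.3)        # per-participant baseline shift
    for d in delays:
        data.append((d, simulate_rating(d, offset)))

def quadratic_fit(points):
    """OLS fit of y = a*d^2 + b*d + c via the normal equations."""
    n = len(points)
    Sx = sum(d for d, _ in points)
    Sx2 = sum(d ** 2 for d, _ in points)
    Sx3 = sum(d ** 3 for d, _ in points)
    Sx4 = sum(d ** 4 for d, _ in points)
    Sy = sum(y for _, y in points)
    Sxy = sum(d * y for d, y in points)
    Sx2y = sum(d ** 2 * y for d, y in points)
    # Augmented 3x3 system A [a, b, c]^T = v, solved by Gaussian elimination.
    A = [[Sx4, Sx3, Sx2, Sx2y],
         [Sx3, Sx2, Sx, Sxy],
         [Sx2, Sx, float(n), Sy]]
    for i in range(3):
        pivot = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[pivot] = A[pivot], A[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            for c in range(i, 4):
                A[r][c] -= f * A[i][c]
    coeffs = [0.0, 0.0, 0.0]
    for i in range(2, -1, -1):
        coeffs[i] = (A[i][3] - sum(A[i][j] * coeffs[j]
                                   for j in range(i + 1, 3))) / A[i][i]
    return coeffs

a, b, c = quadratic_fit(data)
peak_delay = -b / (2 * a)  # vertex of the fitted parabola
print(f"estimated peak delay: {peak_delay:.0f} ms")
```

With enough simulated trials the recovered vertex lands close to the true 700 ms peak; the published analysis additionally models participant-level random effects and the emotion (happy vs. angry) factor, which this plain-OLS sketch omits.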

Publication
Journal of Nonverbal Behavior