Understanding others’ affective states is fundamental to successful social interaction. Inferring such states is challenging, however: emotions and intentions are inherently inaccessible and must be communicated through observable cues. While facial expressions have been studied extensively, the human body also serves as a key channel for communicating affective intentions, one of high adaptive value. Prior research has identified several factors that influence how we perceive others’ affect, including movement features and the bodily states of the observer. The action observation network (AON) reliably activates during the observation of other people’s actions. Within this network, regions such as the inferior frontal gyrus, inferior parietal lobule, and premotor cortex are thought to support action understanding by mapping observed movements onto the observer’s own motor representations, thereby providing a neural foundation for inferring others’ actions and intentions. The precise role of the AON in decoding affective intentions, however, remains debated. Emerging evidence also suggests that the observer’s own motor repertoire and physiological state, such as acute inflammation, may modulate how affect is perceived in others’ body movements; yet how these internal factors interact with specific movement cues is still poorly understood.
This dissertation investigates which features of body movement contribute to affect perception in complex social interactions, and whether the similarity between observed movements and the observer’s internal motor representations modulates this process. Four projects were conducted. Project 1 employed a computational, feature-based approach to analyze whole-body movements in affective interactions. Project 2 targeted brain regions involved in action observation and valence processing. Project 3 examined how exercise-induced inflammation affects the perception of emotional interactions. Project 4 combined the methods of Projects 1 and 2 to test how movement similarity influences both perception and neural activation.
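To make the feature-based approach of Project 1 concrete, the sketch below shows how basic kinematic features might be extracted from whole-body motion data. It is purely illustrative: the array layout, frame rate, feature set, and the kinematic_features helper are assumptions for exposition, not the dissertation’s actual analysis pipeline.

```python
# Illustrative sketch only: simple kinematic features (speed, acceleration)
# computed from 3D joint trajectories, as a feature-based movement analysis
# might begin. Data layout and feature choices are assumptions.
import numpy as np

def kinematic_features(joints: np.ndarray, fps: float = 60.0) -> dict:
    """Summarize kinematics from a (frames, joints, 3) trajectory array.

    `joints` holds 3D joint positions per frame; `fps` is the assumed
    capture rate of the motion recording.
    """
    dt = 1.0 / fps
    velocity = np.diff(joints, axis=0) / dt           # (frames-1, joints, 3)
    speed = np.linalg.norm(velocity, axis=-1)         # scalar speed per joint
    acceleration = np.diff(velocity, axis=0) / dt
    accel_mag = np.linalg.norm(acceleration, axis=-1)
    return {
        "mean_speed": float(speed.mean()),
        "peak_speed": float(speed.max()),
        "mean_acceleration": float(accel_mag.mean()),
    }

# Usage with synthetic data: 300 frames, 20 joints, 3D coordinates.
rng = np.random.default_rng(0)
motion = np.cumsum(rng.normal(scale=0.01, size=(300, 20, 3)), axis=0)
print(kinematic_features(motion))
```

Features of this kind could then serve as predictors of observers’ emotion-recognition performance or valence ratings.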
The findings show that kinematic features support emotion recognition, whereas postural cues relate more closely to subjective valence ratings. Interactive movement enhances the recognition of socially salient emotions such as affection. Neural data revealed that a fronto-parietal network, in particular the inferior parietal lobule, encodes valence from movement and responds more strongly to movements dissimilar to the observer’s own, suggesting a central role in decoding affective intentions. Inflammation altered gaze behavior and reduced both perceptual sensitivity and emotion recognition, linking altered internal physiological states to perceptual processes. Together, these results point to a close interplay between movement features, physiological states, movement similarity, and the fronto-parietal system in decoding affective intentions.