Learning and Retrieval of Crossmodally Associated Valence in Face Perception
Doctoral thesis
Date of Examination: 2023-03-16
Date of issue: 2023-04-18
Advisor: Prof. Dr. Annekathrin Schacht
Referee: Prof. Dr. Annekathrin Schacht
Referee: Dr. Arezoo Pooresmaeili
Files in this item
Name: thesis_ziereis_20230411_online.pdf
Size: 19.6 MB
Format: PDF
Description: Dissertation incl. appendices
Abstract
English
How someone’s face is perceived is influenced not only by their facial expression but also, to a large extent, by the situational context and by previous experiences with that person. Although emotional expressions of faces and voices are relevant social cues that have been suggested to enjoy a prioritized role in attentional selection processes, the relation between the processing of inherent emotional valence and of associated valence is not yet well understood. In this PhD project, a set of behavioral and event-related potential (ERP) studies was conducted to examine the temporal dynamics of, and the degree to which, affective social cues are prioritized over neutral social cues and are learned under different task constraints. In Studies 1 to 3, faces were cross-modally associated with affective vocalizations. Learning and retrieval of associated valence were tested in a valence-implicit Pavlovian conditioning paradigm (Studies 1 and 2) in which only gender information was task-relevant. In Study 3, the retrieval of faces previously associated with valence was contrasted between a valence-implicit and a valence-explicit task. The influence of physical stimulus properties, e.g., frequency spectra and size, on valence effects was addressed in Studies 4 and 5. Although the neural (Study 1) and behavioral (Study 2) results indicated sensitivity to the voices’ valence, there was little evidence of acquired valence effects for the conditioned faces. In contrast, when task constraints during the learning of the face-voice pairs were relaxed (Study 3), effects of associated valence were observable in both valence-implicit and valence-explicit tasks during retrieval. Effects on early visual processing were not observable for emotional stimuli but were shown for the extensively trained stimulus feature (gender) in Study 1. At mid-latencies, both positive and negative facial expressions affected ERPs (Study 5), whereas effects of associated valence were restricted to negative associations.
Valence effects on later processing stages appeared particularly sensitive to task requirements (Studies 3 and 5). In summary, the findings suggest a differential prioritization of valence in emotional expressions of the face and voice compared to associated valence. Moreover, tasks that steer attention away from a stimulus’s valence may strongly impair the acquisition of valence-based associations, whereas the retrieval of already acquired associations, similar to inherent emotional expressions, can influence attentional processes even in valence-implicit contexts.
Keywords: ERPs; faces; associative learning; cross-modal; emotion