Early brain responses to affective faces: A simultaneous EEG-fMRI study


Miriam Müller-Bardorff, Maximilian Bruchmann, Martin Mothes-Lasch, Pienie Zwitserlood, Insa Schlossmacher, David Hofmann, Wolfgang Miltner, Thomas Straube

PII: S1053-8119(18)30507-X
DOI: 10.1016/j.neuroimage.2018.05.081
Reference: YNIMG 15006
To appear in: NeuroImage
Received Date: 6 March 2018
Revised Date: 23 May 2018
Accepted Date: 31 May 2018

Please cite this article as: Müller-Bardorff, M., Bruchmann, M., Mothes-Lasch, M., Zwitserlood, P., Schlossmacher, I., Hofmann, D., Miltner, W., Straube, T., Early brain responses to affective faces: A simultaneous EEG-fMRI study, NeuroImage (2018), doi: 10.1016/j.neuroimage.2018.05.081.

Running head: EARLY BRAIN RESPONSES TO EMOTIONAL FACIAL EXPRESSIONS

Miriam Müller-Bardorff1, Maximilian Bruchmann1, Martin Mothes-Lasch1, Pienie Zwitserlood2, Insa Schlossmacher1, David Hofmann1, Wolfgang Miltner3 & Thomas Straube1

1 Institute of Medical Psychology and Systems Neuroscience, University of Muenster, Germany
2 Institute of Psychology, University of Muenster, Germany
3 Department of Clinical and Biological Psychology, University of Jena, Germany

Address for correspondence:
Miriam Müller-Bardorff
Institute of Medical Psychology and Systems Neurosciences
University of Muenster, Germany
Phone: +49 251 83 57174
Fax: +49 251 83 55494
Email: [email protected]

Acknowledgments. This research was supported by German Research Foundation (DFG) Project No. STR 987/3-1, STR 987/6-1.

Abstract

The spatio-temporal neural basis of the earliest differentiation between emotional and neutral facial expressions is a matter of debate. The present study used concurrent electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) to investigate the 'when' and 'where' of the earliest prioritization of emotional over neutral expressions. We measured event-related potentials (ERPs) and blood oxygen level dependent (BOLD) signal changes in response to facial expressions of varying emotional intensity and different valence categories. Facial expressions were presented superimposed by two horizontal bars, and participants engaged in a focal bars task (low load, high load) in order to manipulate the availability of attentional resources during face perception. EEG data revealed the earliest expression effects in the P1 range (76–128 ms) as a parametric function of stimulus arousal, independent of load condition. Conventional fMRI data analysis also demonstrated significant modulations as a function of stimulus arousal, independent of load, in amygdala, superior temporal sulcus, fusiform gyrus and lateral occipital cortex. Correspondingly, EEG-informed fMRI analysis revealed a significant positive correlation between single-trial P1 amplitudes and BOLD responses in amygdala and lateral posterior occipital cortex. Our results are in line with the hypothesis of the amygdala as a fast-responding relevance detector, with corresponding effects in early visual face-processing areas across facial expressions and load conditions.

1 Introduction

Fast and adaptive responding to biologically significant stimuli such as emotional facial expressions is assumed to rely on enhanced face representations in visual cortices through top-down signals from emotion-related brain areas (e.g., Vuilleumier and Pourtois, 2007, Pessoa and Adolphs, 2010 for reviews). Despite a large number of imaging studies that characterize the neuronal network involved in affective face processing (e.g., Fusar-Poli et al., 2009, Mende-Siedlecki et al., 2013 for reviews), the latency and level of cortical processing at which initial prioritization of emotional over neutral expressions is realized still remain a matter of debate.

Electrophysiological methods have been employed to investigate the latency at which responses to emotional versus neutral expressions diverge, but findings are inconclusive so far (see Olofsson et al., 2008 for a review). Some scalp-recorded EEG/MEG studies indicated early divergence in the time range of the P1 (~100–130 ms, e.g., Pizzagalli et al., 1999, Batty and Taylor, 2003, Pourtois et al., 2005, Holmes et al., 2008), but others did not find modulations within this early latency (e.g., Balconi and Pozzoli, 2003, Schupp et al., 2004, Caharel et al., 2005, Blau et al., 2007, Arviv et al., 2015, Müller-Bardorff et al., 2016). The P1 is a positive-going deflection that is modulated by physical stimulus properties as well as by endogenous factors such as attention (e.g., Clark and Hillyard, 1996) and the motivational value of a stimulus (e.g., Pourtois et al., 2005, Hammerschmidt et al., 2017). Initial emotional differentiation in the time range of the P1 would indicate neural differentiation at the level of early extrastriate visual cortices (e.g., Di Russo et al., 2002, Woldorff et al., 2002), while differentiation at latencies beyond 130 ms would implicate activity of higher cortical areas such as fusiform gyrus and superior temporal sulcus (e.g., Itier and Taylor, 2004, Deffke et al., 2007, Schönwald and Müller, 2014). Similar to scalp-recorded EEG studies, intracranial EEG studies also provide ambiguous answers regarding the moment of earliest discriminative responding to emotional versus neutral facial expressions. Some intracranial EEG studies reported differential neural responses ~74–150 ms after stimulus onset in the amygdala (e.g., Oya et al., 2002, Méndez-Bértolo et al., 2016), a brain structure supposed to establish initial neural differentiation depending on the emotional significance of a stimulus (e.g., Day-Brown et al., 2010, Pessoa and Adolphs, 2010), while others did not observe differential responses to emotional versus neutral expressions before ~150 ms (e.g., Halgren et al., 1994, Krolak-Salmon et al., 2004, Pourtois et al., 2010, Meletti et al., 2012, see also Murray et al., 2014 for a review). Remarkably, a scalp-recorded EEG study in patients with amygdala lesions suggested altered responsiveness to emotional expressions in occipitally measured P1 amplitudes, supporting the hypothesis of early amygdalar feedback signals to the extrastriate visual cortex (Rotshtein et al., 2010).

Reasons for the heterogeneity of both scalp-recorded and intracranial electrophysiological findings might relate to specific experimental factors such as focal versus peripheral vision (Eimer and Holmes, 2007, Calvo et al., 2008), the spatial arrangement of competing stimuli (e.g., overlapping versus separated arrangement of stimuli and task, Müller and Hübner, 2002, Müller et al., 2008), the role of top-down attention (e.g., involuntary versus voluntary affective processing, Carretié, 2014), and the availability of attentional resources (e.g., Holmes et al., 2003). In addition, statistical power might be a critical factor to detect stimulus-driven alterations at early latencies.

We assume that P1 effects are most likely observed if the emotional expression of a face is conveyed by its low-level features, especially if these features are characterized by low spatial frequencies (Latinus & Taylor, 2006; Pourtois et al., 2005), whereas N170 effects reflect configural, integrated face processing (Hinojosa, Mercado, & Carretié, 2015; Latinus & Taylor, 2006). As N170 effects are also elicited by unattended expressions (Hinojosa et al., 2015), we assume that potential P1 effects should also be independent of attentional resources.

In the spatial domain, fMRI with its high spatial resolution is well suited to investigate the neuroanatomic network underlying affective face processing. Several brain regions show enhanced responses to emotional as compared to neutral facial expressions, including visual areas such as the occipital face area (OFA), fusiform face area (FFA), and superior temporal sulcus (STS), and supramodal areas such as the amygdala (AMY; see Whalen et al., 2013, Fusar-Poli et al., 2009, Mende-Siedlecki et al., 2013 for reviews). As mentioned above, the amygdala is suggested to classify visual input according to its emotional significance (e.g., Day-Brown et al., 2010, Pessoa and Adolphs, 2010) and, in addition, is assumed to participate in differential visual processing of affective and neutral stimuli through direct top-down signals (Sugase et al., 1999, Amaral et al., 2003, Phelps and LeDoux, 2005, Vuilleumier and Driver, 2007). OFA, FFA, and STS are suggested to establish face representations in a hierarchical fashion, starting with basic facial features and proceeding to a more abstract representation of facial identity and expression (e.g., Haxby et al., 2000, Ishai, 2008, Liu et al., 2010). Controversies exist, however, regarding the level at which emotional information impacts face processing. Some authors suggested expression-related processing already at the level of the OFA (e.g., Pitcher et al., 2007, Atkinson and Adolphs, 2011, Pitcher, 2014), while others suggest differential emotional responses at higher hierarchical levels (e.g., Ganel et al., 2005, Haxby and Gobbini, 2012, Reinl and Bartels, 2014).

A further open question concerns the stimulus factors that drive modulations by significant emotional expressions in affective face processing. Several EEG and fMRI studies showed modulations by emotional expressions specifically for threat-related, negative facial expressions (e.g., Leppänen et al., 2007, Adolphs, 2008, Yang et al., 2012, Kim et al., 2017), with some studies emphasizing a specific role for negative expressions such as fear (e.g., Calder, 1996, Adolphs, 2002, Feinstein et al., 2011) or for specific face regions such as the eyes (e.g., Hardee et al., 2008, Rutishauser et al., 2015). In contrast, other EEG and fMRI studies observed no valence-specific enhancements, but found modulations as a function of stimulus arousal and general stimulus relevance (see Olofsson et al., 2008, Costafreda et al., 2008, Cunningham and Brosch, 2012, Murray et al., 2014, Lindquist et al., 2016 for reviews). Since orthogonal manipulations of stimulus valence and stimulus arousal are rarely implemented, the individual contributions of stimulus valence and arousal are difficult to disentangle. Studies that controlled for emotional intensity and associated stimulus arousal reported no differences between valence categories (positive, negative), but instead found modulations as a function of emotional intensity and associated stimulus arousal (e.g., Müller-Bardorff et al., 2016, Lin et al., 2016). In these studies from our group, we used two intensity levels within each valence category plus a neutral category to model U-shaped effects of arousal (going from high- to low-intensity negative expressions over neutral to low- and then high-intensity positive expressions) and compared these to linear effects of stimulus valence. We adopted the same strategy in the present paper.
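The contrast logic described above can be illustrated with a short sketch. It uses the rating-derived weight vectors reported in section 2.4 (the angry-version sample); the P1 amplitudes below are invented numbers for illustration only, and the helper function is not part of our analysis pipeline.

```python
# Sketch of contrast weighting: condition-wise amplitudes are weighted with
# mean-centered, rating-derived vectors so that a U-shaped arousal effect and
# a linear valence effect can be tested separately. Weights are those reported
# in section 2.4; the amplitudes are hypothetical.
import numpy as np

conditions = ["angry_high", "angry_low", "neutral", "happy_low", "happy_high"]
w_arousal = np.array([1.40, 0.07, -2.25, -0.46, 1.24])   # U-shaped, sums to zero
w_valence = np.array([-2.55, -1.65, 0.39, 1.96, 1.85])   # linear, sums to zero

def contrast_score(cond_means, weights):
    """Weighted sum of condition means; a clearly positive score indicates
    amplitudes that follow the modeled profile."""
    return float(np.dot(cond_means, weights))

# Hypothetical P1 amplitudes (µV), U-shaped around neutral:
p1 = np.array([4.2, 3.4, 3.0, 3.4, 4.2])
print(contrast_score(p1, w_arousal))  # positive: arousal-like profile
print(contrast_score(p1, w_valence))  # close to zero: no linear valence trend
```

In the actual analyses these per-subject contrast scores (weighted ERP amplitudes, or weighted beta estimates in fMRI) serve as the dependent variable of the statistical tests.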

Despite a large number of EEG and fMRI studies on affective face processing, evidence from the two methods remains largely unconnected. As a consequence, due to the limited temporal resolution of fMRI studies and the limited spatial resolution of electrophysiological studies, the 'when' and 'where' of initial prioritization of emotional over neutral expressions remains ill-defined. One way to overcome the limitations of each method is to combine them. Several studies performed so-called fMRI-constrained EEG source analyses, which restrict the search space for the estimation of neural sources of evoked potentials to cortical regions identified via fMRI (e.g., Sabatinelli, Lang, Keil, & Bradley, 2007; Trautmann-Lengsfeld, Domínguez-Borràs, Escera, Herrmann and Feher, 2013). To our knowledge, only a single study employed fMRI-constrained EEG source analyses to study affective face processing (Trautmann-Lengsfeld et al., 2017). The authors estimated the sources in the time windows of the N170 (140–190 ms), the early posterior negativity (EPN, 250–360 ms) and the late positive potential (LPP, 600–800 ms) and compared neutral faces to faces expressing either disgust or happiness. Sources showing the earliest differentiation of static facial expressions were found in fusiform gyrus in the time window of the LPP, although statistical trends were observed also in earlier intervals. Considerably earlier effects were found with dynamic faces, starting around 200 ms.

A different approach is offered by so-called EEG-informed fMRI analyses, which use information from trial-by-trial variability of evoked potentials for a more accurate model of single-trial BOLD responses. This method has been applied in studies on non-affective face processing and provided valuable insights into the relation between electrophysiological face-sensitive markers and the activation of different hierarchical levels of the face-processing network (Campanella et al., 2013, Nguyen and Cunnington, 2014, Wirsich et al., 2014). In contrast to fMRI-constrained EEG source analyses, this method requires EEG and fMRI to be measured simultaneously. While the former approach focuses on revealing the precise time course of activation in regions restricted to BOLD effects, the latter approach focuses on enhancing fMRI results by providing them with temporally defined ERP information not inherent to the stimulation protocol. No study so far has employed EEG-informed fMRI analysis to study affective face processing. While we see merit in both of these approaches, we chose EEG-informed fMRI analyses, as we wanted to reveal BOLD variations explained by the earliest evoked potentials that are sensitive to affective information. Following this logic, the present simultaneous EEG-fMRI study investigated the time course of the initial neural differentiation between emotional and neutral expressions, using complementary information from both methods. In particular, we investigated (1) which ERP components correspond with the initial extraction of affective information from facial expressions, (2) whether stimulus valence or stimulus arousal modulates these components, (3) which brain regions are associated with early differential electrophysiological responses on a single-trial level, and (4) whether findings are modulated by the availability of attentional resources.

To pursue this aim, we presented facial expressions that systematically varied in terms of stimulus valence and stimulus arousal, using positive and negative expressions of matched intensities plus neutral faces. Facial stimuli were presented at display center, overlapping with the focal task stimuli (see below), and were always task-irrelevant, triggering exogenous attention (see Carretié, 2014 for a review). The availability of attentional resources during affective face processing was manipulated by the perceptual load of the focal task (low versus high load, see Lavie et al., 2003 and Müller-Bardorff et al., 2016). This factor allowed us to test whether the earliest electrophysiological correlates of facial expression were already modulated by perceptual load.

2 Method

2.1 Participants

Forty-three healthy, right-handed university students with no history of psychiatric or neurological illness participated for money or course credit. Nine participants were excluded due to missing EEG markers, excessive head movement (> 3 mm in any direction), or strong artefacts in the EEG (more than 20% of trials lost due to voltage differences > 200 µV). The remaining 34 participants (mean age: 23.05 years, SD = 2.29) fulfilled the criteria for integrated EEG-fMRI analyses. All participants gave informed consent prior to their participation. The study was approved by the Ethics Committee of the University of Jena.

2.2 Stimuli and Task

Stimulus material was selected from the Jena 3D Face Database (J3DFD), which provides graded intensity levels of each emotion (naturalistic intensity manipulation, no morphing procedure, see Müller-Bardorff et al., 2016). Stimuli consisted of 120 photos (from 9 females, 9 males, mean age = 22.50 years, SD = 1.72) selected from the J3DFD displaying neutral, happy (low, high), angry (low, high), or fearful (low, high) expressions. Angry and fearful faces were used in different groups, but pooled in the analysis, to control for possible differences in low-level features of negative expressions. Selection criteria were high recognizability of emotions and clearly distinguishable intensity levels (low, high). A video beamer projected the image onto a semitransparent screen positioned at the foot end of the scanner. Participants viewed the screen through a mirror attached to the head coil. The mirror was positioned approx. 15 cm above the eyes; thus 1 cm corresponded to 3.82°. Each trial started with a central fixation cross presented for 500 ms, followed by a face (~3.3° × 4.0°) overlaid with two horizontal bars (~0.6°) for 150 ms (see Figure 1), with a variable interstimulus interval ranging between 2170 and 5965 ms (mean ISI = 4008 ms). Half of the participants indicated the location of the longer bar, the other half the location of the shorter bar, by button press. All participants saw neutral and happy expressions and, depending on the experimental version, either angry (n = 19) or fearful (n = 15) expressions, to avoid unbalanced frequencies of negative and positive expressions. The length difference between both bars was 0.04° (8 px) under high perceptual load and 0.2° (32 px) under low perceptual load. Responses were given by button press (index finger: 'left side', middle finger: 'right side') and participants completed 1296 trials, amounting to 72 trials per condition (1296 trials = 3 valence categories (negative, positive, neutral) × 2 intensity levels (except for neutral expressions) × 2 load levels × 2 target locations × 18 facial identities × 3 runs). Both accuracy and speed were emphasized. Stimuli were delivered in a pseudo-randomized order that was balanced across participants and conditions. Stimulus presentation and response collection were controlled by the software Presentation (Version 16.5, Neurobehavioral Systems, Albany, CA).

2.3 Data acquisition

fMRI data were acquired at 3 T on a Siemens Magnetom Trio (Siemens Medical Systems) using a standard 12-channel Siemens Head Matrix Coil. Simultaneously, EEG data from 63 scalp electrodes were recorded. Three runs of 655 volumes, each consisting of 35 slices (slice thickness = 3 mm; interslice gap = 0.50 mm; in-plane resolution = 3 × 3 mm²; interslice time = 60 ms), were recorded by means of a T2*-weighted gradient-echo echoplanar sequence with a repetition time (TR) of 2080 ms, an echo time (TE) of 30 ms, and a flip angle (FA) of 90°, yielding a data matrix of 64 × 64 voxels within a field of view (FOV) of 192 mm. Additionally, a high-resolution T1-weighted MPRAGE structural volume (192 slices) was recorded for anatomical localization. To minimize external magnetic field inhomogeneities, a shimming field was applied before functional imaging.

Concurrent EEG data were recorded from 63 sintered Ag/AgCl ring electrodes mounted within an MRI-compatible cap (BRAINCAP-MR, EASYCAP) fitted on the scalp so that 21 electrodes corresponded to the international 10-20 system, with the remaining 42 interspaced equally in this system. One additional electrode at the left-side back monitored electrocardiac activity, and one below the left eye registered electro-ocular activity. Electrode gel (Abralyt 2000, EASYCAP) was applied to electrode-scalp junctions until electrode impedances fell within the range of 0–10 kΩ. A BrainAmp-MR amplifier (Brain Products) was fixed inside the scanner bore and connected to a PC in the console room via fiber-optic cable, where the EEG was recorded with a sampling rate of 5000 Hz, an online bandpass filter of 0.016–250 Hz, the recording reference at frontocentral electrode FCz, and the ground at electrode Iz. The Brain Products Sync-Box was used to synchronize the scanner gradient and EEG acquisition system clocks.

2.4 EEG analysis

The raw EEG signal was preprocessed by first removing the gradient and ballistocardiogram artifacts using the EEGFMRIArtifactPlugin for BrainVision Analyzer 2.0 software (Brain Products). The gradient artifact was removed by the average artifact subtraction method proposed by Allen and colleagues (2000). The ballistocardiogram (BCG) artifact was removed using the Optimal Basis Set (OBS) approach with non-linear time modeling of the heart pulses after semi-automatic detection of the heartbeat (Debener et al., 2007). The signal time course was reconstructed from adjacent electrodes by topographic interpolation if channel variance exceeded certain thresholds (z > 3). EEG data were bandpass filtered (2–30 Hz), downsampled to 250 Hz, and re-referenced to common average. To further increase the signal-to-noise ratio, single-trial responses were estimated from the EEG data via wavelet denoising (Quiroga and Garcia, 2003). Based on the decomposition of the mean ERP into five different scales, a set of wavelet coefficients was chosen for decomposition and inverse transformation of single-trial data (the set size of wavelet coefficients was kept constant for all participants and channels).


Arousal and valence effects were investigated using balanced contrast weights derived from arousal and valence ratings from a separate, representative sample of 46 subjects (reported in Müller-Bardorff et al., 2016). The resulting U-shaped arousal contrast vectors were [1.40, 0.07, -2.25, -0.46, 1.24] (angry high, angry low, neutral, happy low, happy high) and [1.35, 0.24, -2.29, -0.56, 1.26] (fearful high, fearful low, neutral, happy low, happy high). The linear valence contrast vectors were [-2.55, -1.65, 0.39, 1.96, 1.85] and [-2.15, -1.40, 0.15, 1.75, 1.65], respectively. Contrast vectors were applied to responses across task conditions ('valence model', 'arousal model') and as interaction contrasts dependent on load condition ('load × valence model', 'load × arousal model'). Statistical tests were based on cluster-based permutation using the FieldTrip toolbox in MATLAB (Groppe et al., 2011, Maris and Oostenveld, 2007, Oostenveld et al., 2011). The cluster-based permutation targeted valence and arousal effects by using the respectively weighted ERP amplitudes as dependent variable, to identify the earliest influences of facial expressions on electrophysiological responses. Clusters were formed by two or more neighboring sensors (in time and space) whenever the t-values exceeded the cluster threshold (α = .05). Cluster mass, sum(t), was calculated by adding all t-values within a cluster. The number of permutations was set to 1000. Tests were two-sided; hence the significance threshold for testing the null hypothesis was α = .025. The time interval tested included time points from 50 to 200 ms, since the study aimed at identifying the earliest influences of valence and arousal.

2.5 fMRI analysis


Preprocessing and analysis of the functional data were performed using Brain Voyager QX 1.10 and Brain Voyager QX 2.4 software (BVQX; Brain Innovation). The first four volumes of each run were discarded as dummies to ensure steady-state tissue magnetization. Realignment to the first volume of each run was performed via least-squares estimation of six rigid-body parameters to reduce effects of head movements on volume time-course analysis. Further data preprocessing comprised a correction for slice-time errors and spatial (8 mm FWHM isotropic Gaussian kernel) as well as temporal (high-pass filter: 10 cycles per run; low-pass filter: 2.8 s FWHM; linear trend removal) smoothing. Anatomical and functional images were coregistered and normalized to Talairach space (Talairach and Tournoux, 1988). The GLM was calculated with adjustment for autocorrelation following a global AR(2) model. The expected BOLD signal change for each predictor was modeled by convolving the event reference functions with a two-gamma hemodynamic response function (HRF). Potential modulations by stimulus valence and stimulus arousal were addressed using the same contrast models as in the EEG analysis (see section 2.4). A ROI analysis was performed, which included bilateral AMY, fusiform gyrus, STS, and occipital cortex (including lateral occipital areas) as regions of interest (see Ishai, 2008). These regions were defined on the basis of the Automated Anatomical Labeling (AAL) atlas as implemented in the Wake Forest University (WFU) PickAtlas software (Maldjian et al., 2003). Significant voxels were obtained through permutation analysis with 10,000 sign-flipping permutations along with threshold-free cluster enhancement (TFCE; Smith and Nichols, 2009), using the standard settings as implemented in the Permutation Analysis of Linear Models (PALM, version alpha102) toolbox (Winkler et al., 2014). The resulting statistical maps were corrected for multiple comparisons at a family-wise error rate of α = .05. The TFCE approach was chosen in order to include spatial neighboring information without the need for setting an arbitrary initial cluster-forming threshold (see Smith and Nichols, 2009, Salimi-Khorshidi et al., 2011).

2.6 EEG-informed fMRI analysis

In line with the integration-by-prediction method (see Debener, 2005, Mulert et al., 2008, Eichele, 2009), a wavelet-denoised (see Quiroga and Garcia, 2003), mean-centered ERP estimate of the earliest peak amplitude of emotional discrimination was used for the EEG-informed fMRI analysis. Since the cluster-based permutation test revealed a strong effect in the P1 range, single-trial P1 amplitudes were extracted for each participant using the above-described wavelet denoising method. P1 amplitudes were defined as the mean over a 20 ms time window around the individual P1 peak latency. The P1 latency was identified as the positive peak within 90 to 180 ms, individually for each participant. These wavelet-denoised single-trial amplitudes were entered as an additional regressor, besides a uniform stimulus predictor, into the GLM analysis and convolved with the two-gamma HRF across all experimental conditions to achieve higher statistical power (e.g., Vos et al., 2013, Lei et al., 2012). Thus, the resulting GLM encompassed a unified regressor with unit heights modelling stimulus onset and its single-trial EEG-weighted version. ROI specifications and statistical methods were analogous to the standard fMRI analysis described above.

3 Results


3.1 Behavioral Data

A 5 × 2 repeated measures ANOVA with expression and load as within-subjects factors confirmed a significant effect of load on reaction time (F(4,32) = 99.49, p < .001, ηp² = .76) and accuracy (F(4,32) = 614.58, p < .001, ηp² = .95). Participants responded faster (M = 589 ms, SE = 19 ms) and more accurately (M = 0.95, SE = 0.01) under low than under high load (M = 701 ms, SE = 25 ms; M = 0.74, SE = 0.01), thus confirming a successful load manipulation by the experimental task. No further effects reached significance (all ps > .076).

3.2 Electrophysiological Data

The cluster-based permutation test for the arousal model revealed a significant positive cluster at posterior electrodes between 76 and 128 ms (positive cluster: sum(t) = 396.73, p < .001; see Figure 1C for a visualization of the positive cluster). In the same time range, a corresponding significant negative cluster was observed at frontal electrodes (negative cluster: sum(t) = 392.15, p = .002). In the cluster-based permutation targeting valence, no significant clusters were found (maximal positive cluster: sum(t) = 67.00, p = .11; maximal negative cluster: sum(t) = -6.78, p = .40). Likewise, there were no significant clusters, neither for the arousal × load model (maximal positive cluster: sum(t) = 6.81, p = .445; no negative cluster found), nor for the valence × load model (no positive or negative cluster found).

As shown in Figure 1, the positive arousal cluster corresponded spatially and temporally to a P1 cluster. A post-hoc 5 × 2 repeated measures ANOVA on the averaged amplitudes within this cluster confirmed that only expression had a significant impact on the amplitudes, F(4,32) = 5.769, p < .001. There was no main effect of load (p = .711) and no interaction of load and expression (p = .496) within this cluster.

Figure 1. A. Example of experimental stimuli of the present study. B. Modelled expression effects based on stimulus ratings by an independent representative sample. C. Results of the cluster-based permutation analysis for the arousal-weighted ERP amplitudes (w_arousal). Topographies depict mean t-values in the time frame of the significant cluster, which was in the P1 range. The significant cluster comprised the electrodes marked in bold; contrast-weighted ERPs were averaged from all electrodes marked in bold. The dashed box marks the interval tested by cluster-based permutation; the yellow box denotes the time interval of the significant cluster. D. Waveform averages in the time range of the P1 measured at O1 and O2. The bar plots indicate the observed modulations by facial expressions within a fixed time interval around the mean peak latency (117 ms), separately for low and high load.

3.3 Conventional fMRI Data

The GLM modelling arousal effects revealed significant clusters in AMY (left peak x, y, z: -24, 4, -14, t = 3.49, p = .039, k = 2; right peak x, y, z: 24, 1, -17, t = 3.18, p = .033, k = 7), fusiform gyrus (left peak x, y, z: -39, -46, -8, t = 3.58, p = .023, k = 97; right peak x, y, z: 39, -46, -8, t = 4.48, p = .032, k = 11), right STS (peak x, y, z: 45, -46, 10, t = 3.45, p = .034, k = 42), and left middle occipital gyrus including the posterior lateral occipital cortex (first peak x, y, z: -36, -70, -11, second peak x, y, z: -24, -85, 10, third peak x, y, z: -24, -95, 13, t = 3.29, p = .025, k = 340; see Figure 2 for a visualization of selected regions). No other model (valence, arousal × load, valence × load) revealed significant activation clusters.

Figure 2. Significant activation clusters in amygdala and temporo-occipital brain regions reflecting BOLD responses to stimulus arousal (t-maps, voxel threshold α = 0.005, for visualization purposes only), and corresponding beta estimates for the significant clusters.


3.4 EEG-informed fMRI Data

To identify brain regions showing stimulus-driven BOLD responses that correlated with single-trial P1 amplitudes to facial expressions, wavelet-denoised single-trial estimates were entered as an additional regressor into the GLM analysis. Our results show significant clusters in left AMY (peak voxel x, y, z: -24, 2, -17, t = 3.08, p = .044, k = 5) and middle occipital gyrus, including lateral occipital areas and left angular gyrus (left cluster: first peak x, y, z: -21, -91, -2, second peak x, y, z: -33, -88, -5, t = 3.07, p = .030, k = 325; right cluster: first peak x, y, z: 33, -85, 10, second peak x, y, z: 33, -82, -5, t = 2.99, p = .041, k = 170; see Figure 3). These results suggest that activation in these regions was positively correlated with single-trial P1 amplitudes.

Figure 3. EEG-informed fMRI results. A unified regressor modelling face onsets and wavelet-denoised single-trial P1 estimates was introduced into the regression analysis and yielded significant activation in AMY and lateral posterior occipital cortex (t-maps, voxel threshold α = 0.005, for visualization purposes only).
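The single-trial regressor construction used in the EEG-informed analysis can be sketched as follows. This is a simplified illustration, not the pipeline used in the study: the HRF is an idealized double-gamma shape, and all timing values and parameter names are hypothetical.

```python
import math

def hrf(t, peak=6.0, under=16.0, ratio=6.0):
    # Idealized double-gamma hemodynamic response function (a sketch).
    g = lambda x, h: (x ** (h - 1) * math.exp(-x)) / math.gamma(h)
    return g(t, peak) - g(t, under) / ratio

def parametric_regressor(n_scans, tr, onsets, modulators):
    # Stick functions at face onsets, weighted by mean-centered single-trial
    # P1 estimates, convolved with the HRF at one sample per TR.
    m_mean = sum(modulators) / len(modulators)
    sticks = [0.0] * n_scans
    for onset, m in zip(onsets, modulators):
        sticks[int(round(onset / tr))] += m - m_mean
    kernel = [hrf(i * tr) for i in range(int(32 / tr))]  # ~32 s of HRF
    return [sum(sticks[s - k] * kernel[k]
                for k in range(min(s + 1, len(kernel))))
            for s in range(n_scans)]
```

Mean-centering the modulator decorrelates it from a plain onset regressor, so the resulting column captures trial-by-trial P1 variation rather than mere stimulus occurrence.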


4 Discussion

The present study examined the initial neural differentiation of task-irrelevant facial expressions of varying stimulus valence and stimulus arousal using simultaneous EEG-fMRI recordings. We provide evidence that early electrophysiological responses to facial expressions were independent of task load, reflected modulations by stimulus arousal rather than stimulus valence, and began to unfold 76 ms after stimulus onset, thus falling within the time range of the P1. Most importantly, early electrophysiological modulations by emotional expression were systematically associated with single-trial responses of the amygdala and posterior parts of the lateral occipital cortex, indicating that these brain regions were involved in the initial establishment of differential responses to emotional versus neutral facial expressions.

TE D

The latency at which emotional signals are encoded has been a matter of debate (see Olofsson et al., 2008, and Murray et al., 2014 for reviews of scalp-recorded and intracranial EEG findings). Influences of emotional content on early stages of face processing are not easy to detect, probably because they constitute subtle signal differences in the presence of strong noise. Thus, detection of such early modulations requires certain preconditions, such as high statistical power, and particular experimental conditions, such as emotionally salient distractors and a favorable arrangement of stimuli and task (e.g., Müller et al., 2008). In contrast to EEG studies that found no affective modulations in the P1 range, the current study did observe enhancements for arousing positive and negative expressions that were in foveal vision and fell within the focus of spatial attention. Using a large sample size and an analytical approach that modeled neural responses based on normative valence and arousal ratings, we managed to circumvent limitations even of our own previous research (see also Müller-Bardorff et al., 2016).

Our finding that the P1 window reveals increased responses to positive and negative expressions is in line with results from previous scalp-recorded (e.g., Pizzagalli et al., 1999, Pourtois et al., 2005, Muench et al., 2016) and intracranial EEG studies (e.g., Méndez-Bértolo et al., 2016). Furthermore, our results add to the converging evidence of early modulation by emotional content across a wide range of visual stimuli (e.g., Brosch et al., 2008) that reflects processing beyond emotion-irrelevant low-level factors (e.g., Liu et al., 2002, Pitcher, 2014, Muench et al., 2016). Note that this does not exclude the possibility that emotion-related low-level features in emotional facial expressions are relevant for early ERP effects. In our EEG analysis, stimulus arousal explained large proportions of the stimulus-related variance, even though inspection of Fig. 1 suggests that there might be interactions between stimulus valence and stimulus arousal. Importantly, neither the EEG data nor the fMRI data suggested general effects of stimulus valence when emotional intensity and corresponding arousal were taken into account (see also Müller-Bardorff et al., 2016, Lin et al., 2016).

With regard to the conventional fMRI analysis, our data showed arousal-dependent modulation in amygdala, FG, STS and lateral OG by task-irrelevant, emotionally significant expressions, which supports the role assumed for these regions in affective face perception (e.g., Ishai, 2008, Haxby and Gobbini, 2012, Pitcher, 2014, Lin et al., 2016) and modulatory effects of emotional intensity on amygdalar responses to affective faces, as previously demonstrated by Lin and colleagues (2016). The present findings on amygdalar responses to affective faces are consistent with a growing number of fMRI studies that argue for a broader functional role of the amygdala in the evaluation of overall stimulus significance across a wide range of visual stimuli, irrespective of stimulus valence (Fitzgerald et al., 2006, van der Gaag et al., 2007, Santos et al., 2011, Ousdal et al., 2012, Murray et al., 2014, Fastenrath et al., 2014, Lindquist et al., 2016). Notably, neither EEG nor fMRI data revealed interactions between stimulus- and task-driven effects, suggesting that modulations by emotional significance did not depend on the perceptual load of the focal task (see also Müller et al., 2008, Itier and Neath-Tavares, 2017). Whether a further increase of perceptual load or a non-simultaneous onset of task and face stimuli changes these outcomes remains to be investigated in future studies (e.g., Pessoa et al., 2002, Mothes-Lasch et al., 2011).

Although the standard fMRI analysis pointed towards several visual areas responding as a function of stimulus arousal, BOLD effects within FG and STS probably did not reflect initial evaluations of stimulus significance (e.g., Monroe et al., 2013, Nguyen and Cunnington, 2014). The EEG-informed fMRI analysis allowed us to integrate temporally anchored single-trial information, thus refining model estimations regarding BOLD effects related to early stimulus-driven electrophysiological alterations. This analysis revealed mainly brain regions operative at early stages of face processing, including the parts of the lateral occipital gyrus assumed to correspond to the OFA (see Pitcher et al., 2011). While several parts of the posterior occipital cortex should not reflect face-specific processing itself, the OFA is believed to reflect the initial stage of the cortical face-processing network, responsible for the extraction of featural and emotional information from faces (e.g., Pitcher et al., 2007, Dzhelyova et al., 2011, Atkinson and Adolphs, 2011) and the initial construction of face representations (e.g., Haxby et al., 2000, Steeves et al., 2006, Liu et al., 2010). Thus, the results of our EEG-informed fMRI analysis indicate that arousing emotional content begins to modulate face processing at the level of the OFA, thus affecting the earliest stage of face processing.

The EEG-informed fMRI analysis also revealed a significant cluster in the amygdala associated with single-trial occipital P1 amplitudes. This fits well with the presumed role of the amygdala as a relevance detector (e.g., Phelps and LeDoux, 2005) and putative source of top-down modulation of neural activity in visual brain areas (see Vuilleumier and Pourtois, 2007, Pessoa and Adolphs, 2010 for reviews). In this vein, an intriguing EEG study provided evidence for a causal relationship between P1 amplitudes and amygdala responses to emotionally significant facial stimuli: using fearful and neutral expressions, Rotshtein and colleagues (2010) demonstrated diminished P1 effects in amygdala-damaged patients, and an inverse relationship between neural differentiation in the P1 range and the extent of amygdalar damage. Our findings, showing a significant correlation between amygdala responses and ERP amplitudes in the time range of 76–128 ms, are in accordance with the EEG study of Rotshtein and colleagues. Note that differential amygdalar responding might not be restricted to the early time window of the P1, but may occur in multiple sweeps (Pourtois et al., 2010, Rotshtein et al., 2010, Huijgen et al., 2015).


To conclude, our results showed that electrophysiological responses to task-irrelevant, emotionally significant facial expressions occurred 76–126 ms after stimulus onset, did not depend on load conditions, and were systematically associated with stimulus-driven BOLD responses of the amygdala and lateral posterior occipital cortex on a single-trial basis. Our results are in line with a role for the amygdala as a fast-responding relevance detector that rapidly biases stimulus processing in the direction of the emotionally significant stimulus, at least partially irrespective of top-down attention.


References

Adolphs, R., 2002. Neural systems for recognizing emotion. Current Opinion in Neurobiology 12 (2), 169–177. 10.1016/S0959-4388(02)00301-X.

Adolphs, R., 2008. Fear, faces, and the human amygdala. Current Opinion in Neurobiology 18 (2), 166–172. 10.1016/j.conb.2008.06.006.

Allen, P.J., Josephs, O., Turner, R., 2000. A method for removing imaging artifact from continuous EEG recorded during functional MRI. NeuroImage 12 (2), 230–239. 10.1006/nimg.2000.0599.

Amaral, D.G., Capitanio, J.P., Jourdain, M., Mason, W.A., Mendoza, S.P., Prather, M., 2003. The amygdala: Is it an essential component of the neural network for social cognition? Neuropsychologia 41 (2), 235–240.

Arviv, O., Goldstein, A., Weeting, J.C., Becker, E.S., Lange, W.-G., Gilboa-Schechtman, E., 2015. Brain response during the M170 time interval is sensitive to socially relevant information. Neuropsychologia 78, 18–28. 10.1016/j.neuropsychologia.2015.09.030.

Atkinson, A.P., Adolphs, R., 2011. The neuropsychology of face perception: Beyond simple dissociations and functional selectivity. Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences 366 (1571), 1726–1738. 10.1098/rstb.2010.0349.

Balconi, M., Pozzoli, U., 2003. Face-selective processing and the effect of pleasant and unpleasant emotional expressions on ERP correlates. International Journal of Psychophysiology 49 (1), 67–74. 10.1016/S0167-8760(03)00081-3.

Batty, M., Taylor, M.J., 2003. Early processing of the six basic facial emotional expressions. Cognitive Brain Research 17 (3), 613–620. 10.1016/S0926-6410(03)00174-5.

Blau, V.C., Maurer, U., Tottenham, N., McCandliss, B.D., 2007. The face-specific N170 component is modulated by emotional facial expression. Behav Brain Funct 3 (1), 7. 10.1186/1744-9081-3-7.

Brosch, T., Sander, D., Pourtois, G., Scherer, K.R., 2008. Beyond fear: Rapid spatial orienting toward positive emotional stimuli. Psychol Sci 19 (4), 362–370. 10.1111/j.1467-9280.2008.02094.x.

Caharel, S., Courtay, N., Bernard, C., Lalonde, R., Rebaï, M., 2005. Familiarity and emotional expression influence an early stage of face processing: An electrophysiological study. Brain and Cognition 59 (1), 96–100. 10.1016/j.bandc.2005.05.005.

Calder, A.J., 1996. Facial emotion recognition after bilateral amygdala damage: Differentially severe impairment of fear. Cognitive Neuropsychology 13 (5), 699–745. 10.1080/026432996381890.

Calvo, M.G., Nummenmaa, L., Hyönä, J., 2008. Emotional scenes in peripheral vision: Selective orienting and gist processing, but not content identification. Emotion 8 (1), 68–80. 10.1037/1528-3542.8.1.68.

Campanella, S., Bourguignon, M., Peigneux, P., Metens, T., Nouali, M., Goldman, S., Verbanck, P., Tiège, X. de, 2013. BOLD response to deviant face detection informed by P300 event-related potential parameters: A simultaneous ERP-fMRI study. NeuroImage 71, 92–103. 10.1016/j.neuroimage.2012.12.077.

Carretié, L., 2014. Exogenous (automatic) attention to emotional stimuli: A review. Cognitive, Affective & Behavioral Neuroscience 14 (4), 1228–1258. 10.3758/s13415-014-0270-2.


Clark, V.P., Hillyard, S.A., 1996. Spatial selective attention affects early extrastriate but not striate components of the visual evoked potential. Journal of Cognitive Neuroscience 8 (5), 387–402. 10.1162/jocn.1996.8.5.387.


Costafreda, S.G., Brammer, M.J., David, A.S., Fu, C.H.Y., 2008. Predictors of amygdala activation during the processing of emotional stimuli: A meta-analysis of 385 PET and fMRI studies. Brain Research Reviews 58 (1), 57–70. 10.1016/j.brainresrev.2007.10.012.

Cunningham, W.A., Brosch, T., 2012. Motivational salience: Amygdala tuning from traits, needs, values, and goals. Curr Dir Psychol Sci 21 (1), 54–59. 10.1177/0963721411430832.


Day-Brown, J.D., Wei, H., Chomsung, R.D., Petry, H.M., Bickford, M.E., 2010. Pulvinar Projections to the Striatum and Amygdala in the Tree Shrew. Front. Neuroanat. 4. 10.3389/fnana.2010.00143.

Debener, S., 2005. Trial-by-trial coupling of concurrent electroencephalogram and functional magnetic resonance imaging identifies the dynamics of performance monitoring. Journal of Neuroscience 25 (50), 11730–11737. 10.1523/JNEUROSCI.3286-05.2005.

EP

Debener, S., Strobel, A., Sorger, B., Peters, J., Kranczioch, C., Engel, A.K., Goebel, R., 2007.

AC C

Improved quality of auditory event-related potentials recorded simultaneously with 3-T fMRI: Removal of the ballistocardiogram artefact. NeuroImage 34 (2), 587–597. 10.1016/j.neuroimage.2006.09.031. Deffke, I., Sander, T., Heidenreich, J., Sommer, W., Curio, G., Trahms, L., Lueschow, A., 2007. MEG/EEG sources of the 170-ms response to faces are co-localized in the fusiform gyrus. NeuroImage 35 (4), 1495–1501. 10.1016/j.neuroimage.2007.01.034. 27

EARLY BRAIN RESPONSES TO EMOTIONAL FACIAL EXPRESSIONS ACCEPTED MANUSCRIPT

Di Russo, F., Martínez, A., Sereno, M.I., Pitzalis, S., Hillyard, S.A., 2002. Cortical sources of the early components of the visual evoked potential. Hum. Brain Mapp. 15 (2), 95–111.

Dzhelyova, M.P., Ellison, A., Atkinson, A.P., 2011. Event-related repetitive TMS reveals distinct, critical roles for right OFA and bilateral posterior STS in judging the sex and trustworthiness of faces. Journal of Cognitive Neuroscience 23 (10), 2782–2796. 10.1162/jocn.2011.21604.


Eichele, T., 2009. Mining EEG–fMRI using independent component analysis. International Journal of Psychophysiology 73 (1), 53–61. 10.1016/j.ijpsycho.2008.12.018.

Eimer, M., Holmes, A., 2007. Event-related brain potential correlates of emotional face processing. Neuropsychologia 45 (1), 15–31. 10.1016/j.neuropsychologia.2006.04.022.

Fastenrath, M., Coynel, D., Spalek, K., Milnik, A., Gschwind, L., Roozendaal, B., Papassotiropoulos, A., de Quervain, D.J.F., 2014. Dynamic modulation of amygdala-hippocampal connectivity by emotional arousal. Journal of Neuroscience 34 (42), 13935–13947. 10.1523/JNEUROSCI.0786-14.2014.

Feinstein, J.S., Adolphs, R., Damasio, A., Tranel, D., 2011. The human amygdala and the induction and experience of fear. Current Biology 21 (1), 34–38. 10.1016/j.cub.2010.11.042.

Fitzgerald, D.A., Angstadt, M., Jelsone, L.M., Nathan, P.J., Phan, K.L., 2006. Beyond threat: Amygdala reactivity across multiple expressions of facial affect. NeuroImage 30 (4), 1441–1448. 10.1016/j.neuroimage.2005.11.003.

Fusar-Poli, P., Placentino, A., Carletti, F., Landi, P., Allen, P., Surguladze, S., Benedetti, F., Politi, P., 2009. Functional atlas of emotional faces processing: A voxel-based meta-analysis of 105 functional magnetic resonance imaging studies. Journal of Psychiatry & Neuroscience 34 (6), 418–432.

Ganel, T., Valyear, K.F., Goshen-Gottstein, Y., Goodale, M.A., 2005. The involvement of the "fusiform face area" in processing facial expression. Neuropsychologia 43 (11), 1645–1654. 10.1016/j.neuropsychologia.2005.01.012.

Groppe, D.M., Urbach, T.P., Kutas, M., 2011. Mass univariate analysis of event-related brain potentials/fields I: A critical tutorial review. Psychophysiology 48 (12), 1711–1725. 10.1111/j.1469-8986.2011.01273.x.

Halgren, E., Baudena, P., Heit, G., Clarke, M., Marinkovic, K., 1994. Spatio-temporal stages in face and word processing. 1. Depth recorded potentials in the human occipital and parietal lobes. Journal of Physiology-Paris 88 (1), 1–50. 10.1016/0928-4257(94)90092-2.

Hammerschmidt, W., Sennhenn-Reulen, H., Schacht, A., 2017. Associated motivational salience impacts early sensory processing of human faces. NeuroImage 156, 466–474. 10.1016/j.neuroimage.2017.04.032.

Hardee, J.E., Thompson, J.C., Puce, A., 2008. The left amygdala knows fear: Laterality in the amygdala response to fearful eyes. Soc Cogn Affect Neurosci 3 (1), 47–54. 10.1093/scan/nsn001.

Haxby, J., Gobbini, M.I., 2012. Distributed neural systems for face perception. In: Oxford Handbook of Face Perception. Oxford University Press.

Haxby, J.V., Hoffman, E.A., Gobbini, M.I., 2000. The distributed human neural system for face perception. Trends in Cognitive Sciences 4 (6), 223–233. 10.1016/S1364-6613(00)01482-0.


Hinojosa, J.A., Mercado, F., Carretié, L., 2015. N170 sensitivity to facial expression: A meta-analysis. Neuroscience & Biobehavioral Reviews 55, 498–509. 10.1016/j.neubiorev.2015.06.002.

Holmes, A., Nielsen, M.K., Green, S., 2008. Effects of anxiety on the processing of fearful and happy faces: An event-related potential study. Biological Psychology 77 (2), 159–173. 10.1016/j.biopsycho.2007.10.003.

Holmes, A., Vuilleumier, P., Eimer, M., 2003. The processing of emotional facial expression is gated by spatial attention: Evidence from event-related brain potentials. Cognitive Brain Research 16 (2), 174–184. 10.1016/S0926-6410(02)00268-9.

Huijgen, J., Dinkelacker, V., Lachat, F., Yahia-Cherif, L., El Karoui, I., Lemaréchal, J.-D., Adam, C., Hugueville, L., George, N., 2015. Amygdala processing of social cues from faces: An intracerebral EEG study. Soc Cogn Affect Neurosci 10 (11), 1568–1576. 10.1093/scan/nsv048.

Ishai, A., 2008. Let’s face it: It’s a cortical network. NeuroImage 40 (2), 415–419. 10.1016/j.neuroimage.2007.10.040.

Itier, R.J., Neath-Tavares, K.N., 2017. Effects of task demands on the early neural processing of fearful and happy facial expressions. Brain Research 1663, 38–50. 10.1016/j.brainres.2017.03.013.

Itier, R.J., Taylor, M.J., 2004. Source analysis of the N170 to faces and objects. Neuroreport 15 (8).

Kim, M.J., Mattek, A.M., Bennett, R.H., Solomon, K.M., Shin, J., Whalen, P.J., 2017. Human amygdala tracks a feature-based valence signal embedded within the facial expression of surprise. The Journal of Neuroscience 37 (39), 9510–9518. 10.1523/JNEUROSCI.1375-17.2017.

Krolak-Salmon, P., Hénaff, M.-A., Vighetto, A., Bertrand, O., Mauguière, F., 2004. Early amygdala reaction to fear spreading in occipital, temporal, and frontal cortex. Neuron 42 (4), 665–676. 10.1016/S0896-6273(04)00264-8.

Latinus, M., Taylor, M.J., 2006. Face processing stages: Impact of difficulty and the separation of effects. Brain Research 1123 (1), 179–187. 10.1016/j.brainres.2006.09.031.

Lavie, N., Ro, T., Russell, C., 2003. The role of perceptual load in processing distractor faces. Psychol Sci 14 (5), 510–515. 10.1111/1467-9280.03453.

Lei, X., Valdes-Sosa, P.A., Yao, D., 2012. EEG/fMRI fusion based on independent component analysis: Integration of data-driven and model-driven methods. J. Integr. Neurosci. 11 (3), 313–337. 10.1142/S0219635212500203.

Leppänen, J.M., Kauppinen, P., Peltola, M.J., Hietanen, J.K., 2007. Differential electrocortical responses to increasing intensities of fearful and happy emotional expressions. Brain Research 1166, 103–109. 10.1016/j.brainres.2007.06.060.

Lin, H., Mueller-Bardorff, M., Mothes-Lasch, M., Buff, C., Brinkmann, L., Miltner, W.H.R., Straube, T., 2016. Effects of intensity of facial expressions on amygdalar activation independently of valence. Front. Hum. Neurosci. 10, 646. 10.3389/fnhum.2016.00646.


Lindquist, K.A., Satpute, A.B., Wager, T.D., Weber, J., Barrett, L.F., 2016. The Brain Basis of Positive and Negative Affect: Evidence from a Meta-Analysis of the Human Neuroimaging Literature. Cerebral Cortex 26 (5), 1910–1922. 10.1093/cercor/bhv001.

Liu, J., Harris, A., Kanwisher, N., 2002. Stages of processing in face perception: An MEG study. Nat Neurosci 5 (9), 910–916. 10.1038/nn909.

Liu, J., Harris, A., Kanwisher, N., 2010. Perception of face parts and face configurations: An fMRI study. Journal of Cognitive Neuroscience 22 (1), 203–211. 10.1162/jocn.2009.21203.

Maldjian, J.A., Laurienti, P.J., Kraft, R.A., Burdette, J.H., 2003. An automated method for neuroanatomic and cytoarchitectonic atlas-based interrogation of fMRI data sets. NeuroImage 19 (3), 1233–1239. 10.1016/S1053-8119(03)00169-1.

Maris, E., Oostenveld, R., 2007. Nonparametric statistical testing of EEG- and MEG-data. Journal of Neuroscience Methods 164 (1), 177–190. 10.1016/j.jneumeth.2007.03.024.

Meletti, S., Cantalupo, G., Benuzzi, F., Mai, R., Tassi, L., Gasparini, E., Tassinari, C.A., Nichelli, P., 2012. Fear and happiness in the eyes: An intra-cerebral event-related potential study from the human amygdala. Neuropsychologia 50 (1), 44–54. 10.1016/j.neuropsychologia.2011.10.020.

Mende-Siedlecki, P., Said, C.P., Todorov, A., 2013. The social evaluation of faces: A meta-analysis of functional neuroimaging studies. Soc Cogn Affect Neurosci 8 (3), 285–299. 10.1093/scan/nsr090.


Méndez-Bértolo, C., Moratti, S., Toledano, R., Lopez-Sosa, F., Martínez-Alvarez, R., Mah, Y.H., Vuilleumier, P., Gil-Nagel, A., Strange, B.A., 2016. A fast pathway for fear in human amygdala. Nat Neurosci 19 (8), 1041–1049. 10.1038/nn.4324.


Monroe, J.F., Griffin, M., Pinkham, A., Loughead, J., Gur, R.C., Roberts, T.P.L., Christopher Edgar, J., 2013. The fusiform response to faces: Explicit versus implicit processing of emotion. Hum. Brain Mapp. 34 (1), 1–11. 10.1002/hbm.21406.

Mothes-Lasch, M., Mentzel, H.-J., Miltner, W.H.R., Straube, T., 2011. Visual attention modulates brain activation to angry voices. The Journal of Neuroscience 31 (26), 9594–9598. 10.1523/JNEUROSCI.6665-10.2011.

Muench, H.M., Westermann, S., Pizzagalli, D.A., Hofmann, S.G., Mueller, E.M., 2016. Self-relevant threat contexts enhance early processing of fear-conditioned faces. Biological Psychology 121, 194–202. 10.1016/j.biopsycho.2016.07.017.

Mulert, C., Seifert, C., Leicht, G., Kirsch, V., Ertl, M., Karch, S., Moosmann, M., Lutz, J., Möller, H.-J., Hegerl, U., Pogarell, O., Jäger, L., 2008. Single-trial coupling of EEG and fMRI reveals the involvement of early anterior cingulate cortex activation in effortful decision making. NeuroImage 42 (1), 158–168. 10.1016/j.neuroimage.2008.04.236.

Müller, M.M., Andersen, S.K., Keil, A., 2008. Time course of competition for visual processing resources between emotional pictures and foreground task. Cerebral Cortex 18 (8), 1892–1899. 10.1093/cercor/bhm215.


Müller, M.M., Hübner, R., 2002. Can the spotlight of attention be shaped like a doughnut? Evidence from steady-state visual evoked potentials. Psychol Sci 13 (2), 119–124. 10.1111/1467-9280.00422.

Müller-Bardorff, M., Schulz, C., Peterburs, J., Bruchmann, M., Mothes-Lasch, M., Miltner, W., Straube, T., 2016. Effects of emotional intensity under perceptual load: An event-related potentials (ERPs) study. Biological Psychology 117, 141–149. 10.1016/j.biopsycho.2016.03.006.

Murray, R.J., Brosch, T., Sander, D., 2014. The functional profile of the human amygdala in affective processing: Insights from intracranial recordings. Cortex 60, 10–33. 10.1016/j.cortex.2014.06.010.

Nguyen, V.T., Cunnington, R., 2014. The superior temporal sulcus and the N170 during face processing: Single trial analysis of concurrent EEG–fMRI. NeuroImage 86, 492–502. 10.1016/j.neuroimage.2013.10.047.

Olofsson, J.K., Nordin, S., Sequeira, H., Polich, J., 2008. Affective picture processing: An integrative review of ERP findings. Biological Psychology 77 (3), 247–265. 10.1016/j.biopsycho.2007.11.006.

Oostenveld, R., Fries, P., Maris, E., Schoffelen, J.-M., 2011. FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data. Computational Intelligence and Neuroscience 2011, 156869. 10.1155/2011/156869.

Ousdal, O.T., Reckless, G.E., Server, A., Andreassen, O.A., Jensen, J., 2012. Effect of relevance on amygdala activation and association with the ventral striatum. NeuroImage 62 (1), 95–101. 10.1016/j.neuroimage.2012.04.035.


Oya, H., Kawasaki, H., Howard, M.A., Adolphs, R., 2002. Electrophysiological responses in the human amygdala discriminate emotion categories of complex visual stimuli. The Journal of Neuroscience 22 (21), 9502–9512.

Pessoa, L., 2010. Emotion and attention effects: is it all a matter of timing? Not yet. Front. Hum. Neurosci. 4. 10.3389/fnhum.2010.00172.

Pessoa, L., Adolphs, R., 2010. Emotion processing and the amygdala: From a 'low road' to 'many roads' of evaluating biological significance. Nat Rev Neurosci 11 (11), 773–783. 10.1038/nrn2920.

Pessoa, L., McKenna, M., Gutierrez, E., Ungerleider, L.G., 2002. Neural processing of emotional faces requires attention. Proc Natl Acad Sci USA 99 (17), 11458–11463. 10.1073/pnas.172403899.


Phelps, E.A., LeDoux, J.E., 2005. Contributions of the Amygdala to Emotion Processing: From Animal Models to Human Behavior. Neuron 48 (2), 175–187. 10.1016/j.neuron.2005.09.025.

Pitcher, D., Walsh, V., Duchaine, B., 2011. The role of the occipital face area in the cortical face perception network. Exp Brain Res 209 (4), 481–493. 10.1007/s00221-011-2579-1.

Pitcher, D., 2014. Facial expression recognition takes longer in the posterior superior temporal sulcus than in the occipital face area. Journal of Neuroscience 34 (27), 9173–9177. 10.1523/JNEUROSCI.5038-13.2014.


Pitcher, D., Walsh, V., Yovel, G., Duchaine, B., 2007. TMS evidence for the involvement of the right occipital face area in early face processing. Current Biology 17 (18), 1568–1573. 10.1016/j.cub.2007.07.063.

Pizzagalli, D., Regard, M., Lehmann, D., 1999. Rapid emotional face processing in the human right and left brain hemispheres: An ERP study. Neuroreport 10 (13), 2691–2698.

Pourtois, G., Dan, E.S., Grandjean, D., Sander, D., Vuilleumier, P., 2005. Enhanced extrastriate visual response to bandpass spatial frequency filtered fearful faces: Time course and topographic evoked-potentials mapping. Hum. Brain Mapp. 26 (1), 65–79. 10.1002/hbm.20130.

Pourtois, G., Spinelli, L., Seeck, M., Vuilleumier, P., 2010. Temporal precedence of emotion over attention modulations in the lateral amygdala: Intracranial ERP evidence from a patient with temporal lobe epilepsy. Cognitive, Affective, & Behavioral Neuroscience 10 (1), 83–93. 10.3758/CABN.10.1.83.

Quiroga, R.Q., Garcia, H., 2003. Single-trial event-related potentials with wavelet denoising. Clinical Neurophysiology 114 (2), 376–390. 10.1016/S1388-2457(02)00365-6.

Reinl, M., Bartels, A., 2014. Face processing regions are sensitive to distinct aspects of temporal sequence in facial dynamics. NeuroImage 102 Pt 2, 407–415. 10.1016/j.neuroimage.2014.08.011.

Rotshtein, P., Richardson, M.P., Winston, J.S., Kiebel, S.J., Vuilleumier, P., Eimer, M., Driver, J., Dolan, R.J., 2010. Amygdala damage affects event-related potentials for fearful faces at specific time windows. Hum. Brain Mapp. 31 (7), 1089–1105. 10.1002/hbm.20921.


Rutishauser, U., Mamelak, A.N., Adolphs, R., 2015. The primate amygdala in social perception - insights from electrophysiological recordings and stimulation. Trends in Neurosciences 38 (5), 295–306. 10.1016/j.tins.2015.03.001.

Sabatinelli, D., Lang, P.J., Keil, A., Bradley, M.M., 2007. Emotional perception: Correlation of functional MRI and event-related potentials. Cerebral Cortex 17, 1085–1091. 10.1093/cercor/bhl017.

Salimi-Khorshidi, G., Smith, S.M., Nichols, T.E., 2011. Adjusting the effect of nonstationarity in cluster-based and TFCE inference. NeuroImage 54 (3), 2006–2019. 10.1016/j.neuroimage.2010.09.088.

Santos, A., Mier, D., Kirsch, P., Meyer-Lindenberg, A., 2011. Evidence for a general face salience signal in human amygdala. NeuroImage 54 (4), 3111–3116. 10.1016/j.neuroimage.2010.11.024.

Schönwald, L.I., Müller, M.M., 2014. Slow biasing of processing resources in early visual cortex is preceded by emotional cue extraction in emotion-attention competition. Hum. Brain Mapp. 35 (4), 1477–1490. 10.1002/hbm.22267.

Schupp, H.T., Ohman, A., Junghöfer, M., Weike, A.I., Stockburger, J., Hamm, A.O., 2004. The facilitated processing of threatening faces: An ERP analysis. Emotion 4 (2), 189–200. 10.1037/1528-3542.4.2.189.

Smith, S.M., Nichols, T.E., 2009. Threshold-free cluster enhancement: Addressing problems of smoothing, threshold dependence and localisation in cluster inference. NeuroImage 44 (1), 83–98. 10.1016/j.neuroimage.2008.03.061.

Steeves, J.K.E., Culham, J.C., Duchaine, B.C., Pratesi, C.C., Valyear, K.F., Schindler, I., Humphrey, G.K., Milner, A.D., Goodale, M.A., 2006. The fusiform face area is not sufficient for face recognition: Evidence from a patient with dense prosopagnosia and no occipital face area. Neuropsychologia 44 (4), 594–609. 10.1016/j.neuropsychologia.2005.06.013.


Sugase, Y., Yamane, S., Ueno, S., Kawano, K., 1999. Global and fine information coded by single neurons in the temporal visual cortex. Nature 400 (6747), 869–873. 10.1038/23703.

Trautmann-Lengsfeld, S.A., Domínguez-Borràs, J., Escera, C., Herrmann, M., Fehr, T., 2013. The perception of dynamic and static facial expressions of happiness and disgust investigated by ERPs and fMRI constrained source analysis. PLOS ONE 8 (6), e66997. 10.1371/journal.pone.0066997.

van der Gaag, C., Minderaa, R.B., Keysers, C., 2007. The BOLD signal in the amygdala does not differentiate between dynamic facial expressions. Soc. Cogn. Affect. Neurosci. 2 (2), 93–103. 10.1093/scan/nsm002.


de Vos, M., Zink, R., Hunyadi, B., Mijovic, B., van Huffel, S., Debener, S., 2013. The quest for single trial correlations in multimodal EEG-fMRI data. In: 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 6027–6030. 10.1109/EMBC.2013.6610926.


Vuilleumier, P., Driver, J., 2007. Modulation of visual processing by attention and emotion: Windows on causal interactions between human brain regions. Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences 362 (1481), 837–855. 10.1098/rstb.2007.2092.

Vuilleumier, P., Pourtois, G., 2007. Distributed and interactive brain mechanisms during emotion face perception: Evidence from functional neuroimaging. Neuropsychologia 45 (1), 174–194. 10.1016/j.neuropsychologia.2006.06.003.


Whalen, P.J., Raila, H., Bennett, R., Mattek, A., Brown, A., Taylor, J., van Tieghem, M., Tanner, A., Miner, M., Palmer, A., 2013. Neuroscience and facial expressions of emotion: The role of amygdala–prefrontal interactions. Emotion Review 5 (1), 78–83. 10.1177/1754073912457231.

Winkler, A.M., Ridgway, G.R., Webster, M.A., Smith, S.M., Nichols, T.E., 2014. Permutation inference for the general linear model. NeuroImage 92, 381–397. 10.1016/j.neuroimage.2014.01.060.

Wirsich, J., Bénar, C., Ranjeva, J.-P., Descoins, M., Soulier, E., Le Troter, A., Confort-Gouny, S., Liégeois-Chauvel, C., Guye, M., 2014. Single-trial EEG-informed fMRI reveals spatial dependency of BOLD signal on early and late IC-ERP amplitudes during face recognition. NeuroImage 100, 325–336. 10.1016/j.neuroimage.2014.05.075.

Woldorff, M.G., Liotti, M., Seabolt, M., Busse, L., Lancaster, J.L., Fox, P.T., 2002. The temporal dynamics of the effects in occipital cortex of visual-spatial selective attention. Cognitive Brain Research 15 (1), 1–15. 10.1016/S0926-6410(02)00212-4.

Yang, J., Bellgowan, P.S.F., Martin, A., 2012. Threat, domain-specificity and the human amygdala. Neuropsychologia 50 (11), 2566–2572. 10.1016/j.neuropsychologia.2012.07.001.
