
Early averted gaze processing in the right Fusiform Gyrus: an EEG source imaging study

Cristina Berchio*1, Tonia A. Rihs1, Camille Piguet1,2, Alexandre G. Dayer1,2, Jean-Michel Aubry2, Christoph M. Michel1,3


(1) Department of Fundamental Neurosciences, University of Geneva, Geneva, Switzerland

(2) Department of Mental Health and Psychiatry, University Hospitals of Geneva, Switzerland

(3) Center for Biomedical Imaging (CIBM), Lausanne and Geneva, Switzerland

*Correspondence:

Cristina Berchio, PhD, Functional Brain Mapping Laboratory, Dept. of Fundamental Neurosciences, University Medical School, Campus Biotech, Chemin des Mines 9, 1202 Geneva, Switzerland

*[email protected]


Highlights
► We investigated the spatio-temporal dynamics of implicit gaze perception.
► Faces with averted gaze were detected more accurately than faces with direct gaze.
► Averted gaze induced an increase in the P100 amplitude.
► Early gaze perception activated the right fusiform gyrus and the orbitofrontal cortex.
► Gaze processing may affect behavior in an implicit way.
► This study sheds light on a potential link between early top-down cortical effects and implicit gaze recognition.


Abstract

Humans are able to categorize face properties with impressively short latencies. Nevertheless, the latency at which gaze recognition occurs is still a matter of debate. Through spatiotemporal analysis of high-density event-related potentials (ERPs), we investigated the brain activity underlying the ability to spontaneously and quickly process gaze. We presented neutral faces with direct and averted gaze in a picture-matching paradigm in which subjects had to detect repetitions of identical faces while gaze was implicitly manipulated. The results indicate that faces with averted gaze were better discriminated than faces with direct gaze and evoked stronger P100 amplitudes, localized to the right fusiform gyrus. In contrast, direct gaze induced stronger activation in the orbital frontal gyrus at this latency. Later in time, at the beginning of the N170 component, direct gaze induced changes in scalp topography, with a stronger activation in the right medial temporal gyrus. The location of these differential activations for direct vs. averted gaze further supports the view that faces with averted gaze are perceived as less rewarding than faces with direct gaze. We additionally found differential ERP responses between repeated and novel faces as early as 50 ms, thereby replicating earlier reports of very fast detection of mnestic aspects of stimuli. Together, these results suggest an early dissociation between implicit gaze detection and explicit identity processing.

Keywords:

Gaze processing, face memory, evoked potentials, topographic EEG analyses, EEG source imaging

1. Introduction

From birth, humans are predisposed to detect biological motion and social stimuli (Simion et al., 2011), and gaze direction in particular (Farroni et al., 2002). Gaze detection has important cognitive and emotional implications for our social life (George & Conty, 2008; Itier & Batty, 2009). There is evidence that direct gaze enhances cognitive processing, such as memory for faces (Hood et al., 2003; Vuilleumier et al., 2005; Nakashima et al., 2012; Sessa & Dalmaso, 2015), and attention (Frischen et al., 2007). Moreover, gaze can communicate emotions: previous studies have shown that direct gaze augments the perception of approach-oriented emotions (i.e. anger), while averted gaze rather evokes avoidance-oriented emotions (i.e. fear) (Adams & Kleck, 2003a,b; Adams & Kleck, 2005; Adams et al., 2005; Sander et al., 2007; Adams et al., 2012; Benton, 2010). Interestingly, gaze direction also affects facial attractiveness, and it has been shown that faces with direct gaze are perceived as more attractive than faces with averted gaze (Strick & Holland, 2008). Furthermore, several fMRI studies have shown that viewing direct and averted gaze involves brain networks similar to those involved in the processing of emotional faces (see George et al., 2001; Hoffman & Haxby, 2000; Adams & Kleck, 2003b; Adams et al., 2012). Since several studies have shown that perceived direct gaze enhances cognitive and emotional processing, it has been proposed that direct gaze is quickly detected by a specific subcortical route that directly affects cortical areas (see Senju & Johnson, 2009). Nevertheless, the contradictory findings on the emotional and cognitive impact of direct and averted gaze preclude drawing any final conclusion about the role of gaze direction in face processing and its neural substrates (see Itier & Batty, 2009; Carlin & Calder, 2013).

Few studies have investigated neutral gaze processing. Interestingly, neutral faces have been shown to convey emotional information (see Lee et al., 2008), and gaze is considered an informative emotional stimulus, even in faces with neutral expressions (Adams & Kleck, 2003a). However, this effect is more pronounced in psychiatric conditions characterized by emotional dysregulation, where enhanced activation patterns were found in response to neutral faces (i.e. Leppänen et al., 2004; Hall et al., 2008; Brotman et al., 2010) and neutral gaze (Schmitz et al., 2012). Thus, exploring the processing of neutral stimuli should shed more light on the basic mechanisms behind face recognition based on gaze.

With respect to the temporal dynamics of gaze processing, the latency at which gaze detection occurs is still a matter of debate (see Itier & Batty, 2009). Traditionally, event-related potential (ERP) studies have shown that face perception induces a specific occipito-temporal response peaking around 170 ms (N170) (Bentin et al., 1996; George et al., 1996; Rossion et al., 1999). This component is also sensitive to eyes segregated from faces (i.e. Bentin et al., 1996; Taylor et al., 2001; Itier et al., 2006; Rousselet et al., 2014). Nevertheless, although several EEG studies have found modulation of the N170 by gaze direction (i.e. Watanabe et al., 2002; Itier et al., 2007; Conty et al., 2007), others have failed to find significant effects (Klucharev & Sams, 2004; Grice et al., 2005; Schweinberger et al., 2007). Furthermore, there are contradictory suggestions regarding the brain sources of the N170 component during face processing (see Itier & Batty, 2009): some EEG studies have demonstrated fusiform gyrus activation (Deffke et al., 2007; Dalrymple et al., 2011; Sadeh et al., 2010), while others have found activation in the superior temporal sulcus (STS) (i.e. Conty et al., 2007; Nguyen & Cunnington, 2014). Furthermore, magnetoencephalography (MEG) studies have provided evidence for the involvement of occipito-temporal regions related to the M170 (the face-selective MEG component; see Smith et al., 2009; Prieto et al., 2011). Finally, intracranial EEG recordings provided evidence for posterior occipital and lateral temporal responses to faces (Engell & McCarthy, 2011; Jonas et al., 2014). Gaze direction also seems to have an impact on the P100 ERP component (see Doi et al., 2007; Schyns et al., 2007). Initially, the P100 was thought to originate mainly from primary and secondary visual cortex (Haimovic & Pedley, 1982; Ducati et al., 1988). However, recent studies support the view that affective stimuli are able to modulate the P100 amplitude, thus pointing to the involvement of a more extended brain network that generates the P100 (Pizzagalli et al., 1999; Batty & Taylor, 2003; Eger et al., 2003; Streit et al., 2003; and see Vuilleumier & Pourtois, 2007; Vuilleumier, 2015).

Memory processing is a powerful tool to investigate face recognition (see Adolphs, 2002), and it is also suited to clarify unsolved questions related to gaze processing. Behavioral studies focusing on face identity categorization have reported better memory performance for faces with direct gaze than with averted gaze (Hood et al., 2003; Nakashima et al., 2012). In a recent study, Artuso and coworkers (2012) showed evidence of an interaction between emotion and gaze on working memory (WM): participants recognized congruent combinations of gaze and emotion (i.e. joy with direct gaze and fear with averted gaze) faster than incongruent combinations. However, to the best of our knowledge, no studies have investigated the spatio-temporal dynamics of gaze processing during an implicit face recognition task.

In this high-density EEG study we aimed to examine whether, when and where in the brain gaze direction modulates face recognition. Due to the high ecological and social value of eye gaze, we presented neutral faces with direct and averted gaze in a picture-matching paradigm.

Given the emotional properties of direct and averted gaze (Adams & Kleck, 2003a,b; Adams & Kleck, 2005; Adams et al., 2005; Sander et al., 2007; Adams et al., 2012; Benton, 2010), the investigation of neutral gaze processing during a picture-matching task allows us to clarify the interaction between gaze and emotion on face recognition. Instead of requiring the subjects to discriminate gaze (which might trigger intentional strategies), the participants focused on the memory task while gaze was manipulated implicitly, allowing us to examine the spontaneous effects that gaze direction can produce during recognition of face images. We used topographic analyses and EEG source imaging methods (Michel & Murray, 2012) in order to characterize the spatio-temporal neuronal dynamics of face processing at the level of electrical field maps and their sources in the brain (Pourtois et al., 2008; Rellecke et al., 2013).

2. Material and Methods

2.1 Participants

Nineteen right-handed volunteers participated in this study; data from five participants were excluded due to excessive EEG artifacts. The final sample included 9 males and 5 females (age: median = 24.5 years; min = 19, max = 31). All participants were Caucasian and had a university-level education. For inclusion into the study, participants were required to be in good physical health with no current medical illnesses, to be medication-free, to have no psychiatric or neurological history, and to be right-handed.


The study protocol was approved by the Ethical Committee for Human Research of the Geneva University Hospital, Switzerland.

2.2 Stimuli

A total of 67 different faces with a neutral expression were selected from the Radboud Faces Database (Langner et al., 2011), the database kindly provided by Dr. Nathalie George (George et al., 2001), the NIMH-ChEFS Picture Set (Egger et al., 2011), and the Amsterdam Dynamic Facial Expression Set (ADFES, still pictures; van der Schalk et al., 2011). Multiple databases were included in order to increase the number of different actors and to avoid any mnemonic bias. Faces were presented with direct gaze (34 stimuli) and averted gaze (33 stimuli): 17 actors had right-averted gaze and 16 left-averted gaze. The stimuli were balanced for gender and gaze. Each identity was presented approximately four times during the experiment. In order to normalize the stimuli obtained from the different databases, they were processed with Adobe Photoshop CS5 (version 12.0 x64): the pictures were cropped to the central portion of the face, converted to grayscale, normalized for luminance and contrast, and resized to the same dimensions (400 × 595 pixels). All stimuli were presented on a gray background (RGB: 192, 192, 192).
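For illustration, the sketch below shows this kind of normalization pipeline in Python (Pillow/NumPy); the target luminance and contrast values and the file names are assumptions, not the exact Photoshop settings used for the stimuli.

    # Sketch of the stimulus normalization described above (grayscale,
    # matched luminance/contrast, fixed size). Target mean/std and file
    # names are illustrative assumptions.
    import numpy as np
    from PIL import Image

    TARGET_SIZE = (400, 595)                  # width x height in pixels
    TARGET_MEAN, TARGET_STD = 128.0, 40.0     # assumed luminance/contrast targets

    def normalize_face(path_in, path_out):
        img = Image.open(path_in).convert("L")          # grayscale
        img = img.resize(TARGET_SIZE, Image.LANCZOS)    # uniform size
        arr = np.asarray(img, dtype=float)
        # match mean luminance and contrast across databases
        arr = (arr - arr.mean()) / (arr.std() + 1e-9) * TARGET_STD + TARGET_MEAN
        arr = np.clip(arr, 0, 255).astype(np.uint8)
        Image.fromarray(arr).save(path_out)

    normalize_face("face_direct_01.png", "face_direct_01_norm.png")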

2.3 Paradigm


All subjects completed a picture-matching task in which images of neutral faces with either direct or averted gaze were equally distributed (Figure 1).

(Please insert Figure 1 around here)

The repeated faces were strictly identical (identity and gaze direction). Faces were presented for 1000 ms and the interval between faces was 2000 ms (Figure 2).

(Please insert Figure 2 around here)

The recording session comprised three 6-min blocks presented in a pseudo-randomized order. Faces were presented pseudo-randomly, with a total of 60 targets for the direct gaze condition (60 'new' and 60 'repeated') and 60 targets for the averted gaze condition (60 'new' and 60 'repeated'), and a ratio of target faces ('new', 'repeated') to non-target faces (distractors) of 40:60 per block. Participants were required to press a key labelled 'same' (down arrow) if the face was exactly the same as the face presented two faces before, or a key labelled 'different' (up arrow) if the face was different from the face presented two faces before. Participants were asked to use the middle finger of their right hand and to respond as quickly as possible. In order to avoid intentional strategies (see George et al., 2001) and to examine the spontaneous effects that gaze direction can produce during recognition of face images, there was no indication about gaze in the task instructions.

Before the experimental session, participants practiced 10 trials (the stimuli presented during the training were not used in the experimental session). To investigate the subjects' judgments of the presented faces, participants were asked after the EEG session to rate the perceived emotional disposition of the faces. Ten faces (5 with averted gaze, 5 with direct gaze) were selected randomly from the 2-back task, and the stimuli were rated on two 7-point Likert scales (from 0 = not at all to 7 = extremely): a) "How hostile is the face?"; b) "How fearful is the face?". Each face was displayed for 3000 ms, followed by a 2000 ms inter-trial interval in which participants were requested to answer as quickly and as accurately as possible. All visual stimuli were presented with E-Prime (2.0), at a viewing distance of 60 cm from the screen.

2.4 Behavioral analysis: accuracy, reaction time, rating

Behavioral performance was evaluated in terms of accuracy and reaction time. Signal detection theory indices were used to investigate behavioral accuracy (see Green & Swets, 1966; Macmillan & Creelman, 1990). For each participant and condition we calculated: a) the Hit Rate (HR; matching trials on which participants correctly responded "yes"), b) the Miss Rate (MR), c) the False Alarm Rate (FAR; non-matching trials on which participants incorrectly responded "yes"), and d) the Correct Rejection Rate (CRR). HR, MR, FAR and CRR were used to calculate d prime (d'), with standard errors and confidence limits according to the formulae given in Macmillan and Creelman (2004). The d' measure reflects how difficult or easy it is to detect a target stimulus presented in a background of events. For example, in our task, higher d' values for 'new' faces indicate a better sensitivity to differentiate 'new' faces from 'repeated' faces; higher d' values for direct gaze indicate a better sensitivity to discriminate faces with direct than with averted gaze.

The significance of Hit Rates (HR), d' values and False Alarms (FA) was tested using non-parametric Friedman ANOVAs. Post hoc analyses with Wilcoxon signed-rank tests were conducted with Bonferroni correction. Median reaction times (RTs) were analyzed for HR trials. A repeated-measures ANOVA was performed on RTs, with Gaze (direct vs. averted) and memory Load ('new' vs. 'repeated') as within-subject factors. Behavioral ratings for the stimulus faces were examined using a repeated-measures ANOVA, with Gaze (direct vs. averted) and Emotion (hostile vs. afraid) as within-subject factors. Alpha levels were set to p < 0.05 for all ANOVAs, and adjustments for multiple comparisons (Bonferroni) were applied to all statistically significant effects and interactions.
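As an illustration of these indices, the following Python sketch computes HR, FAR and d' = z(HR) − z(FAR) from raw trial counts; the log-linear correction for extreme rates is one common convention, not necessarily the one prescribed by Macmillan and Creelman's formulae.

    # Sketch of the signal-detection indices: HR, FAR and d' = z(HR) - z(FAR).
    # The log-linear correction for rates of 0 or 1 is an assumption.
    from scipy.stats import norm

    def d_prime(hits, misses, false_alarms, correct_rejections):
        # correction avoids infinite z-scores for perfect hit/false-alarm rates
        hr = (hits + 0.5) / (hits + misses + 1.0)
        far = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
        return norm.ppf(hr) - norm.ppf(far)

    # e.g. 50 hits, 10 misses, 12 false alarms, 48 correct rejections (made-up counts)
    print(d_prime(50, 10, 12, 48))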

2.5 EEG data acquisition and ERP pre-processing

The EEG was acquired with a 256-channel system (Electrical Geodesics Inc.) in a sound-isolated Faraday cage, using a sampling rate of 1000 Hz with Cz as the recording reference. Impedances were kept below 30 kΩ. The ERPs were pre-processed with the Cartool 3.53 software by Denis Brunet (http://www.fbmlab.com/cartool-software/). Data were band-pass filtered between 0.3 Hz and 40 Hz using a non-causal filter (2nd-order Butterworth low- and high-pass, −12 dB/octave roll-off, computed linearly forward and backward, eliminating the phase shift, and with poles calculated each time for the desired cut-off frequency). For the early differential effects observed in this study, an additional analysis was performed with a causal filter with similar properties to avoid back-projection of later effects onto earlier time points (Rousselet, 2012).
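The following Python/SciPy sketch illustrates the difference between the two filtering schemes (zero-phase forward-backward vs. single forward pass) with a Butterworth band-pass at the cut-offs given above; it is a simplified stand-in for the Cartool filtering, and the data array is dummy.

    # Sketch of the two filtering schemes: a zero-phase (forward-backward)
    # 0.3-40 Hz Butterworth band-pass, and a causal (forward-only) variant
    # used to check early effects. Not the Cartool implementation.
    import numpy as np
    from scipy.signal import butter, filtfilt, lfilter

    FS = 1000.0                                    # sampling rate in Hz
    b, a = butter(2, [0.3, 40.0], btype="bandpass", fs=FS)

    def filter_noncausal(eeg):                     # eeg: (n_channels, n_samples)
        return filtfilt(b, a, eeg, axis=1)         # forward-backward, no phase shift

    def filter_causal(eeg):
        return lfilter(b, a, eeg, axis=1)          # single forward pass

    eeg = np.random.randn(204, 5000)               # dummy data: 204 channels x 5 s
    clean = filter_noncausal(eeg)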

Epochs contaminated by eye movements or other artifacts were rejected by visual inspection. Bad channels were interpolated using a 3D spline interpolation method. The EEG was then segmented into epochs ranging from 100 ms before to 600 ms after stimulus onset and averaged separately for each gaze and load condition: 'new' trials (direct vs. averted gaze) and 'repeated' trials (direct vs. averted gaze) (for trial descriptions see Figure 2). The ERP data were recomputed against the average reference. ERPs were analyzed on correct trials only ('new' direct gaze condition: median = 44, min = 26, max = 51; 'new' averted gaze condition: median = 41.5, min = 26, max = 51; 'repeated' direct gaze condition: median = 43.5, min = 29, max = 48; 'repeated' averted gaze condition: median = 42.5, min = 29, max = 52). Independent two-tailed t-tests conducted on the number of accepted trials for each participant showed no significant differences in trial rejections between conditions (all ps > 0.3). For subsequent analyses, peripheral channels located on the cheeks and neck were excluded, leading to a reduction from 256 to 204 channels (see Britz & Pitts, 2010; Berchio et al., 2013).
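A minimal sketch of the epoching and average-reference steps, assuming a continuous data matrix and a list of stimulus-onset samples (variable names are illustrative; this is not the Cartool implementation):

    # Sketch of epoch extraction (-100 to +600 ms) and re-referencing to the
    # common average. Data and onsets are dummy placeholders.
    import numpy as np

    FS = 1000                                     # Hz
    PRE, POST = int(0.1 * FS), int(0.6 * FS)      # 100 ms baseline, 600 ms post-stimulus

    def epoch_and_average_reference(eeg, onsets):
        """eeg: (n_channels, n_samples); onsets: stimulus-onset sample indices."""
        epochs = np.stack([eeg[:, s - PRE:s + POST] for s in onsets])
        # average reference: subtract the instantaneous mean across channels
        epochs -= epochs.mean(axis=1, keepdims=True)
        return epochs                             # (n_trials, n_channels, n_times)

    eeg = np.random.randn(204, 60000)
    onsets = np.arange(1000, 59000, 3000)         # dummy onsets every 3 s
    erp = epoch_and_average_reference(eeg, onsets).mean(axis=0)   # per-condition ERP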

2.6 ERP analysis


We analyzed the high-density ERPs using a comprehensive multivariate spatial analysis approach that gives statistical information about the latencies and amplitudes of responses between conditions, and about the brain areas that were involved (for reviews see Murray et al., 2008; Murray et al., 2009; Michel et al., 2009; Michel and Murray, 2010; Brunet et al., 2011).

Analysis on the scalp level

We investigated modulations of response amplitudes by computing ANOVAs with Gaze (direct vs. averted) and Load ('new' vs. 'repeated') as within-subject factors for each electrode and each time point from 0 to 400 ms post-stimulus. We attempted to reduce the problem of multiple comparisons by using permutation and bootstrap statistics (see Maris & Oostenveld, 2007; Pernet et al., 2015) and by only considering effects that were significant for a certain period of time. We performed a repeated-measures non-parametric ANOVA, with bootstrapping of the subjects and permutation of the within-subject factors (for technical details, see Knebel et al., 2013). The number of cycles was set to 1000 and the p-value threshold to p < 0.05. Effects were considered statistically significant when they lasted for consecutive time frames of at least 10 ms (Michel et al., 2009; Murray et al., 2008). This analysis was performed using the STEN toolbox developed by Jean-François Knebel (http://www.unil.ch/line/Sten).

In addition to the ANOVA for each electrode, two global tests across all electrodes were applied: one testing for significant differences in the topographic configuration of the maps between conditions, and the other testing for global field power differences.

Differences in map configuration were evaluated with a global test of map topography, called 'topographic ANOVA' or TANOVA (see Murray et al., 2008; Michel et al., 2009; Michel and Murray, 2012; Koenig et al., 2011). It consists of a non-parametric randomization test on the global map dissimilarity (GMD) measure between two maps (Karniski et al., 1994; Srebro, 1996). The GMD is a reference-independent measure of the topographic difference between two scalp potential maps. It is calculated as the square root of the mean of the squared differences between the potentials measured at each electrode (vs. the average reference), each of which is first scaled to unitary strength by dividing by the instantaneous global field power (Lehmann & Skrandies, 1980; Michel et al., 2011; Koenig et al., 2011; Michel & Murray, 2012). The GMD is equivalent to the spatial Pearson product-moment correlation coefficient between the potentials of the two maps to be compared (Brandeis et al., 1992). The important difference to a standard ANOVA across electrodes with permutation testing is the strength normalization of the two maps to be compared, so that only topographic differences are considered. If two maps differ in topography independent of their strength, this directly indicates that the two maps were generated by a different configuration of sources in the brain (Vaughan, 1982; Lehmann, 1987; Srebro, 1996). The test for statistically significant differences in topography between the experimental conditions (the topographic ANOVA) is done in the following way: 1) the maps of each single subject are randomly assigned to one of the conditions (i.e. permutation of the data), 2) the group-average ERPs are recalculated, and 3) the resulting GMD value for these 'new' group-average ERPs is recalculated. This procedure is repeated many times, and the probability that the GMD of the real data lies outside the distribution of the randomized data is calculated for each time point (see Koenig et al., 2011). In our data we used 1000 randomizations, a threshold of p < 0.05, and a time constraint of ≥ 10 ms of successive significant tests (for multiple comparison correction, see above).

As in the ANOVA for each electrode, we applied the TANOVA to test the data for main effects and interactions, i.e. a 2×2 design with Gaze (direct vs. averted) and Load ('new' vs. 'repeated') as within-subject factors. The TANOVA analysis was performed using the software RAGU (Randomization Graphical User Interface; Koenig et al., 2011; Koenig and Melie-García, 2009, 2010). To test whether any TANOVA effects were stable and consistent across time points, the TANOVA was re-computed over the significant averaged time windows (see Koenig et al., 2011). Differences in map strength were evaluated using the Global Field Power (GFP) measure. The GFP is a reference-independent single measure of momentary field strength, defined as the spatial standard deviation of the scalp electrical field (Lehmann & Skrandies, 1980). GFP differences were also evaluated in a 2×2 design with randomization tests, with Gaze (direct vs. averted) and Load ('new' vs. 'repeated') as within-subject factors, using the RAGU software as above. Similarly, effects were considered statistically significant only if they lasted for consecutive time frames of at least 10 ms.
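To make the two global measures concrete, the sketch below computes GFP and GMD as defined above and runs a simplified within-subject label-permutation test at a single time point; it illustrates the logic of the TANOVA but is not the RAGU implementation. In the actual analysis, such a test is run at every time point and only runs of at least 10 ms of consecutive significance are retained.

    # Sketch of GFP (spatial SD of a map) and GMD (RMS difference between
    # GFP-normalized maps), plus a simplified within-subject label-permutation
    # test of the kind RAGU performs. Illustration only, not the RAGU code.
    import numpy as np

    def gfp(v):
        """Global field power of one map v (n_channels,), average-referenced."""
        return np.std(v)

    def gmd(u, v):
        """Global map dissimilarity between two average-referenced maps."""
        u = (u - u.mean()) / gfp(u)
        v = (v - v.mean()) / gfp(v)
        return np.sqrt(np.mean((u - v) ** 2))

    def tanova_one_timepoint(cond_a, cond_b, n_perm=1000, rng=None):
        """cond_a, cond_b: (n_subjects, n_channels) maps at one latency."""
        if rng is None:
            rng = np.random.default_rng(0)
        observed = gmd(cond_a.mean(0), cond_b.mean(0))
        count = 0
        for _ in range(n_perm):
            flip = rng.random(len(cond_a)) < 0.5        # swap condition labels per subject
            a = np.where(flip[:, None], cond_b, cond_a)
            b = np.where(flip[:, None], cond_a, cond_b)
            if gmd(a.mean(0), b.mean(0)) >= observed:
                count += 1
        return count / n_perm                           # permutation p-value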

Analysis in the source space

Analyses in the source space were performed after applying a linear distributed inverse solution based on local autoregressive averages (LAURA; Grave de Peralta Menendez et al., 2001) to the ERPs of each subject and condition. An anatomically constrained head model was used (L-SMAC model; Brunet et al., 2011; Spinelli et al., 2000; Birot et al., 2014), based on the gray matter of the template brain of the Montreal Neurological Institute (http://www.bic.mni.mcgill.ca/brainweb), with a total of 5018 solution points.


The solution space was divided into pre-determined regions of interest (ROIs) using the automated anatomical labeling template (AAL; Tzourio-Mazoyer et al., 2002); seven subcortical structures and the cerebellum were excluded. The mean estimated current density (CD) of all solution points within each ROI was computed. The analysis in the inverse space was performed separately to compare averted and direct gaze (averaged across repeated and new stimuli) and to compare repeated and new stimuli (averaged across direct and averted gaze), and it was restricted to the significant time periods determined by the sensor-space analysis described above. Within these time periods, the current density values of each ROI were averaged and then subjected to a randomization test (10,000 permutations, p < 0.05). Furthermore, for each significant difference, the direction of the contrast was identified with a paired t-test (p < 0.05). The magnitude of the effects was evaluated with Cohen's d effect size index, defined as the difference between the means of the two conditions divided by the standard deviation.
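For illustration, the sketch below implements one way to carry out such an ROI-level contrast (mean current density per subject and condition, paired sign-flip permutation test, Cohen's d); the use of a pooled standard deviation for Cohen's d and all variable names are assumptions rather than the exact Cartool pipeline.

    # Sketch of an ROI-level contrast: current density averaged over a time
    # window and ROI, compared between conditions with a paired sign-flip
    # permutation test and a Cohen's d effect size. Illustration only.
    import numpy as np

    def roi_contrast(cd_a, cd_b, n_perm=10000, rng=None):
        """cd_a, cd_b: (n_subjects,) mean current density per condition in one ROI."""
        if rng is None:
            rng = np.random.default_rng(0)
        diff = cd_a - cd_b
        observed = diff.mean()
        # paired permutation test: randomly flip the sign of each subject's difference
        flips = rng.choice([-1.0, 1.0], size=(n_perm, diff.size))
        null = (flips * diff).mean(axis=1)
        p = np.mean(np.abs(null) >= abs(observed))
        # Cohen's d with a pooled SD (one common convention; assumption here)
        pooled_sd = np.sqrt((cd_a.var(ddof=1) + cd_b.var(ddof=1)) / 2.0)
        d = (cd_a.mean() - cd_b.mean()) / pooled_sd
        return p, d

    cd_averted = np.random.rand(14)     # dummy values for 14 subjects
    cd_direct = np.random.rand(14)
    print(roi_contrast(cd_averted, cd_direct))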

3. Results

3.1 Behavioral Results

Non-parametric Friedman ANOVAs on the Hit Rates [χ2(2) = 20.622, p < 0.001] revealed significantly higher HR for 'new' faces with averted gaze as compared to faces with direct gaze (Z = -3.306, p = 0.001). However, only marginally significant differences were found between 'repeated' faces (direct gaze vs. averted gaze, Z = -1.960, p = 0.050) and between 'new' faces (averted gaze vs. direct gaze, Z = -2.357, p = 0.018; Bonferroni correction was applied with the significance level set at p < 0.0125) (see Figure 3 A).

With respect to the performance expressed by d' values, Friedman ANOVAs [χ2(2) = 18.103, p < 0.001] revealed that participants had a higher ability to discriminate 'new' faces with averted gaze than 'new' faces with direct gaze (Z = -3.107, p = 0.002). Furthermore, subjects were better at discriminating direct gaze for 'new' faces than for 'repeated' faces (Z = 2.794, p = 0.005). Moreover, d' differences between 'repeated' faces were marginally significant (averted gaze vs. direct gaze; Z = -2.166, p = 0.03; Bonferroni correction applied, significance level set at p < 0.0125) (see Figure 4 A).

Friedman ANOVAs confirmed statistically significant differences for False Alarms [χ2(2) = 22.386, p < 0.001]: for faces with direct gaze, more FA were made to 'new' faces than to 'repeated' faces (Z = -2.910, p = 0.004). Similarly, when averted gaze faces were targets, FA were made more often to 'new' than to 'repeated' faces (Z = -2.902, p = 0.004). More importantly, no significant FA effects were found between faces with direct gaze and faces with averted gaze ('new' conditions: p > 0.05; 'repeated' conditions: p > 0.05; Bonferroni correction applied, significance level set at p < 0.0125). Taken together, these results show that the increased HR and d' values seen when the faces had averted gaze cannot be attributed to an FA bias (see Figure 3 B).

(Please insert Figure 3 around here) (Please insert Figure 4 around here)


A repeated-measures ANOVA on median RTs (with Bonferroni post hoc tests) revealed a main effect of Load (F(1,13) = 4.8859, p = 0.0456) and a main effect of Gaze (F(1,13) = 5.0000, p = 0.0435). As shown in Figure 4 (B), participants recognized repeated stimuli faster (p = 0.0456). Moreover, RTs were slower for faces with averted gaze (mean of the medians: 1386 ms) than for faces with direct gaze (mean of the medians: 1332 ms) (p = 0.0435).

Finally, a repeated-measures ANOVA compared the rating scores of the evaluation scale. No significant main effects or interactions were observed (all ps > 1.8) [hostile: direct gaze (mean = 3.26, SD = 1.09), averted gaze (mean = 3.37, SD = 1.31); fearful: direct gaze (mean = 3.03, SD = 0.87), averted gaze (mean = 2.83, SD = 0.90)]. This analysis indicates that the judgment of the emotional disposition of the neutral faces was not significantly modulated by gaze.

In summary, the behavioral results indicate that the presence of averted gaze, in comparison to direct gaze, facilitated non-match and match decisions: faces with averted gaze were discriminated better than faces with direct gaze and remembered better. Furthermore, no effect of gaze was found on the FAR, which supports the absence of a response bias for either direct or averted gaze. The reaction time results indicate faster responses for faces with direct gaze. Moreover, the behavioral results indicate that repetition improved face detection, as shown by higher d' values, lower FA and faster RTs. The questionnaire results indicate that gaze direction did not influence the perceived emotional disposition, although they seem to show a slight shift towards negative attribution for averted gaze.

3.2 Evoked Potential Results

Single electrode amplitude analysis

In the first 400 ms, recognition of images of neutral faces elicited three components: the P100, N170 and P200 (Figure 5 A, B). Waveform data were statistically assessed via a 2×2 repeated-measures non-parametric ANOVA on all electrodes. This ANOVA revealed a main effect of Gaze at 50-75, 110-140, 170-190, 290-340 and 360-400 ms (p and F values are summarized in Table 1). For the factor Load, we found effects in four time windows: from 30 to 50 ms, 65 to 105 ms, 185 to 220 ms, and 235 to 400 ms (see Table 2). Additionally, there was a significant Gaze × Load interaction at 90-115 ms, 180-210 ms, and 255-365 ms (see Table 3).

(Please insert Table 1, 2, 3 around here)

The ERP waveforms indicated that direct gaze evoked a generally smaller P100 than averted gaze, and that this effect was preceded by a greater negative central deflection for averted gaze (see Figure 5 A). Moreover, gaze modulated the N170, with a greater negative deflection for direct gaze. Furthermore, we found that direct gaze enhanced the fronto-central P200.

A very early effect of Load was observed, with increased amplitude for repeated faces over left temporal sites around 50 ms (see Figure 5 B). This effect was followed by a greater negative right and central deflection for new faces. Furthermore, at the P200 latencies, repeated faces evoked a more negative deflection over temporal sites, coupled with a greater amplitude over right central sites. In the last phase of this component, the P200 was clearly larger for new faces than for repeated faces. With respect to the Gaze × Load interaction, the first effect appeared around the P100 and subsequent effects at the P200 latency.

(Please insert Figure 5 around here)

Topographic and field strength analysis

The TANOVA revealed differences in scalp topography, with a significant main effect of Load at 175-200 ms (p = 0.012) and at 245-400 ms (p = 0.004). Furthermore, the analysis revealed a main effect of Gaze from 120 to 136 ms (p = 0.027), and a significant Gaze × Load interaction from 280 to 295 ms (p = 0.023). The GFP analysis revealed a significant main effect of Load in the time window from 40 to 55 ms (p = 0.011) and from 340 to 400 ms, and a significant main effect of Gaze from 108 to 114 ms (p = 0.048). Furthermore, the GFP analysis showed a Gaze × Load interaction from 280 to 360 ms (p = 0.048). Post hoc analyses confirmed that the TANOVA and GFP main effects and the interaction effects were stable and consistent over these time periods.

Analysis in the source space

We focused the source-space analysis on the time windows in which the analyses at the sensor level revealed significant differences for the factors Load and Gaze. For the Gaze effect, we examined the following time windows: 100-120 ms and 130-150 ms. These two periods corresponded, respectively, to the end of the P100 and to the beginning of the N170. Global Map Dissimilarity was calculated between successive time points of the ERP of each subject and condition, and the individual dissimilarity peak between 100 and 150 ms was used as an individual anchor point to distinguish the two ERP components (see Brandeis & Lehmann, 1986; Michel et al., 1992). In the first time window (100-120 ms), increased activation was found in the right fusiform gyrus for averted gaze (see Figure 6) and in the left inferior frontal gyrus (orbital part) for direct gaze (negative t values indicate an increase of activation for faces with averted gaze, positive t values an increase of activation for faces with direct gaze).

(Please insert Figure 6 around here)

Activation foci, p values, t values and Cohen's d effect size measures are summarized in Table 4. In the second time window (130-150 ms), increased activation was detected in the right middle temporal gyrus for direct gaze (see Table 4).

(Please insert Table 4 around here)


For the Load effect, we examined the early significant time window between 40 and 55 ms. We found increased activations for 'repeated' faces in the precuneus, the left superior parietal gyrus, and the posterior and median cingulate gyrus (positive t values indicate an increase of activation for the 'repeated' condition, negative t values for the 'new' condition). Furthermore, for 'new' faces we detected statistically significant effects in the left inferior and left medial temporal lobe. Activation foci, p values, and Cohen's d effect size measures are summarized in Table 5.

(Please insert Table 5 around here)

Additional analysis of the early repetition effect

We found early effects of face repetition. While such effects have been described previously, special care has to be taken to ensure that such very early effects are not due to methodological flaws and that they are also robust at the single-subject level. We therefore performed additional analyses focusing on this early repetition effect to support its robustness. These additional analyses concern the use of non-causal vs. causal filters (Rousselet, 2012), the analysis of the effect size by comparing it to baseline activity, and additional plots of individual subjects.

We re-analyzed the data for repeated and non-repeated faces using a causal instead of the non-causal filter. High- and low-pass filters were set to 0.03 and 40 Hz (2nd-order filters, with −12 dB/octave roll-off, computed in a single forward pass). We kept the same epochs as in the previous analysis with the non-causal filter and focused the statistical test on the early time window from 100 ms before to 100 ms after stimulus onset. A randomization test was again conducted for all 204 recorded channels (10,000 permutations, p < 0.05, and effects persisting for at least 10 ms were considered reliable; for details on the permutation procedure see above). Significant effects were again found at 30-60 ms on the same left temporal electrodes (p = 0.004) as in the above analysis.

In order to better determine the size of the early effect, we also compared repeated vs. non-repeated faces in the pre-stimulus baseline period. The randomization test revealed no significant effects for the pre-stimulus baseline (ps > 0.05) (Figure 7). This analysis confirmed, and even strengthened, the effects found at 50 ms, and indicates that this early effect was not influenced by the non-causal filter.
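A simplified sketch of such a channel × time permutation test with the "≥ 10 ms of consecutive significance" criterion is given below (paired sign-flip permutations; array shapes are illustrative and the code is a stand-in for the STEN/RAGU procedures):

    # Sketch of a channel x time paired permutation test with a minimum-duration
    # criterion for significant effects. Illustration only.
    import numpy as np

    def pointwise_permutation(erp_a, erp_b, n_perm=1000, alpha=0.05, min_ms=10, fs=1000):
        """erp_a, erp_b: (n_subjects, n_channels, n_times) single-subject ERPs."""
        rng = np.random.default_rng(0)
        diff = erp_a - erp_b                        # paired differences per subject
        observed = np.abs(diff.mean(axis=0))        # (n_channels, n_times)
        null = np.empty((n_perm,) + observed.shape)
        for i in range(n_perm):
            flips = rng.choice([-1.0, 1.0], size=(diff.shape[0], 1, 1))
            null[i] = np.abs((flips * diff).mean(axis=0))
        p = (null >= observed).mean(axis=0)
        sig = p < alpha
        min_len = int(min_ms * fs / 1000)           # e.g. 10 samples at 1000 Hz
        keep = np.zeros_like(sig)
        for ch in range(sig.shape[0]):
            run = 0
            for t in range(sig.shape[1]):
                run = run + 1 if sig[ch, t] else 0
                if run >= min_len:
                    keep[ch, t - run + 1:t + 1] = True
        return keep                                 # boolean mask of reliable effects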

4. Discussion

In the present work, using high-density ERP analysis, we described the spatiotemporal dynamics of implicit gaze perception during a picture-matching task. As previously documented (i.e. Senju et al., 2003; Senju et al., 2008), we found that direct gaze leads to faster response times than averted gaze. However, surprisingly, we showed that faces with averted gaze were recognized more accurately than faces with direct gaze. This effect is in contradiction with earlier studies, which showed better performance for faces with direct gaze (see Senju & Johnson, 2009). Nevertheless, our participants were engaged in a face recognition task and were unaware of the gaze manipulation. Hence, while the memory task about face identity was explicit, the processing of gaze is likely to have occurred at an implicit level. Moreover, although previous studies have shown that recognizing familiar faces with direct gaze is easier than recognizing faces with averted gaze (see Hood et al., 2003; Vuilleumier et al., 2005), in this study participants were asked to quickly discriminate 'new' and 'repeated' faces presented for 1 second. Thus, paradoxically, our results show that quickly scoring the identity of a face as 'new', which could also be re-conceptualized as an unknown person, was easier for faces with averted gaze.

The spatiotemporal dynamics of gaze processing shed further light on this finding. We demonstrated that averted gaze induced an increase in P100 amplitude over occipital electrodes, also reflected in increased global field power. Interestingly, the activations found at this latency suggest that this effect was not purely attributable to visual differences. Indeed, we found a stronger activation in the right fusiform gyrus for averted gaze, and in the orbital frontal gyrus for direct gaze. There is evidence that increased attention to specific stimulus properties increases activity in the brain regions in which these stimuli are represented (see Pourtois et al., 2013), and recent studies have demonstrated that the fusiform region increases its activation during enhanced attention to faces (see George & Conty, 2008). Therefore, the early fusiform activation for averted gaze detected in our participants is in line with their behavioral performance.

Furthermore, there is also evidence that emotional stimuli may influence visual perception (see Vuilleumier & Pourtois, 2007; Vuilleumier, 2015), and that emotional attention influences early responses in limbic regions, such as the orbitofrontal gyrus (Pourtois et al., 2013). Interestingly, the orbitofrontal cortex is a region implicated in representing stimulus reward value, and is known to show more pronounced activity during the observation of attractive faces (O'Doherty et al., 2003; Cloutier et al., 2008). Previous studies have reported that faces with averted gaze are perceived as less pleasant than faces with direct gaze (see Ewing et al., 2006; Strick et al., 2008; Schmitz et al., 2012). It is thus possible that averted gaze was implicitly processed as a less pleasant social stimulus than direct gaze. Consequently, we could assume that the activation found in the right fusiform gyrus is the result of a better ability to recognize a face with averted gaze, and we could further speculate that the activation found in the orbitofrontal cortex also reflects the different attractiveness of our conditions.

We have also shown that, at the beginning of the N170, direct gaze induced changes in scalp topography, with a stronger activation in the right medial temporal gyrus. The N170 has traditionally been associated with the structural encoding of faces (Eimer & Holmes, 2002). Some studies have shown an impact of gaze direction on the N170 (see George & Conty, 2008; Itier & Batty, 2009; Vuilleumier & Pourtois, 2007), while other work has failed to find this effect (Klucharev & Sams, 2004; Grice et al., 2005; Schweinberger et al., 2007). It is possible that in our study the N170 effect is a consequence of the task demand (recognizing face identity), and that this effect reflects an attempt to extract identity information from gaze direction. This hypothesis seems to be supported by the activation in the right medial temporal gyrus (see Rossion et al., 2003).

Furthermore, we found that direct gaze enhanced the fronto-central P200. Several studies have reported differential P200 modulations for threatening images (Carretié et al., 2001; Eimer & Holmes, 2002; Eimer et al., 2003; González-Roldan et al., 2011). Therefore, the P200 modulation found in our study for direct gaze may be due to the different emotional valence of direct vs. averted gaze. To further investigate the emotional properties of our stimuli, participants performed an emotional rating. Neutral faces with direct gaze have been described as a sign of approach (i.e. anger), and neutral faces with averted gaze as a sign of avoidance (i.e. sadness) (see Adams et al., 2003a). Conversely, in this study we only found a slight shift towards negative attribution for averted gaze. Nevertheless, one should take into account that the participants had been exposed to the stimuli during the preceding n-back task, and that this rating task could have elicited post-hoc evaluation strategies that do not reflect implicit preferences or automatic emotional sensory processing during the task. Our data thus also indicate a dissociation between an identity discrimination task in which gaze was implicitly processed and an emotional rating task in which gaze was also implicitly processed. If this is true, it could also explain previously divergent behavioral results on neutral faces during explicit gaze recognition (Conty et al., 2007) and implicit gaze detection in a gender categorization task (George et al., 2001).

This study also provides evidence of an early dissociation between explicit identity recognition and implicit gaze detection. At the behavioral level, we found a clear advantage for repeated faces compared to new faces. ERP amplitude analysis and the TANOVA highlighted that this difference starts already at around 50 ms. There is evidence that, depending on the type of information conveyed by a face, quick categorization of specific features can occur with impressively short latencies (see Michel et al., 2004), such as identity (~50 ms) (Seeck et al., 1997; Braeutigam et al., 2001) or gender (45-85 ms) (Mouchetant-Rostaing et al., 2000).

The face-specificity of these early responses has, however, been questioned (George et al., 1997) and rather attributed to basic mnestic aspects of visual stimuli (Seeck et al., 1997). Our results are in line with such an interpretation of very fast recognition of a repeated stimulus. Since we did not use stimuli other than faces, we cannot conclude that this early effect was specific to faces in our study. However, other studies have reported similar early differential responses to repeated and non-repeated stimuli that were specific to faces and did not appear with other similarly complex object categories (Braeutigam et al., 2001; Mouchetant-Rostaing et al., 2000). Evidence of early coarse stimulus identification has been reported for other types of visual categorization, as reviewed in Michel et al. (2004). It has been discussed in the context of the integrated model of information processing formulated by J. Bullier (2001) and C. Schroeder (1998), in which fast feed-forward processing through the dorsal stream takes place and then influences subsequent local analysis in a top-down manner. Specifically, for face processing, it has been proposed that these early effects may indicate an automatic mechanism of rapid global object categorization (see Mouchetant-Rostaing et al., 2000; Michel et al., 2004). EEG source imaging revealed that 'repeated' faces at 45 ms induced an increase of activation in a wide network of brain areas (see Table 5) known to be involved in WM tasks (see Owen et al., 2005) and also known as part of the 'social brain network' (see Gobbini & Haxby, 2006). Interestingly, in the study of Seeck et al. (1997) such early differential effects were observed at the individual level in seven epileptic patients with implanted intracranial electrodes. The effects were widespread and detected on different electrodes in each patient. Still, the effects were observed significantly more often on electrodes in the middle and inferior temporal neocortex. As these activations were found during initial stimulus processing, this network might be involved in automatic, early, and effortless face identification. As appealing as such an interpretation might be, further studies are needed that explicitly examine the face-specificity of this early effect.

To conclude, we have shown that implicit averted gaze processing starts with an activation of the right fusiform gyrus and a decrease of activation in the orbitofrontal cortex. We have speculated that this early P100 modulation reflects a stronger impact of averted gaze, implicitly processed as a less pleasant social stimulus. We have also shown that gaze detection may affect behavior in an implicit way. Finally, the present work also suggests that explicit processing of the identity of a face occurs very early in time and faster than the implicit processing of gaze direction. The present study sheds light on a potential link between early top-down cortical effects and the behavioral ability to implicitly recognize gaze direction. These findings support a neural model of gaze processing that is more complex than traditionally proposed.

Funding:

This study was funded by the National Center of Competence in Research (NCCR) "SYNAPSY - The Synaptic Bases of Mental Diseases", financed by the Swiss National Science Foundation (Grant no. 51AU40_125759), to C.B., T.A.R., C.P., A.D., J.M.A. and C.M. Additional support came from the National Science Foundation to C.M. (grant no. 320030_159705).

Acknowledgements:


The study is supported by the National Center of Competence in Research project "Synapsy: the Synaptic Basis of Mental Diseases". The Cartool software is freely available academic software programmed by Denis Brunet, from the Functional Brain Mapping Laboratory in Geneva. The STEN toolbox (http://www.unil.ch/fenl/Sten) was programmed by Jean-François Knebel, from the Laboratory for Investigative Neurophysiology, Lausanne. Both labs are supported by the Center for Biomedical Imaging (CIBM) of Geneva and Lausanne.


5. Bibliography

Acunzo, D.J., Mackenzie, G., van Rossum, M.C. (2012) Systematic biases in early ERP and ERF components as a result of high-pass filtering. Journal of Neuroscience Methods, 209(1), 212-8.

Adams, R.B. Jr., Franklin, R.G. Jr., Kveraga, K., Ambady, N., Kleck, R.E., Whalen, P.J., Hadjikhani, N., & Nelson, A.J. (2012) Amygdala responses to averted vs direct gaze fear vary as a function of presentation speed. Social Cognitive and Affective Neuroscience, 7(5), 568-77.

Adams, R.B. Jr., & Kleck, RE. (2005) Effects of direct and averted gaze on the perception of facially communicated emotion. Emotion, 5(1), 3-11.

Adams, R.B.Jr., & Kleck, R.E. (2003 a) Perceived gaze direction and the processing of facial displays of emotion. Psychological Science, 14(6), 644-7.

Adams, R.B. Jr., Gordon, H.L., Baird, A.A., Ambady, N., & Kleck, R.E. (2003b) Effects of gaze on amygdala sensitivity to anger and fear faces. Science, 300(5625), 1536.

Adolphs, R. (2002) Recognizing emotion from facial expressions: psychological and neurological mechanisms. Behavioral and cognitive neuroscience reviews, 1(1), 21-62.

Artuso, C., Palladino, P., & Ricciardelli, P. (2012) How do we update faces? Effects of gaze direction and facial expressions on working memory updating. Frontiers in Psychology, 3, 362.

Batty, M., & Taylor, M.J. (2003) Early processing of the six basic facial emotional expressions. Cognitive Brain Research, 17(3), 613-20.

Bentin, S., Allison, T., Perez, E., Puce, A., & McCarthy, G. (1996). Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience, 8, 551–565.

Benton, C. P. (2010) Rapid reactions to direct and averted facial expressions of fear and anger. Visual Cognition, 18 (9).


Berchio, C., Rihs, T.A., Michel, C.M., Brunet, D., Apicella, F., Muratori, F., Gallese, V., & Umiltà, M.A. (2013) Parieto-frontal circuits during observation of hidden and visible motor acts in children. A high-density EEG source imaging study. Brain Topography, 27(2), 258-70.

Birot, G., Spinelli, L., Vulliémoz, S., Mégevand, P., Brunet, D., Seeck, M., & Michel, C.M. (2014) Head model and electrical source imaging: a study of 38 epileptic patients. Neuroimage Clinical. 5, 77-83.

Braeutigam, S., Bailey, A.J., & Swithenby, S.J. (2001) Task-dependent early latency (30-60 ms) visual processing of human faces and other objects. Neuroreport, 12(7),1531-6.

Brandeis, D., & Lehmann, D. (1986) Event-related potentials of the brain and cognitive processes: approaches and applications. Neuropsychologia, 24(1), 151-68.

Brandeis, D., Naylor, H., Halliday, R., Callaway, E., Yano, L. (1992) Scopolamine effects on visual information processing, attention, and event-related potential map latencies. Psychophysiology, 29:315-336.


Brotman, M.A., Rich, B.A., Guyer, A.E., Lunsford, J.R., Horsey, S.E., Reising, M.M., et al. (2010) Amygdala activation during emotion processing of neutral faces in children with severe mood dysregulation versus ADHD or bipolar disorder. Am J Psychiatry, 167(1), 61-9.

Brunet, D., Murray, M.M., & Michel, C.M. (2011) Spatiotemporal analysis of multichannel EEG: CARTOOL. Journal Computational Intelligence and Neuroscience, 2011(2).

Bullier, J. (2001) Integrated model of visual processing. Brain Research Reviews, 36(2-3), 96-107. Review.

Carlin, J.D., & Calder, A.J. (2013) The neural basis of eye gaze processing. Current opinion in neurobiology, Review, 23(3), 450-5.

Carretié L, Mercado F, Tapia M, Hinojosa JA. (2001) Emotion, attention, and the 'negativity bias', studied through event-related potentials. Int J Psychophysiol. 41(1):75-85.

Conty, L., N'Diaye, K., Tijus, C., & George, N. (2007) When eye creates the contact! ERP evidence for early dissociation between direct and averted gaze motion processing. Neuropsychologia, 45(13), 3024-37.

Cloutier, J., Heatherton, T.F., Whalen, P.J., & Kelley, W.M. (2008) Are attractive people rewarding? Sex differences in the neural substrates of facial attractiveness. Journal of Cognitive Neuroscience, 20(6), 941-51.

Dalrymple, K.A., Oruç, I., Duchaine, B., Pancaroglu, R., Fox, C.J., Iaria, G., Handy, T.C., & Barton, J.J. (2011) The anatomic basis of the right face-selective N170 in acquired prosopagnosia: a combined ERP/fMRI study. Neuropsychologia, 49(9), 2553-63.

Deffke I, Sander T, Heidenreich J, Sommer W, Curio G, Trahms L, Lueschow A. (2007) MEG/EEG sources of the 170-ms response to faces are co-localized in the fusiform gyrus. Neuroimage, 35(4):1495-501.

Degabriele R, Lagopoulos J, Malhi G. (2011) Neural correlates of emotional face processing in bipolar disorder: an event-related potential study. Journal of Affective Disorders, 133(12):212-20.

Doi, H. Sawada, R., & Masataka, N. (2007) The effects of eye and face inversion on the early stages of gaze direction perception an ERP study. Brain Research, 1183:83-90.


Ducati, A., Fava, E., & Motti, E.D. (1988) Neuronal generators of the visual evoked potentials: intracerebral recording in awake humans. Electroencephalography and Clinical Neurophysiology/Evoked Potentials Section, 71(2), 89-99.

Farroni, T., Csibra, G., Simion, F., & Johnson, M.H. (2002) Eye contact detection in humans from birth. Proceeding of the National Academy of Sciences of the United States of America, 99(14), 9602-5.

Frischen, A., Bayliss, A.P., & Tipper, S.P. (2007) Gaze cueing of attention: visual attention, social cognition, and individual differences. Psychological bulletin, Review, 133(4), 694-724.

Eger, E., Jedynak, A., Iwaki, T., & Skrandies, W. (2003) Rapid extraction of emotional expression: evidence from evoked potential fields during brief presentation of face stimuli. Neuropsychologia, 41(7), 808-17.

Egger, H.L., Pine, D.S., Nelson, E., Leibenluft, E., Ernst, M., Towbin, K.E., & Angold, A. (2011) The NIMH Child Emotional Faces Picture Set (NIMH-ChEFS): a new set of children's facial emotion stimuli. International Journal of Methods in Psychiatric Research, 20(3), 145-56.

Eimer, M., & Holmes, A. (2002) An ERP study on the time course of emotional face processing. Neuroreport, 13(4), 427-31.

Eimer M., Holmes A., McGlone F. (2003) The role of spatial attention in the processing of facial expression: An ERP study of rapid brain responses to six basic emotions. Cognitive, Affective, and Behavioral Neuroscience, 3:97–110.

Engell, A.D., & McCarthy, G. (2011) The relationship of gamma oscillations and face-specific ERPs recorded subdurally from occipitotemporal cortex. Cerebral Cortex, 21(5), 1213-21.

Ewing, L., Rhodes, G., & Pellicano, E. (2006) Have you got the look? Gaze direction affects judgements of facial attractiveness. Visual Cognition, 9(3), 270–277.

George, N., & Conty, L. (2008) Facing the gaze of others. Neurophysiologie Clinique/Clinical Neurophysiology, 38(3), 197–207.


George, N., Driver, J., & Dolan, R.J. (2001) Seen gaze-direction modulates fusiform activity and its coupling with other brain areas during face processing. Neuroimage, 13(6 Pt 1), 1102-12.

George, N., Evans, J., Fiori, N., Davidoff, J., & Renault B. (1996) Brain events related to normal and moderately scrambled faces. Cognitive Brain Research 4(2), 65-76.

Gobbini, M.I., & Haxby, J.V. (2006) Neural response to the visual familiarity of faces. Brain Res Bull, 71(1-3),76-82.

Grave de Peralta Menendez, R., Gonzalez Andino, S., Lantz, G., Michel, C.M., & Landis T. (2001) Noninvasive localization of electromagnetic epileptic activity. I. Method descriptions and simulations. Brain Topography, 14(2), 131-7.

Green, D.M., & Swets J.A. (1966) Signal Detection Theory and Psychophysics. New York: Wiley.


González-Roldan AM, Martínez-Jauand M, Muñoz-García MA, Sitges C, Cifre I, Montoya P. (2011) Temporal dissociation in the brain processing of pain and anger faces with different intensities of emotional expression. Pain, 152(4):853-9.

Grice, S.J., Halit, H., Farroni, T., Baron-Cohen, S., Bolton, P., & Johnson, M.H. (2005) Neural correlates of eye-gaze detection in young children with autism. Cortex, 41(3), 342-53.

Haimovic, I.C, & Pedley, T.A. (1982) Hemi-field pattern reversal visual evoked potentials. I. Normal subjects. Electroencephalography and Clinical Neurophysiology, 54(2), 111-20.

Hall, J., Whalley, H.C., McKirdy, J.W., Romaniuk, L., McGonigle, D., McIntosh, A.M., Baig, B.J., et al (2008) Overactivation of fear systems to neutral faces in schizophrenia. Biological Psychiatry, 64(1),70-3.

Hoffman, E.A., & Haxby, J.V. (2000) Distinct representations of eye gaze and identity in the distributed human neural system for face perception. Nature Neuroscience, 3(1), 80-4.


Hood, B., Macrae, C.N., Cole-Davies, V., & Dias, M. (2003) Eye remember you: the effects of gaze direction on face recognition in children and adults. Developmental Science, 6(1), 67-71.

Itier, R.J., & Taylor, M.J. (2002) Inversion and contrast polarity reversal affect both encoding and recognition processes of unfamiliar faces: a repetition study using ERPs. Neuroimage, 15(2), 353-72.

Itier, R.J., & Taylor, M.J. (2004) Source analysis of the N170 to faces and objects. Neuroreport, 15(8), 1261-5.

Itier, R.J. & Batty, M. (2009) Neural bases of eye and gaze processing: the core of social cognition. Neuroscience & Biobehavioral Reviews, 33(6), 843-63.

Itier, R.J., Latinus, M. & Taylor, M.J. (2006) Face, eye and object early processing: what is the face specificity?. Neuroimage, 29(2), 667-76.


Itier, R.J., Alain, C., Kovacevic, N., & McIntosh, A.R. (2007) Explicit versus implicit gaze processing assessed by ERPs. Brain Research, 1177:79-89.

Jonas, J., Rossion, B., Krieg, J., Koessler, L., Colnat-Coulbois, S., Vespignani, H., Jacques, C., Vignal, J.P., Brissart, H., & Maillard, L. (2014) Intracerebral electrical stimulation of a face-selective area in the right inferior occipital cortex impairs individual face discrimination. Neuroimage, 99, 487-97.

Klucharev, V., & Sams, M. (2004) Interaction of gaze direction and facial expressions processing: ERP study. Neuroreport, 15(4), 621-5.

Karniski W., Blair R.C., Snider A.D. (1994) An exact statistical method for comparing topographic maps, with any number of subjects and electrodes. Brain Topogr. 6:203-10.

Knebel, J.F., Javitt, D.C., & Murray, M.M. (2011) Impaired early visual response modulations to spatial information in chronic schizophrenia. Psychiatry Research, 193(3), 168-76.


Koenig, T., & Melie-García, L. (2009) Statistical analysis of multichannel scalp field data. In Michel, C.M., Koenig, T., Brandeis, D., Gianotti, L.R.R., & Wackermann, J. (Eds.), Electrical Neuroimaging (pp. 169-189). Cambridge, UK: Cambridge University Press.

Koenig, T., & Melie-García, L. (2010) A method to determine the presence of averaged event-related fields using randomization tests. Brain Topography, 3, 233–242.

Koenig, T., Kottlow, M., Stein, M., & Melie-García, L. (2011) Ragu: a free tool for the analysis of EEG and MEG event-related scalp field data using global randomization statistics. Computational Intelligence and Neuroscience, 2011, 938925.

Langner, O., Dotsch, R., Bijlstra, G., Wigboldus, D.H.J., Hawk, S.T., & van Knippenberg, A. (2011) Effects of perceived mutual gaze and gender on face processing and recognition memory. Visual Cognition, 24(8), 85-101.

Lee, E., Kang, J.I., Park, I.H., Kim, J.J., An, S.K. (2008) Is a neutral face really evaluated as being emotionally neutral? Psychiatry Research, 157(1-3), 77-85.


Pascual-Marqui, R.D., & Lehmann, D. (1993) Electroencephalography and Clinical Neurophysiology, 88(6), 530-1, 534-6.

Lehmann, D., & Skrandies, W. (1980) Reference-free identification of components of checkerboard-evoked multichannel potential fields. Electroencephalography and Clinical Neurophysiology, 48(6), 609-21.

Lehmann, D. (1987) Principles of spatial analysis. In Gevins, A.S., & Remond, A. (Eds.), Methods of Analysis of Brain Electrical and Magnetic Signals. Handbook of Electroencephalography and Clinical Neurophysiology, Revised Series (pp. 309-354). Amsterdam: Elsevier.

Leppänen, J.M., Milders, M., Bell, J.S., Terriere, E., & Hietanen, J.K. (2004) Depression biases the recognition of emotionally neutral faces. Psychiatry Research, 128(2),123-33.

Liu, J., Harris, A., Kanwisher, N. (2002) Stages of processing in face perception: an MEG study. Nature Neuroscience, 5 (9): 910-916.


Luck, S.J. (2005) An Introduction to the Event-Related Potential Technique. Cambridge, MA: MIT Press.

Macmillan, N.A., & Creelman, C.D. (1990) Response bias: Characteristics of detection theory, threshold theory, and "nonparametric" indexes. Psychological Bulletin, 107(3), 401-413.

Michel, C.M., & Murray, M.M. (2012) Towards the utilization of EEG as a brain imaging tool. Neuroimage, 61(2), 371-85.

Michel, C.M., Koenig, T., Brandeis, D., Gianotti, L.R.R., & Wackermann, J. (Eds.) (2009) Electrical Neuroimaging. Cambridge, UK: Cambridge University Press.

Michel, C.M., & He, B. (2011) Mapping and source imaging. In Schomer, D.L., & Lopes da Silva, F.H. (Eds.), Niedermeyer's Electroencephalography: Basic Principles, Clinical Applications, and Related Fields (6th ed.). Lippincott Williams & Wilkins.

Michel, C.M., Seeck, M., & Murray, M.M. (2004) The speed of visual cognition. Supplements to Clinical Neurophysiology, 57, 617-27.


Michel, C.M., Lehmann, D., Henggeler, B., & Brandeis, D. (1992) Localization of the sources of EEG delta, theta, alpha and beta frequency bands using the FFT dipole approximation. Electroencephalography and Clinical Neurophysiology, 82(1), 38-44.

Murray, M.M., Brunet, D., & Michel, C.M. (2008) Topographic ERP analyses: a step-by-step tutorial review. Brain Topography, 20(4), 249-64.

Murray, M.M., De Lucia, M., Brunet, D., & Michel, C.M. (2009) Principles of topographic analyses for electrical neuroimaging. In Handy, T.C. (Ed.), Brain Signal Analysis: Advances in Neuroelectric and Neuromagnetic Methods (pp. 21-54). Cambridge, MA: MIT Press.

Mouchetant-Rostaing, Y., Giard, M.H., Bentin, S., Aguera, P.E., & Pernier, J. (2000) Neurophysiological correlates of face gender processing in humans. European Journal of Neuroscience, 12(1), 303-10.

Nakashima, S.F., Langton, S.R., & Yoshikawa, S. (2012) The effect of facial expression and gaze direction on memory for unfamiliar faces. Cognition & Emotion, 26(7), 1316-25.


Nguyen, V.T., & Cunnington, R. (2014) The superior temporal sulcus and the N170 during face processing: single trial analysis of concurrent EEG-fMRI. Neuroimage, 86, 492-502.

O'Doherty, J., Winston, J., Critchley, H., Perrett, D., Burt, D.M., & Dolan, R.J. (2003) Beauty in a smile: the role of medial orbitofrontal cortex in facial attractiveness. Neuropsychologia, 41(2), 147-55.

Owen, A.M., McMillan, K.M., Laird, A.R., & Bullmore, E. (2005) N-back working memory paradigm: a meta-analysis of normative functional neuroimaging studies. Human Brain Mapping, 25(1), 46-59.

Pitts, M.A., & Britz, J. (2011) Insights from intermittent binocular rivalry and EEG. Frontiers in Human Neuroscience, 5, 107.

Pizzagalli, D., Regard, M., & Lehmann, D. (1999) Rapid emotional face processing in the human right and left brain hemispheres: an ERP study. Neuroreport, 10(13):2691-8.


Pernet, C.R., Latinus, M., Nichols, T.E., & Rousselet, G.A. (2015) Cluster-based computational methods for mass univariate analyses of event-related brain potentials/fields: A simulation study. Journal of Neuroscience Methods, 250, 85-93.

Pourtois, G., Schettino, A., & Vuilleumier, P. (2013) Brain mechanisms for emotional influences on perception and attention: what is magic and what is not. Biological Psychology, 92(3), 492-512.

Pourtois, G., Delplanque, S., Michel, C.M., & Vuilleumier, P. (2008) Beyond conventional event-related brain potential (ERP): exploring the time-course of visual emotion processing using topographic and principal component analyses. Brain Topography, 20(4), 265-77.

Prieto, E.A., Caharel, S., Henson, R., & Rossion, B. (2011) Early (N170/M170) face sensitivity despite right lateral occipital brain damage in acquired prosopagnosia. Frontiers in Human Neuroscience, 5, 138.

Rellecke, J., Sommer, W., & Schacht, A. (2013) Emotion effects on the N170: a question of reference? Brain Topography, 26(1), 62-71.


Rossion, B., Caldara, R., Seghier, M., Schuller, A.M., Lazeyras, F., & Mayer, E. (2003) A network of occipito-temporal face-sensitive areas besides the right middle fusiform gyrus is necessary for normal face processing. Brain, 126(Pt 11), 2381-95.

Rossion, B., Campanella, S., Gomez, C.M., Delinte, A., Debatisse, D., Liard, L., et al. (1999) Task modulation of brain activity related to familiar and unfamiliar face processing: an ERP study. Clinical Neurophysiology, 110(3), 449-62.

Rousselet, G.A., Ince, R.A., van Rijsbergen, N.J., & Schyns, P.G. (2014) Eye coding mechanisms in early human face event-related potentials. Journal of Vision, 14(13), 7.

Rousselet, G.A. (2012) Does filtering preclude us from studying ERP time-courses? Frontiers in Psychology, 3, 131.

Sadeh, B., & Yovel, G. (2010) Why is the N170 enhanced for inverted faces? An ERP competition experiment. Neuroimage, 53(2), 782-9.


Sander, D., Grandjean, D., Kaiser, S., Wehrle, T., & Scherer, K.R. (2007) Interaction effects of perceived gaze direction and dynamic facial expression: Evidence for appraisal theories of emotion. European Journal of Cognitive Psychology, 19(3), 470-480.

Schyns, P.G., Petro, L.S., & Smith, M.L. (2007) Dynamics of visual information integration in the brain for categorizing facial expressions. Current Biology, 17(18), 1580-5.

Schmitz, J., Scheel, C.N., Rigon, A., Gross, J.J., & Blechert, J. (2012) You don't like me, do you? Enhanced ERP responses to averted eye gaze in social anxiety. Biological Psychology, 91(2), 263-9.

Schweinberger, S.R., Kloth, N., & Jenkins, R. (2007) Are you looking at me? Neural correlates of gaze adaptation. Neuroreport, 18(7), 693-6.

Seeck, M., Michel, C.M., Mainwaring, N., Cosgrove, R., Blume, H., Ives, J., Landis, T., & Schomer, D.L. (1997) Evidence for rapid face recognition from human scalp and intracranial electrodes. Neuroreport, 8(12), 2749-54.


Seeck, M., Michel, C.M., Blanke, O., Thut, G., Landis, T., & Schomer, D.L. (2001) Intracranial Neurophysiological Correlates Related to the Processing of Faces. Epilepsy & Behavior, 2(6):545-557.

Senju, A., Yaguchi, K., Tojo, Y., Hasegawa, T. (2003) Eye contact does not facilitate detection in children with autism. Cognition, 89(1):B43-51.

Senju, A., Kikuchi, Y., Hasegawa, T., Tojo, Y., & Osanai, H. (2008) Is anyone looking at me? Direct gaze detection in children with and without autism. Brain and Cognition, 67(2), 127-39.

Senju, A., & Johnson, M.H. (2009) The eye contact effect: mechanisms and development. Trends in Cognitive Sciences, 13(3), 127-34.

Sessa, P., & Dalmaso, M. (2015) Race perception and gaze direction differently impair visual working memory for faces: An event-related potential study. Social Neuroscience, 7, 1-11.

Schroeder, C.E., Mehta, A.D., & Givre, S.J. (1998) A spatiotemporal profile of visual system activation revealed by current source density analysis in the awake macaque. Cerebral Cortex, 8, 575-92.


Simion, F., Di Giorgio, E., Leo, I., & Bardi, L. (2011) The processing of social stimuli in early infancy: from faces to biological motion perception. Progress in Brain Research, 189, 173-93.

Skrandies, W. (1990) Global field power and topographic similarity. Brain Topography, 3(1), 137-41.

Skrandies, W. (2007) The effect of stimulation frequency and retinal stimulus location on visual evoked potential topography. Brain Topography, 20(1), 15-20.

Smith, M.L., Fries, P., Gosselin, F., Goebel, R., & Schyns, P.G. (2009) Inverse mapping the neuronal substrates of face categorizations. Cerebral Cortex, 19(10), 2428-38.

Spinelli, L., Andino, S.G., Lantz, G., Seeck, M., & Michel, C.M. (2000) Electromagnetic inverse solutions in anatomically constrained spherical head models. Brain Topography, 13(2), 115-25.


Srebro, R. (1996) An iterative approach to the solution of the inverse problem. Electroencephalography and Clinical Neurophysiology, 100(1), 25-32.

Streit, M., Dammers, J., Simsek-Kraues, S., Brinkmeyer, J., Wölwer, W., & Ioannides, A. (2003) Time course of regional brain activations during facial emotion recognition in humans. Neuroscience Letters, 342(1-2), 101-4.

Strick, M., Holland, R.W., & van Knippenberg, A. (2008) Seductive eyes: attractiveness and direct gaze increase desire for associated objects. Cognition, 106(3), 1487-96.

Taylor, M.J., Edmonds, G.E., McCarthy, G., & Allison, T. (2001) Eyes first! Eye processing develops before face processing in children. Neuroreport, 12(8), 1671-6.

Tzourio-Mazoyer, N., Landeau, B., Papathanassiou, D., Crivello, F., Etard, O., Delcroix, N., Mazoyer, B., & Joliot, M. (2002) Automated anatomical labeling of activations in SPM using a macroscopic anatomical parcellation of the MNI MRI single-subject brain. Neuroimage, 15(1), 273-89.


van der Schalk, J., Hawk, S.T., Fischer, A.H., & Doosje, B. (2011) Moving faces, looking places: validation of the Amsterdam Dynamic Facial Expression Set (ADFES). Emotion, 11(4), 907-20.

VanRullen R (2011) Four common conceptual fallacies in mapping the time course of recognition. Frontiers in Psychology, 2:365.

Vaughan, H.G. Jr. (1982) The neural origins of human event-related potentials. Annals of the New York Academy of Sciences, 388:125-38.

Vuilleumier, P. (2015) Affective and motivational control of vision. Current Opinion in Neurology, 28(1), 29-35.

Vuilleumier, P., & Pourtois, G. (2007) Distributed and interactive brain mechanisms during emotion face perception: evidence from functional neuroimaging. Neuropsychologia, 45(1), 174-94.


Vuilleumier, P., George, N., Lister, V., Armony, J., & Driver, J. (2005) Effects of perceived mutual gaze and gender on face processing and recognition memory. Visual Cognition, 12(1), 85-101.

Watanabe, S., Miki, K., & Kakigi, R. (2002) Gaze direction affects face perception in humans. Neuroscience Letters, 325(3), 163-166.

Widmann, A., Schröger, E., & Maess, B. (2015) Digital filter design for electrophysiological data - a practical approach. Journal of Neuroscience Methods, 250, 34-46.


Figure Captions:

Figure 1. Experimental paradigm.

Figure 2. Trial Example: (A) direct gaze condition (B) averted gaze condition.

Figure 3. Performance accuracy (d′) (A) and reaction times (B) for face recognition. Measurements for individual subjects are plotted; asterisks (*) indicate significant differences.

Figure 4. Behavioral d′ results: hit rate (A) and false alarm rate (B). Performance is plotted for each individual; asterisks (*) indicate significant differences.

Figure 5. Results of the ERP analysis. (A) Grand average of 14 subjects (butterfly plots) and time course of Global Field Power (GFP) elicited by direct gaze (1), averted gaze (2), 'repeated' (3) and 'non-repeated' (4) stimuli. Light grey traces indicate standard errors. (B) ANOVA ERP amplitude results: main effects of Gaze and Load. Channels are reported on the vertical axis, time is indicated on the horizontal axis, and black lines indicate significant p values (p < 0.05). Topographic plots indicate the location of electrodes showing statistically significant differences between conditions.

Figure 6. EEG source imaging results. 'Averted gaze' versus 'direct gaze' condition from 100 to 120 ms, displaying increased activation at P100 of the right fusiform gyrus for faces with averted gaze, and of the left orbitofrontal gyrus for faces with direct gaze.

Figure 7. ERP waveforms filtered with a causal filter: early repetition effect. (A) The time course of a cluster of selected electrodes with their standard errors is displayed (grand average). (B) ERP amplitude comparison between 'new' and 'repeated' faces: black lines indicate significant differences. (C) Location of electrodes that revealed statistically significant differences between conditions. (D) Difference between the new and repeated conditions (blue), and the time courses of the difference for each participant.
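Figures 3 and 4 summarize behavioral performance with the signal-detection sensitivity index d′, derived from hit and false-alarm rates (Green & Swets, 1966; Macmillan & Creelman, 1990). The snippet below is only a minimal illustrative sketch of such a computation; the log-linear correction, the function name and the example trial counts are assumptions for illustration and are not taken from this study.

from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    # Sensitivity index d' = z(hit rate) - z(false-alarm rate).
    # A log-linear correction (+0.5 per cell) keeps the z-transform finite when
    # a rate equals 0 or 1; this correction is an illustrative choice only.
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical trial counts for one participant (not data from the experiment):
print(round(d_prime(hits=45, misses=5, false_alarms=8, correct_rejections=42), 2))

Reporting hit and false-alarm rates alongside d′ (Figure 4) separates sensitivity from response bias, which raw accuracy alone would conflate.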


Tables

Table 1. Amplitude results, 2 x 2 ANOVA, main effect of Gaze: p and F values.

Significant time windows (ms): 50-75, 110-140, 120-140, 170-190, 290-340, 360-400.
Electrodes (around): fz, f4, c3, c4, o1, oz, o2.
p / F values: 0.0271 / 8.5548; 0.0306 / 8.7699; 0.0195 / 12.5017; 0.0144 / 11.4187; 0.0220 / 8.1144; 0.0143 / 11.8303; 0.0228 / 8.2992; 0.0257 / 8.2820; 0.0218 / 10.2458; 0.0163 / 11.2693; 0.0355 / 7.9446; 0.0159 / 14.7480; 0.0165 / 14.7368.

Table 2. Amplitude results, 2 x 2 ANOVA, main effect of Load: p and F values.

Significant time windows (ms): 30-50, 65-105, 185-220, 235-400.
Electrodes (around), with p / F values:
f3: 0.0334 / 9.6294; 0.0108 / 18.2308
fz: 0.0053 / 29.2502
c3: 0.0124 / 18.8731
cz: 0.0105 / 21.8412
c4: 0.0225 / 11.2809; 0.0231 / 12.1757; 0.0117 / 14.0820
t3: 0.0129 / 12.0413; 0.0183 / 11.4746
t5: 0.0140 / 8.6666; 0.0214 / 7.4580; 0.0117 / 12.3370

Table 3. Amplitude results, 2 x 2 ANOVA, Gaze x Load interaction: p and F values.

Significant time windows (ms): 90-115, 180-210, 255-365.
Electrodes (around), with p / F values:
f3: 0.0168 / 10.6887
f7: 0.0253 / 9.0559
c3: 0.009 / 16.5527
cz: 0.0220 / 9.2600
c4: 0.0246 / 9.8291
t3: 0.0131 / 12.1433
o1: 0.0209 / 10.3099
oz: 0.0195 / 10.7506
o2: 0.0297 / 9.0878
pz: 0.0204 / 10.1528

Table 4. EEG source analysis results: direct gaze vs. averted gaze (Gaze main effect).

Fusiform gyrus, right (100-120 ms): p = 0.0375, t = -2.2702, Cohen's d = 0.8510
Orbital frontal gyrus, superior part, left (100-120 ms): p = 0.0408, t = 2.3153, Cohen's d = 0.0168
Inferior temporal gyrus, right (130-150 ms): p = 0.0433, t = 2.3132, Cohen's d = 1.9979


Table 5. EEG source analysis: non-repeated faces vs. repeated faces (Load main effect, 40-55 ms).

Precuneus, left: p = 0.003, t = -3.5620, Cohen's d = 0.9562
Precuneus, right: p = 0.014, t = -3.5620, Cohen's d = 0.8940
Superior parietal gyrus, left: p = 0.044, t = -2.2301, Cohen's d = 1.1471
Medial temporal lobe, left: p = 0.040, t = 2.2847, Cohen's d = 0.3729
Inferior temporal lobe, left: p = 0.002, t = 3.7693, Cohen's d = 0.6203
Posterior cingulate gyrus, left: p = 0.016, t = -2.7674, Cohen's d = 1.5348
Posterior cingulate gyrus, right: p = 0.011, t = -2.9429, Cohen's d = 1.3993
Median cingulate gyrus, left: p = 0.002, t = -3.7926, Cohen's d = 1.1568
Median cingulate gyrus, right: p = 0.019, t = -2.6800, Cohen's d = 0.8203
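Tables 4 and 5 list p, t and Cohen's d for paired contrasts of source activity between conditions across participants. The following sketch illustrates one common way to obtain such statistics from per-subject values, assuming a paired t-test and a paired-samples Cohen's d (mean difference divided by the standard deviation of the differences); the effect-size convention actually used for these tables may differ, and the input values below are invented for illustration.

import numpy as np
from scipy.stats import ttest_rel

def paired_contrast(cond_a, cond_b):
    # Paired t-test between two conditions plus a paired-samples Cohen's d.
    # This is one common convention, not necessarily the one used in the study.
    a = np.asarray(cond_a, dtype=float)
    b = np.asarray(cond_b, dtype=float)
    t_stat, p_value = ttest_rel(a, b)
    diff = a - b
    cohens_d = diff.mean() / diff.std(ddof=1)
    return p_value, t_stat, cohens_d

# Hypothetical regional source amplitudes for 14 participants in two conditions:
rng = np.random.default_rng(seed=1)
averted = rng.normal(loc=1.2, scale=0.4, size=14)
direct = averted - rng.normal(loc=0.3, scale=0.2, size=14)
print(paired_contrast(averted, direct))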
