Voluntary attention in Asperger's syndrome: Brain electrical oscillation and phase-synchronization during facial emotion recognition

Research in Autism Spectrum Disorders 13–14 (2015) 32–51

Journal homepage: http://ees.elsevier.com/RASD/default.asp

Yi-Li Tseng a,b, Han Hsuan Yang c, Alexander N. Savostyanov d,e, Vincent S.C. Chien a, Michelle Liou a,f,*

a Institute of Statistical Science, Academia Sinica, Taipei 115, Taiwan
b Department of Electrical Engineering, Fu Jen Catholic University, New Taipei City 24205, Taiwan
c Institute of Psychology, Fo Guang University, Jiaosi, Yilan County 26247, Taiwan
d State Research Institute of Physiology and Fundamental Medicine, Siberian Branch of Russian Academy of Medical Sciences, Novosibirsk, Russia
e Psychology Department, National Research Tomsk State University, Tomsk, Russia
f Imaging Research Center, Taipei Medical University, Taipei 110, Taiwan

ARTICLE INFO

Article history: Received 5 October 2014; Received in revised form 5 January 2015; Accepted 21 January 2015.

Keywords: Asperger syndrome; EEG; Facial emotion recognition; Spatial frequency; Event-related spectral perturbations (ERSP); Phase synchronization.

ABSTRACT

This study investigated electroencephalography (EEG) oscillatory activity and phase synchronization in patients with Asperger's syndrome (AS) during visual recognition of emotional faces. In the experiment, 10 AS adults (2 females, age 19.6 ± 1.96) and 10 IQ-matched controls (3 females, age 24.4 ± 3.24) participated in tasks involving emotionality evaluation of either photograph or line-drawing faces. Emotional faces elicited comparable reaction times and evaluation scores in the two groups. In the photograph task, the AS group had no visible N400 component, lower delta/theta synchronization (350–450 ms post-stimulus onset) in the temporal and occipital–parietal regions, and much weaker phase synchronization between distant scalp regions (200–500 ms post-stimulus onset) compared with the control group. In the line-drawing task, the two groups had the same degree of delta/theta synchronization in the central and occipital–parietal regions and comparable phase synchronization between scalp regions. We conclude by hypothesizing that AS patients may have structural deficits in the amygdala and its related limbic structures, sites critical for recognition of emotional faces beyond conscious awareness, but preserve intact function in the cognitive pathway, allowing them to match the behavioral performance of healthy controls through voluntary control of attention. © 2015 Elsevier Ltd. All rights reserved.

1. Introduction

Asperger's syndrome (AS) is a type of autism spectrum disorder (ASD) (Baskin, Sperber, & Price, 2006; McPartland & Klin, 2006) that is observed more frequently in males than in females (Baron-Cohen et al., 2011). Unlike children and adults with autism, those with AS display relatively higher linguistic and cognitive abilities (McPartland & Klin, 2006), but have severely impaired social understanding and reciprocity, pragmatic difficulties, and unusual circumscribed interests. The newly released Diagnostic

* Corresponding author at: Institute of Statistical Science, Academia Sinica, Taipei 115, Taiwan. Tel.: +886 936792145; fax: +886 2 27831523. E-mail address: [email protected] (M. Liou). http://dx.doi.org/10.1016/j.rasd.2015.01.003 1750-9467/© 2015 Elsevier Ltd. All rights reserved.


and Statistical Manual of Mental Disorders (DSM-5; APA, 2013) has subsumed AS into the overarching category of ASDs because distinctions between the disorders were found to be inconsistent over time. However, a recent review found approximately three times as many studies reporting significant differences between AS and autistic patients as studies reporting no difference (Tsai et al., 2013). The same review also suggested that there are more than 90 clinical variables informative for probing differences between the two types of disorders, such as emotion perception, verbal fluency, lexical processing, and pragmatic inferences.

Asperger's syndrome represents an atypical mental disturbance affecting sensory, affective and communicative abilities without interfering with normal linguistic skills and intellect. Behavioral disorders observed in AS children can be diagnosed in the first three years of life, a period during which voluntary (or conscious) control over behavior is not fully developed (APA, 1994). In AS adults, these behavioral disorders can be compensated for through attention regulation (Basar-Eroglu, Kolev, Ritter, Aksu, & Basar, 1994). We hypothesize that the violation of social behaviors in AS patients (mainly in children) could result from deficits in recognizing social stimuli beyond conscious awareness, particularly facial emotions and speech intonations, and that these deficits may be compensated for by voluntary control of attention. Facial emotion recognition is one of the most important brain processes engaged in social communication; a variety of mental disorders are related, at some level, to problems with explicit detection of emotions in faces (Kano et al., 2003; Phan, Wager, Taylor, & Liberzon, 2002; Williams et al., 2007).
Several studies have found that functional deficits in the neural circuitry of face processing important for facial emotion recognition can partly explain the social communication failure in AS patients (Behrmann, Thomas, & Humphreys, 2006; Gross, 2004). For instance, delayed or smaller early P1 and N170 components have been reported in AS children and adults during face recognition, suggesting possible deficits in early visual processing (O'Connor, Hamm, & Kirk, 2005, 2007). Moreover, AS patients show weaker early theta synchronization, attributed to disorders in the subcortical–cortical connection, and stronger late beta2 desynchronization, which may be regarded as a sign of a compensatory mechanism involved in voluntary control of attention (Yang, Savostyanov, Tsai, & Liou, 2011). Neuroimaging studies have suggested that AS patients have abnormal structure in the amygdala and relatively weaker activity in the fusiform gyrus during face recognition as compared with healthy controls (Ashwin, Baron-Cohen, Wheelwright, O'Riordan, & Bullmore, 2007; Piggot et al., 2004). Because the amygdala and fusiform gyrus (Smith et al., 2009) are connected in the emotional/motivational circuitry, the reduced cortical activity could also be a sign of structural impairment in the amygdala (Harms, Martin, & Wallace, 2010). Studies have also found reduced activation in other regions during face recognition in AS patients, such as the medial-frontal, orbito-frontal (Ashwin et al., 2007) and extrastriate cortices (Deeley et al., 2007).

Faces, like other objects or images, contain a spectrum of spatial information: high spatial frequencies (HSFs) carry highly detailed parts of an image, such as the edges of a face, whereas low spatial frequencies (LSFs) carry coarser information related to larger, less well defined parts, such as face contours.
In several neuroimaging studies, the spatial frequency of stimuli has been manipulated to examine hypotheses about brain regions selective to different frequency information in stimuli (LeDoux, 2003). For instance, processing of LSFs relies more on the magnocellular pathway, while processing of HSFs relies more on the parvocellular pathway (Dakin & Frith, 2005). Specifically, LSFs in a facial stimulus mainly activate the amygdala, pulvinar, and superior colliculus, especially with fearful facial expressions (Vuilleumier, Armony, Driver, & Dolan, 2003). These regions constitute the limbic structures involved in non-conscious perception of emotions and modulate cortical activity either directly or indirectly (Tamietto & De Gelder, 2010). HSFs, on the other hand, induce more pronounced activation in the fusiform gyrus and inferior occipital gyrus (Iidaka, Yamashita, Kashikura, & Yonekura, 2004). Difficulty with processing details within a certain spatial frequency range may therefore indicate disruption at a particular stage of information processing. Studies on ASD patients have discussed the relationship between spatial frequency and face processing. For instance, behavioral data indicate that ASD children perform better with high rather than low spatial frequency stimuli in face recognition tasks, especially in tasks involving facial expressions of emotion (Deruelle, Rondan, Gepner, & Tardif, 2004; Deruelle, Rondan, Salle-Collemiche, Bastard-Rosset, & Da Fonseca, 2008). Neuroimaging and EEG studies on ASD patients have also examined the relationship between spatial frequency and facial emotion recognition, with, however, discrepant findings depending on the demographic characteristics of participants, task demands, and types of behavioral responses (Harms et al., 2010). So far, no study has directly addressed EEG oscillatory activity and phase synchronization in AS patients during facial emotion recognition with stimuli in different spatial frequencies.
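The LSF/HSF decomposition discussed above can be illustrated with a simple Fourier-domain split of a grayscale image. This is a minimal numpy sketch; the Gaussian cutoff value is an arbitrary illustrative choice, not a parameter reported in the studies cited.

```python
import numpy as np

def split_spatial_frequencies(img, cutoff=8.0):
    """Split a grayscale image into low- and high-spatial-frequency parts
    using a Gaussian low-pass filter in the 2-D Fourier domain.

    `cutoff` is the Gaussian half-width in cycles per image (illustrative
    value only). Returns (lsf, hsf) with lsf + hsf == img."""
    h, w = img.shape
    # Frequency grids in cycles per image along each axis
    fy = np.fft.fftfreq(h)[:, None] * h
    fx = np.fft.fftfreq(w)[None, :] * w
    radius2 = fx**2 + fy**2
    lowpass = np.exp(-radius2 / (2.0 * cutoff**2))
    spectrum = np.fft.fft2(img)
    lsf = np.real(np.fft.ifft2(spectrum * lowpass))
    hsf = img - lsf  # complementary high-pass residual
    return lsf, hsf
```

Applying the split to a face photograph would yield a blurred contour image (LSF) and an edge-like image (HSF), conceptually similar to the contrast between the photograph and line-drawing stimuli used here.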
It is therefore important to examine the functional trajectory in AS patients, as compared with healthy controls, while they process facial stimuli of different spatial frequencies, controlling for task demands and demographic effects such as gender and IQ. A few interpretations have been offered of how ASD patients process emotional faces (Behrmann, Thomas, et al., 2006). First, a possible impairment in social motivation leads to difficulties in paying attention to, and in imitating, socially relevant stimuli, such as faces and emotional displays (Dawson et al., 2002; Dawson, Webb, & McPartland, 2005), which could result from atypical development of the amygdala (Schultz, 2005). Second, there may be deficits in inter-hemispheric connections, which may disturb the interplay between affective and cognitive face processing (Coben, Mohammad-Rezazadeh, & Cannon, 2014). Several studies have reported a lack of hemispheric differences in EEG spectral power in ASD patients compared with those found in healthy and mentally handicapped controls (Dawson, Warrenburg, & Fuller, 1982; Ogawa et al., 1982). Other studies have investigated cortical connectivity in ASD patients using EEG coherence measures, and all reported reduced connectivity, especially in distant connections (Cantor, Thatcher, Hrybyk, & Kaye, 1986; Coben & Myers, 2008; Lazarev, Pontes, & deAzevedo, 2009). Third, ASD patients may have deficits in early visual sensory processing, a mechanism that is more critical in face recognition (Behrmann, Avidan, et al., 2006;


Behrmann, Thomas, et al., 2006; Dakin & Frith, 2005). Other studies have also suggested a developmental delay in ASD children (Johnson et al., 2005). The three interpretations can be examined in AS patients by introducing new insights into different aspects of brain electrical activity during emotional face recognition.

In this study, we investigate EEG oscillatory activity and phase synchronization during visual recognition of emotional faces in AS patients and healthy controls. In the experiment, 10 AS adults and 10 IQ-matched healthy controls participated in emotionality evaluation tasks involving either photograph or high-spatial-frequency line-drawing faces. The differences between the two groups were assessed by computing ERPs, event-related spectral perturbations (ERSPs), and phase synchronization as measured by phase-locking values (PLVs). We proceed in the following steps. First, we offer details of the participants, experimental procedures, and analysis methods in the next section. We then provide analysis results on the behavioral responses and brain electrical reactions to emotional face evaluation in the two tasks. Finally, we discuss our experimental findings in light of the aforementioned interpretations and our hypothesis on the interplay between voluntary control and non-conscious (or affective) processes in AS patients. This discussion is important for probing the compensatory mechanism in AS participants and for understanding the neurobiological substrate of consciousness.

2. Methods

2.1. Participants

Ten adults with Asperger's syndrome (age 19.6 ± 1.96 years; 2 females¹) and 10 normal controls (age 24.4 ± 3.24; 3 females) participated in this study. All AS participants were recruited from the National Taiwan University Hospital and the Taipei City Hospital.
Clinical participants were diagnosed according to the Gillberg (1991) and DSM-IV (APA, 1994) criteria, and the diagnoses were also confirmed against the International Classification of Diseases (ICD-10; WHO, 1994) criteria. The diagnostic scales covered social inference, emotional communication, language and cognitive abilities, and motor coordination skills. Two of the AS participants had been on medication for their symptoms for two and five years, respectively, and one had been off medication for six months before the experiment. The other eight AS participants had never taken any medication. Four of the AS participants reported being hypersensitive to foods, sounds or touch. However, none of these participants had a comorbid psychiatric disorder. Verbal and performance IQ were assessed for all participants using a clinically derived short form of the Wechsler Adult Intelligence Scale (WAIS-III), which was standardized against a psychiatric population in Taiwan (Chiang et al., 2007). All participants were administered the short form of the WAIS-III individually, each session taking about 30 min. The AS and control groups were matched as closely as possible on their verbal and performance IQ scores. All participants gave informed written consent prior to the experiment, satisfying the requirements of the human participant research ethics committee/Institutional Review Board (IRB) at Academia Sinica, Taiwan.

2.2. Experimental task and procedure

A sample of 60 face stimuli selected from the Ekman database (Ekman & Friesen, 1976) was pretested on a cohort of five male AS adults (age 19.4 ± 1.14) and five male controls (age 25.8 ± 1.64), none of whom later participated in the EEG experiment.
A cognitive interview protocol was included in the pretesting phase, in which each participant went through an emotionality decision on the pretesting faces and was interviewed regarding reasonable durations of the central eye-fixation and stimulus presentation periods, as well as the suitable design of the rating scale used for emotionality evaluation. The final 30 face stimuli selected for the EEG experiment had relatively small differences in emotionality ratings and reaction times between the pretested AS and control participants, and comprised faces of five females and five males with three facial expressions (angry, happy, and neutral). Two types of emotional faces were considered in the EEG experiment: photographs and line drawings of faces. The line drawings of the 30 selected photograph faces were prepared in Adobe Photoshop 5.0 by tracing the edges of the face images. Fig. 1 illustrates four face images used in the photograph and line-drawing tasks. Before the experiment, participants read instructions shown on the screen and practiced the emotional face recognition task five times. During the experiment, each participant was seated comfortably in a chair, with eyes open, in a sound-insulated (dimly lit) chamber. The face stimuli were presented on a 24.4 cm × 18.3 cm monitor located 60 cm in front of the participant. After about 10 min of spontaneous EEG registration, participants were instructed to evaluate the emotion of the face presented on the screen. An experimental trial started with a central eye-fixation cross of 1000 ms duration, followed by an angry, neutral or happy face of 1000 ms duration in a random sequence. After the offset of a face stimulus, a horizontal line without any tick marks, except for the central and endpoints, appeared in the center of the screen, and participants could freely click on the line from very angry (scored −100 points) to very happy (scored 100 points) to indicate their decision on the emotionality of the face.
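The mapping from a click on the rating line to an emotionality score can be sketched as follows. The on-screen pixel geometry is hypothetical, since the paper reports only the endpoints of the scale (−100 to 100).

```python
def click_to_emotionality(click_x, line_left, line_right):
    """Map a click on the horizontal rating line to a score in [-100, 100]:
    the left endpoint means 'very angry' (-100), the midpoint is neutral (0),
    and the right endpoint means 'very happy' (+100).

    Pixel coordinates are illustrative placeholders."""
    frac = (click_x - line_left) / (line_right - line_left)
    frac = min(max(frac, 0.0), 1.0)  # clamp clicks to the line's extent
    return round(200.0 * frac - 100.0, 1)
```

For a 400-pixel line starting at x = 0, a click at x = 0 would score −100.0, at x = 200 would score 0.0, and at x = 400 would score 100.0.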
There were 30 trials in the photograph task and 30 trials in the line-drawing task, with inter-trial intervals randomly assigned in the range of 4–7 s. The two tasks were administered in a counter-balanced order.

1 The two groups were matched on IQ and gender; one of the female AS participants did not complete the task, and we replaced her data with data from one male AS participant.



Fig. 1. Examples of (A) a female neutral face and (B) a male neutral face used in the photograph and line-drawing tasks.

2.3. EEG recording and preprocessing

The EEG signals were recorded with an electrode cap (Quik-Cap128 NSL, NeuroScan Inc., Charlotte, NC) with 132 scalp locations, including 122 EEG (10-10 system), VEOG, HEOG, EKG, EMG, and 6 face muscle channels, referenced to Cz with ground at FzA. Signals were digitized at a rate of 1 kHz and amplified using two 64-channel amplifiers (SynAmps, NeuroScan Inc.) with 0.1–100 Hz analog band-pass filtering. On-going EEGs were epoched from 2.0 s pre-stimulus to 1.5 s post-stimulus onset. Artifacts resulting from eye movements, blinks, muscle noise, and line noise were estimated by independent component analysis (ICA). We separated brain activity from artifacts through an automatic approach based on the reference signals in the VEOG, HEOG, and EKG channels: a high multiple correlation (R2 > 0.9) between the ICA component scores and the reference signals indicated that the component was mainly contributed by artifacts and should be excluded from further analysis. EEGs from 2.0 to 1.2 s before stimulus onset were selected as the baseline for the correction of ERPs, ERSPs and phase synchronization; this interval preceded both the central eye-fixation period and the onset of the face stimuli.

2.4. ERP and ERSP analyses

ERPs and ERSPs were computed using the EEGLAB toolbox (Delorme & Makeig, 2004). In the toolbox, we applied time-frequency analysis to the EEG signals using the wavelet transformation with the Morlet wavelet. Before averaging ERPs and ERSPs across EEG channels, we partitioned the scalp channels into 11 regions: left- (10 channels), midline- (11), and right-frontal (10); left- (17) and right-temporal (17); left- (9), midline- (9) and right-central (9); left- (9), midline- (12) and right-occipital-parietal (9). ERPs and ERSPs were averaged across channels within each region for each individual participant. For each


time-frequency interval, repeated-measures MANOVA was applied to test the main effects of task (photograph vs. line-drawing), region (11 scalp regions), and group (AS vs. control), as well as the interaction effects among task, region, and group. In the statistical analysis, we also considered gender (male vs. female) as a covariate, and all main and interaction effects were controlled for gender differences. In the MANOVA layout, participants had repeated ERSP measures in the two tasks across the 11 regions, with the measures at the 11 regions nested within the two tasks. MANOVA assumes multivariate normality and an equal covariance matrix (among the 11 regions) across the AS and control groups. In general, the test is robust to departures from multivariate normality, especially with larger sample sizes and balanced designs (equal sample sizes in the groups); deviation from normality tends to make the test more conservative (i.e., less likely to show significant differences between groups). The ERSPs in our analysis were averaged across channels and time intervals, so the central limit theorem supports the normality assumption. However, the homogeneity of covariance matrices could be violated, since the AS group had larger within-group variances. In MANOVA, Pillai's trace criterion is known to be more robust to violation of the homogeneity-of-covariance assumption than Wilks' lambda. However, the sampling distributions of these criteria are not well understood, and they are commonly converted to an approximate F-ratio statistic (Tabachnick & Fidell, 1996). Wilks' and Pillai's criteria produce identical F tests when there are only two groups (i.e., AS vs. control).

2.5. Phase synchronization

The phase-locking value (PLV) is one of the most useful and robust measures for estimating functional connectivity between two time series (Lachaux, Rodriguez, Martinerie, & Varela, 1999; Sauseng, Klimesch, Gruber, & Birbaumer, 2008).
Denote a pair of band-pass filtered EEG time series in a frequency range of interest as X_j(t) and X_k(t) at time t (Johansson, 1999; Tseng, Ko, & Jaw, 2012). The analytic form of the jth time series can be obtained by the Hilbert transformation (HT) as

Z_j(t) = X_j(t) + i HT(X_j(t)),   where i = \sqrt{-1} and HT(X_j(t)) = (1/\pi) \, P.V. \int_{-\infty}^{\infty} \frac{X_j(\tau)}{t - \tau} \, d\tau,

with P.V. indicating that the integral is taken in the sense of the Cauchy principal value. The instantaneous phase is defined as \phi_j(t) = \arctan(HT(X_j(t)) / X_j(t)), with \phi_j(t) \in [-\pi, \pi), and \phi_k(t) is computed by analogy. The PLV over N epochs for this pair of channels at time t is defined as

PLV_t = \frac{1}{N} \left| \sum_{n=1}^{N} e^{i \Delta\phi_{jk}^{n}(t)} \right|,   where \Delta\phi_{jk}^{n}(t) = \phi_j^{n}(t) - \phi_k^{n}(t)

is the phase difference in the nth epoch. This measure can be used to determine the functional connectivity of oscillatory activity between two EEG channels. The average PLVs were calculated for each pair of the 122 channels in two time intervals: −2000 to −1200 ms in the baseline interval, and 200–500 ms in the post-stimulus interval. The post-stimulus interval was selected for its strong connectivity between distant channels compared with other time intervals. Each channel had a PLV with every other channel, which could be used as a feature for clustering the 122 EEG channels. We applied the Euclidean distance and a hierarchical clustering method to the averaged baseline PLV matrix in the control group. Eight clusters were selected based on the highest average silhouette value (Rousseeuw, 1987) in the hierarchical clustering tree for evaluating between-group differences. As PLV estimates are not Gaussian distributed (Aydore, Pantazis, & Leahy, 2013), we used a randomized t-test to evaluate the group difference by first averaging the PLVs across channels within each cluster for each individual participant. The 20 averaged PLVs were randomly assigned to the AS and control groups, and an independent t-test was applied to the two randomly assigned samples. The randomization procedure was repeated 10,000 times to construct the t-value distribution under the null hypothesis. The significance of an observed t-value (comparing the average PLVs between the control and AS groups) was evaluated against this null distribution.

3. Results

In this section, we present the behavioral data for the 20 participants in the EEG experiment. MANOVA tests focused on the between-group and between-task differences in the ERP and ERSP analyses.
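The PLV computation and the randomized t-test described in Section 2.5 can be sketched in a few lines of numpy. The epoch counts and effect sizes below are illustrative placeholders, not the study's data; the analytic signal is computed with the standard FFT one-sided-spectrum method.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal z(t) = x(t) + i*HT(x(t)), computed with the FFT
    method (zeroing negative frequencies), along the last axis."""
    n = x.shape[-1]
    spectrum = np.fft.fft(x, axis=-1)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spectrum * h, axis=-1)

def plv(xj, xk):
    """Phase-locking value between two channels across epochs.
    xj, xk: (N_epochs, T) band-pass filtered EEG; returns PLV(t) in [0, 1]."""
    dphi = np.angle(analytic_signal(xj)) - np.angle(analytic_signal(xk))
    return np.abs(np.mean(np.exp(1j * dphi), axis=0))

def randomized_t_test(a, b, n_perm=10000, seed=0):
    """Randomization test for a two-sample comparison (e.g., cluster-averaged
    PLVs): the observed t statistic is ranked against a null distribution
    built from random reassignments of the pooled values."""
    rng = np.random.default_rng(seed)

    def t_stat(x, y):
        return (x.mean() - y.mean()) / np.sqrt(
            x.var(ddof=1) / len(x) + y.var(ddof=1) / len(y))

    pooled = np.concatenate([a, b])
    t_obs = t_stat(a, b)
    null = np.empty(n_perm)
    for i in range(n_perm):
        perm = rng.permutation(pooled)
        null[i] = t_stat(perm[:len(a)], perm[len(a):])
    p_value = np.mean(np.abs(null) >= abs(t_obs))
    return t_obs, p_value
```

Two channels carrying the same oscillation with a constant phase lag yield a PLV near 1 regardless of epoch-to-epoch phase jitter, which is exactly the property that makes the PLV robust for between-channel synchronization.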
In the ERSP and phase-synchronization results, we focus on low-frequency oscillations in the delta/theta (1–7 Hz) range and high-frequency oscillations in the alpha/beta (8–30 Hz) range.

3.1. Behavioral data

The average verbal and performance IQ scores are listed in Table 1 for the control and AS groups. The groups are balanced, with no statistically significant differences along these dimensions. The average reaction times and the average scores assigned to the emotionality of faces are also listed in the table for the two groups. In general, the AS group had longer reaction times than the control group when evaluating angry and happy faces, and shorter reaction times for neutral faces, in both the photograph and line-drawing tasks. However, these differences are not statistically significant, as suggested by the F tests. As mentioned, AS participants have abnormal structure in the amygdala, which is known to participate in the memory of emotions other than the neutral emotion (Ilyutchenok, 1981; Kleinhans et al., 2011). Our behavioral data suggest that emotional faces could delay information selection and later retrieval in the AS participants, who had slightly longer reaction times to angry and happy faces than the controls. Scores assigned to facial emotions do not significantly differ


Table 1
Behavioral data on the Wechsler Adult Intelligence Scale-III, together with reaction times and emotionality scores assigned to face stimuli in the photograph and line-drawing tasks (mean ± SD).

Wechsler Adult Intelligence Scale-III
                        Control            Asperger           F(1,16)
  Verbal IQ             113.8 ± 5.79       108.0 ± 16.68      0.379, p = .547
  Performance IQ        117.7 ± 12.00      107.3 ± 16.6       0.256, p = .618

Photograph task: reaction time (ms)
  Angry                 4413.4 ± 762.62    4504.1 ± 596.49    0.316, p = .582
  Neutral               3744.5 ± 711.26    3580.1 ± 732.25    0.088, p = .771
  Happy                 4277.2 ± 681.67    4573.1 ± 832.82    1.786, p = .200

Photograph task: emotionality score (−100 to 100)
  Angry                 −55.0 ± 16.85      −50.3 ± 16.61      0.328, p = .575
  Neutral               6.9 ± 7.35         4.0 ± 8.48         2.142, p = .163
  Happy                 57.5 ± 15.10       57.5 ± 22.64       <0.001, p = .985

Line-drawing task: reaction time (ms)
  Angry                 3804.8 ± 557.94    4436.1 ± 744.39    4.555, p = .049
  Neutral               3491.7 ± 735.76    3484.3 ± 861.81    0.155, p = .699
  Happy                 3996.5 ± 1018.64   4548.3 ± 1277.14   0.883, p = .361

Line-drawing task: emotionality score (−100 to 100)
  Angry                 −53.0 ± 20.32      −47.7 ± 17.12      1.894, p = .188
  Neutral               8.3 ± 5.98         1.4 ± 5.46         20.138, p < .001
  Happy                 54.1 ± 12.20       56.5 ± 22.49       0.022, p = .884

between the two groups, except for the neutral faces in the line-drawing task, where the AS group has an average score near zero.

3.2. ERP data

The baseline-corrected ERPs in the 11 scalp regions are plotted in Figs. 2.1 and 2.2 for the photograph and line-drawing tasks, respectively. The time scale in the figures begins at the onset of the central eye-fixation cross (−1000 ms) and ends at 1000 ms post-stimulus onset. Both groups show ERP increases in the frontal and decreases in the occipital–parietal regions following the onset of the eye-fixation cross, consistent with recent fMRI findings of BOLD increases in the frontal and decreases in the parietal regions during the central eye-fixation period (Liou et al., 2012). In the photograph task, a strong N400 component is clearly induced by facial stimuli and is more pronounced in the frontal regions; it is, however, unique to the control group. The N170 components are comparable between the AS and control groups. In the control group, the N400 component is pronounced in the frontal, temporal and occipital–parietal regions in the photograph task, but only in the frontal regions in the line-drawing task. In the AS group, the N400 component is visible in the midline frontal region but invisible in the other regions in the photograph task; similar to the control group, the N400 component becomes visible in all frontal regions in the line-drawing task. The MANOVA results in a few selected time intervals are reported in Table 2. In the 50–150 ms post-stimulus interval, the task-by-group interaction and region effects are significant. Post hoc comparisons using Tukey's LSD test indicate that the ERP amplitude in the frontal and temporal regions is significantly larger in the photograph task than in the line-drawing task in the control group. The AS group has ERP amplitudes similar to those of the control group in the line-drawing task, but much smaller amplitudes in the photograph task.
In the 350–450 ms interval, the control group shows strong ERP decreases (N400) in the frontal, temporal and occipital–parietal regions in the photograph task, whereas the AS group shows only minor ERP decreases in the frontal and left-temporal regions, along with strong ERP increases in the occipital–parietal regions. However, the two groups have similar ERP activity in the line-drawing task. In summary, the two groups show significant differences in the early perception (50–150 ms) and later recognition of emotional faces (350–450 ms) in the photograph task (Toivonen & Rama, 2009), and have comparable ERP patterns in the line-drawing task. Photograph and line-drawing faces reach their largest ERP difference in the temporal and occipital–parietal regions in the 250–550 ms interval. The ERP differences between scalp regions are statistically significant beginning at the onset of the central-fixation cross (−1000 ms) and terminating at 550 ms post-stimulus onset. The ERP amplitude tends to be uniformly distributed over all scalp regions after 550 ms.
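As noted in Section 2.4, Wilks' and Pillai's criteria produce identical F tests for a two-group comparison such as AS vs. control, because the hypothesis SSCP matrix then has rank 1. A minimal numpy sketch with synthetic data (not the study's measurements) illustrates this equivalence:

```python
import numpy as np

def two_group_manova_stats(X1, X2):
    """Wilks' lambda and Pillai's trace for a two-group one-way MANOVA.

    With only two groups the between-group (hypothesis) SSCP matrix H has
    rank 1, so the two criteria are tied: Pillai = 1 - Wilks, and both
    convert to the same exact F with (p, n1 + n2 - p - 1) df."""
    n1, p = X1.shape
    n2, _ = X2.shape
    grand = np.vstack([X1, X2]).mean(axis=0)
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # Between-group (H) and within-group (E) sums-of-squares matrices
    H = (n1 * np.outer(m1 - grand, m1 - grand)
         + n2 * np.outer(m2 - grand, m2 - grand))
    E = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
    eig = np.linalg.eigvals(np.linalg.solve(E, H)).real
    wilks = np.prod(1.0 / (1.0 + eig))
    pillai = np.sum(eig / (1.0 + eig))
    df2 = n1 + n2 - p - 1
    f_wilks = (1.0 - wilks) / wilks * df2 / p
    f_pillai = pillai / (1.0 - pillai) * df2 / p
    return wilks, pillai, f_wilks, f_pillai
```

Running this on any two-group sample shows `wilks + pillai == 1` and identical F values from the two criteria, which is why the choice between them is immaterial for the GRP and TSK-by-GRP tests here.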


3.3. ERSP data

Figs. 3.1–3.4 present the baseline-corrected ERSPs in different scalp regions in the photograph and line-drawing tasks for the two groups. In the figures, the vertical axis depicts frequency within the 1–35 Hz range, and the horizontal axis represents time within the −1000 to 1000 ms range. In the photograph task, the control group shows strong delta/theta synchronization in the temporal, central and occipital–parietal regions in the 150–250 ms interval. The angry and happy faces mainly contribute to this delta/theta synchronization, especially in the temporal and frontal regions. In


Fig. 2.1. ERP plots in different scalp regions for the control and AS groups in the photograph task. Locations of EEG channels are shown in the left-hand side of each plot.


the same interval, the AS group does not show comparable delta/theta synchronization in the frontal and temporal regions, but has stronger beta desynchronization in the frontal and temporal regions starting at 200 ms compared with the control group in the photograph task. In the line-drawing task, however, the two groups tend to display more comparable ERSP patterns.
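The ERSPs discussed here are obtained from Morlet wavelet time-frequency decompositions, with power expressed in dB relative to the pre-stimulus baseline (Section 2.4). The following is a minimal numpy sketch; the fixed cycle count is an illustrative assumption rather than EEGLAB's actual defaults.

```python
import numpy as np

def morlet_ersp(epochs, fs, freqs, baseline, n_cycles=6.0):
    """Event-related spectral perturbation via complex Morlet wavelets.

    epochs   : array (N_epochs, T) for one channel
    fs       : sampling rate in Hz (the study digitized at 1 kHz)
    freqs    : frequencies of interest in Hz (e.g., spanning 1-35 Hz)
    baseline : (start, stop) sample indices of the pre-stimulus baseline
    Returns ERSP in dB relative to mean baseline power, shape (F, T).
    n_cycles = 6 is an illustrative choice."""
    n_epochs, n_times = epochs.shape
    power = np.zeros((len(freqs), n_times))
    for fi, f in enumerate(freqs):
        sigma_t = n_cycles / (2.0 * np.pi * f)        # wavelet width (s)
        t = np.arange(-3 * sigma_t, 3 * sigma_t, 1.0 / fs)
        wavelet = (np.exp(2j * np.pi * f * t)
                   * np.exp(-t**2 / (2.0 * sigma_t**2)))
        wavelet /= np.sqrt(np.sum(np.abs(wavelet)**2))  # unit energy
        for ep in epochs:
            coef = np.convolve(ep, wavelet, mode="same")
            power[fi] += np.abs(coef)**2
    power /= n_epochs
    base = power[:, baseline[0]:baseline[1]].mean(axis=1, keepdims=True)
    return 10.0 * np.log10(power / base)
```

A post-stimulus power increase at a given frequency (synchronization) appears as positive dB values, and a decrease (desynchronization) as negative values, matching the conventions of the figures.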


Fig. 2.2. ERP plots in different scalp regions for the control and AS groups in the line-drawing task.


Table 2 The MANOVA results on ERPs and ERSPs computed for the emotional face recognition tasks at different time intervals. The main effects include group (GRP), task (TSK), and region (RGN), and the interaction effects include TSK-by-GRP, RGN-by-GRP, TSK-by-RGN, and TSK-by-GRP-by-RGN (no significant ERP or ERSP in all intervals), respectively. The listed F values are those smaller than a = 0.05. Intervals (ms)

TSK

GRP

RGN

TSK-by-GRP

TSK-by-RGN

RGN-by-GRP

MANOVA

F1,16

F1,16

F10,7

F1,16

F10,7

F10,7

7.182 p = .008 6.152 p = .012

5.314 p = .035

Event-related potentials [50–150] [150–250] [250–350]

18.929 p < 0.001

17.831 p < .001

[350–450]

8.326 p = .005

Event-related spectral perturbations (1–7 Hz) [50–150] [150–250] [250–350]

5.739 p = .029

[350–450] [450–800]

6.851 p = .019

4.601 p = .048 14.744 p = .033 19.023 p = .011 19.860 p = .022

5.744 p = .015

4.665 p = .026 3.859 p = .043

11.725 p = .003 7.366 p = .015

6.322 p = .023

Event-related spectral perturbations (8–30 Hz) [200–800]

6.769 p = .009

We provide MANOVA tests on ERSPs in a few selected time intervals in Table 2 for slow-wave (delta/theta) and fast-wave (alpha/beta) oscillations. In the 50–150 ms interval, the control group shows stronger delta/theta synchronization in the occipital–parietal and right temporal regions than the AS group in the photograph task. In the line-drawing task, however, the AS group demonstrates stronger delta/theta synchronization in the occipital–parietal, midline/right frontal and right temporal regions when compared with the control group. The control group shows additive effects of photograph faces over the line-drawing faces in delta/theta synchronization, whereas the AS group fails to exhibit this additive effect and shows stronger delta/theta synchronization in the midline and right frontal regions in the line-drawing task. In the 150–250 ms interval, the control group displays much stronger delta/theta synchronization than the AS group, especially in the occipital–parietal regions in the photograph task. On average, delta/theta synchronization is stronger in the occipital–parietal regions than in other scalp regions, and this holds for both groups. In the 250–450 ms interval, the control group displays an additive effect of photograph faces on delta/theta synchronization, whereas the AS group does not show a comparable effect in the photograph task, especially for delta/theta synchronization in the temporal region. In the 450–800 ms interval, the additive effect vanishes in the frontal regions in the control group, whereas the AS group begins to display the additive effect in the temporal regions. In summary, delta/theta synchronization in the control group is pronounced in the 50–800 ms interval in both tasks. The occipital–parietal regions display the strongest synchronization, followed by the central and temporal regions and then by the frontal regions in the early period (50–350 ms).
The regional difference is reduced after 350 ms post-stimulus onset, and becomes very small after 550 ms. The occipital–parietal regions also demonstrate the strongest alpha/beta desynchronization in the 200–800 ms interval. In general, the photograph faces have an additive effect over the line-drawing faces in the delta/theta range, but the line-drawing faces induce stronger alpha/beta desynchronization. Fig. 4 plots the histograms for delta/theta synchronization in the 150–250 ms and 350–450 ms intervals, along with the alpha/beta desynchronization in the 200–800 ms interval. The histograms illustrate the additive effect of photograph faces on delta/theta synchronization (i.e., Fig. 4(A) and (C)), and the reduced region effect on delta/theta synchronization (i.e., Fig. 4(B)). On the other hand, the AS group shows delta/theta synchronization comparable to that of the control group in the line-drawing task, but no apparent additive effect associated with the photograph faces. As shown in Fig. 4, the AS group has stronger delta/theta synchronization in the photograph task (1) in the occipital–parietal regions in the 150–250 ms interval, and (2) in the midline frontal, midline central and midline occipital–parietal regions in the 350–450 ms interval, relative to other regions. The alpha/beta desynchronization in the AS group is similar to that of the control group (and slightly stronger) in both photograph and line-drawing tasks. Unlike the control group, the alpha/beta desynchronization in the AS group shows similar amplitude in the two tasks.
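For readers who wish to reproduce this kind of analysis, a baseline-corrected ERSP of the sort shown in Figs. 3.1–3.4 can be sketched as follows. This is a minimal illustration in Python (NumPy/SciPy) assuming an STFT time–frequency decomposition for a single channel; the paper does not specify its exact decomposition or window settings, so the function name `ersp_db` and the parameters below are illustrative only.

```python
import numpy as np
from scipy.signal import stft

def ersp_db(trials, fs, t0, baseline=(-0.9, -0.2)):
    """Baseline-corrected event-related spectral perturbation (ERSP).

    trials   : array (n_trials, n_samples), epoched EEG for one channel
    fs       : sampling rate in Hz
    t0       : time of the first sample relative to stimulus onset (s)
    baseline : pre-stimulus window (s) used as the spectral reference
    Returns (freqs, times, ersp) with power in dB relative to baseline.
    """
    nper = int(fs * 0.5)                        # ~500 ms analysis windows
    f, t, Z = stft(trials, fs=fs, nperseg=nper,
                   noverlap=nper - 8, axis=-1)  # 8-sample hop between windows
    t = t + t0                                  # re-center time on stimulus onset
    power = np.mean(np.abs(Z) ** 2, axis=0)     # average power over trials
    ref = power[:, (t >= baseline[0]) & (t < baseline[1])]
    # dB relative to the mean pre-stimulus power at each frequency
    return f, t, 10 * np.log10(power / ref.mean(axis=1, keepdims=True))
```

Positive values (red in the figures) then indicate synchronization relative to baseline, negative values (blue) desynchronization.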



Fig. 3.1. ERSP plots for the control group in the photograph task. The red color denotes power increase (synchronization), and the blue color denotes power decrease (desynchronization) compared with the baseline.




Fig. 3.2. ERSP plots for the AS group in the photograph task.



Fig. 3.3. ERSP plots for the control group in the line-drawing task.





Fig. 3.4. ERSP plots for the AS group in the line-drawing task.




Fig. 4. The ERSP distributions in different scalp regions during (A) 150–250 ms and (B) 350–450 ms post-stimulus onset in the 1–7 Hz range, and (C) 250–1000 ms in the 8–30 Hz range (LF: left frontal; MF: midline frontal; RF: right frontal; LT: left temporal; RT: right temporal; LC: left central; MC: midline central; RC: right central; LP: left parietal–occipital; MP: midline parietal–occipital; RP: right parietal–occipital). Note: The red arrows in (B) point to the midline regions in which the AS group has stronger delta/theta synchronization relative to other regions.

3.4. Phase synchronization

We plot the scalp distributions of PLVs in the eight clusters of EEG channels in Fig. 5 according to time interval, task, and group for the delta/theta range. The alpha/beta range yields similar results to those in Fig. 5 and is omitted for simplicity. It is well known that the right hemisphere is more specialized in face processing than the left hemisphere in humans (Gainotti, 2007) and in many other non-human mammals including sheep (Broad, Mimmack, & Kendrick, 2000). Right-hemisphere specialization can be found in face recognition tasks in both pre- and post-recognition time intervals (Lee, Simos, Sawrie, Martin, & Knowlton, 2005). The stronger involvement of the right hemisphere in facial processing is accompanied by more noticeable specialization between regions within this hemisphere. In our results, more functional specialization between regions was found in the right rather than the left hemisphere (i.e., channels in the right frontal and right temporal regions are separated into two clusters even during the baseline condition), supporting these earlier findings. In the figure, larger PLVs in the baseline condition are mainly distributed around the EEG channels within each cluster, possibly suggesting stronger local rather than distant connections. The randomized p-values indicate no significant difference between the AS and control groups in the baseline condition. The baseline-adjusted PLVs in the 200–500 ms interval show enhanced long-distance connections between regions, because larger PLVs are distributed away from the EEG channels independent of frequency range; this by no means implies that there is no local connection during the post-stimulus interval. The local connections become invisible after adjusting for the baseline PLVs, in which they are more pronounced.
Fig. 5 indicates that the photograph task demands inter-hemispheric connections and strong distant connections between the parietal and other regions, such as the frontal, central, and temporal regions, in the low frequency range. In the photograph task, the AS group has much weaker inter-hemispheric connections and weaker distant connections between the parietal and other regions in the low frequency range compared with the control group. The randomized p-values also indicate significant group differences in the eight EEG clusters. The line-drawing task mainly demands inter-hemispheric connections, and here the AS group begins to show distant connections, as does the control group. The randomized p-values indicate no significant group difference in the line-drawing task.
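The PLVs shown in Fig. 5 follow the standard phase-locking-value construction: extract the instantaneous phase of each channel, form the phase difference between two channels, and average the corresponding unit vectors across trials. A minimal Python sketch, assuming band-pass filtered epochs and Hilbert-transform phases (the function name `plv` is illustrative):

```python
import numpy as np
from scipy.signal import hilbert

def plv(trials_a, trials_b):
    """Phase-locking value between two channels.

    trials_a, trials_b : arrays (n_trials, n_samples), band-pass
    filtered to the frequency range of interest (e.g., 1-7 Hz).
    Returns the PLV time course, shape (n_samples,), in [0, 1].
    """
    phi_a = np.angle(hilbert(trials_a, axis=-1))   # instantaneous phase
    phi_b = np.angle(hilbert(trials_b, axis=-1))
    # length of the mean phase-difference vector across trials:
    # 1 = perfectly consistent phase lag, 0 = random phase relation
    return np.abs(np.mean(np.exp(1j * (phi_a - phi_b)), axis=0))
```

Averaging such PLVs over channel pairs within and between clusters, and subtracting the baseline-period values, yields the baseline-adjusted maps of the kind plotted in Fig. 5.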




Fig. 5. The average phase-lock values (PLVs) in the 1–7 Hz range during the baseline period (i.e., −2000 to −1200 ms relative to stimulus onset) and during the 200–500 ms post-stimulus interval in the photograph and line-drawing tasks for the eight clusters of EEG channels. Note: The EEG channels corresponding to each cluster are indicated as black dots inside each scalp map. The averaged PLVs of individual clusters are plotted in (A) for the AS group and (B) for the control group during the baseline period. Because baseline PLVs in the photograph and line-drawing tasks are similar, those listed in the figure are averaged over the two tasks. The baseline-adjusted PLVs during the 200–500 ms interval in the photograph task are plotted in (C) for the AS group and (D) for the control group. The randomized p-value listed above each cluster indicates whether the AS and control groups differ significantly in the averaged PLVs for channels classified into that cluster. The baseline-adjusted PLVs during the 200–500 ms interval in the line-drawing task are plotted in (E) for the AS group and (F) for the control group.
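The randomized p-values reported with Fig. 5 can in principle be obtained by shuffling group labels over the cluster-averaged PLVs. The paper does not detail its randomization scheme, so the following two-sample permutation test is only a plausible sketch (`permutation_pvalue` and its parameters are illustrative, not the authors' exact procedure):

```python
import numpy as np

def permutation_pvalue(group_a, group_b, n_perm=5000, seed=0):
    """Randomized (permutation) p-value for a group difference in means.

    group_a, group_b : 1-D arrays of per-subject PLVs averaged within
    one channel cluster. Group labels are shuffled n_perm times to
    build the null distribution of the absolute mean difference.
    """
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([group_a, group_b])
    n_a = len(group_a)
    observed = abs(group_a.mean() - group_b.mean())
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)          # shuffle group labels
        diff = abs(perm[:n_a].mean() - perm[n_a:].mean())
        count += diff >= observed
    return (count + 1) / (n_perm + 1)           # add-one correction
```

With 10 subjects per group, as in this study, the exact permutation distribution has C(20, 10) = 184,756 partitions, so 5000 random shuffles give a reasonable Monte Carlo approximation.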

In summary, both tasks demand local and inter-hemispheric connections in the 200–500 ms post-stimulus interval. The photograph task additionally requires distant connections between the parietal region and other scalp regions. The AS group has local connections as strong as the control group's, but no visible long-distance connections in the photograph task in either the low or high frequency range. The AS group displays inter-hemispheric connections comparable to the control group's in the line-drawing task.

4. Discussion

A cohort of pretesting participants was included in the EEG experiment to assist with selecting the experimental stimuli and the emotionality evaluation scale. The face stimuli were chosen by minimizing the response differences between the pretested AS and control participants. The evaluation scale and the duration of the central eye-fixation period were designed to facilitate behavioral responses of AS participants. A study on a single AS patient found that the eyes in a face were less diagnostic than the mouth; the participant had difficulty using HSF information in the eye regions (Curby, Schyns, Gosselin, & Gauthier, 2003). We dealt with this potential issue by selecting stimuli that contain emotional expressions identifiable by exposed/unexposed teeth or furrowed/smoothed eyebrows. The MANOVA results suggest that the experimental design was successful in equating IQ and behavioral responses between the AS and control groups, such that confounding effects unrelated to facial emotion recognition could be controlled for during the EEG experiment. Significant behavioral differences between the AS and control groups are observed only in the average scores assigned to the neutral faces in both tasks, and the AS group has, on average, shorter reaction times to the neutral faces than the control group.
The amygdala and its associated limbic structures participate in memory and retrieval of information relevant to emotions, except for the neutral emotion (Ilyutchenok, 1981; Kleinhans et al., 2011). The behavioral data thus point to a crucial role of the amygdala and its associated limbic structures in interpreting the behavioral responses of the AS group.


The recognition of facial emotions in healthy controls includes both cognitive and affective processes (Balconi, 2012; Balconi & Lucchiari, 2007; Knyazev, Bocharov, Levin, Savostyanov, & Slobodskoj-Plusnin, 2008). The affective process reflects feelings such as empathy and is related to subjective experiences with the emotional expressions of others. The cognitive process is related to recognition of faces through attentional control of details under conscious awareness. The affective process is generally faster than the cognitive process in healthy individuals. The literature suggests that AS patients remain intact in cognitive functions, but are inferior in affective processes (Duverger, Da Fonseca, Bailly, & Deruelle, 2007; Holroyd & Baron-Cohen, 1993). Moreover, the reaction-time differences between the two tasks tend to be larger in the control group than in the AS group. It is reasonable to hypothesize that the control group engaged the cognitive process more than the affective one in the line-drawing task and exerted both processes in the photograph task, whereas the AS group relied only on the cognitive process in both tasks.

4.1. Smaller P1, comparable N170 and invisible N400

The ERP results suggest that the two groups differ in the early perception (50–150 ms) and later semantic recognition (350–450 ms) of emotional faces in the photograph task. The AS group has a smaller P1 amplitude in the photograph task and a slightly larger P1 amplitude in the line-drawing task compared with the control group. Previous studies found a delayed P1 pronounced in the parietal–occipital region in AS patients, which was interpreted as an index of perceptual deficits or of a distinct modality involved in processing early visual-sensory information (O'Connor et al., 2005, 2007). The amplitude differences in the P1 between processing photograph and line-drawing faces may reflect the uniqueness of AS participants in the perception of HSFs and LSFs (Deruelle et al., 2004).
In our interpretation, the AS participants have sensory deficits in the perception of the coarse part of a face stimulus, which is critical for recognition of emotional expressions, but they have no such deficits in recognition of line-drawing faces. While other studies found a delayed N170 in AS patients (O'Connor et al., 2005, 2007), the AS group in our study does not show a significant delay in the N170 compared with the control group. The N170 is considered to be induced by the presence of a human face in the visual field, without a direct relationship with face recognition. It has been shown that the N170 is unaffected by manipulations of face format, such as photograph versus contour faces (Bentin & Deouell, 2000). Since the face stimuli selected in our study elicited similar behavioral responses between the AS and control participants, our findings closely agree with those reported for healthy controls in the literature when comparing ERPs between the two tasks and between the two groups in the 150–250 ms interval. Previous EEG studies on AS patients have focused on the P1 and N170. A contribution we make is in demonstrating a significant difference in the N400 component between the two groups. In contrast to the N170, the later N400 component is strongly affected by the emotional content, familiarity and global/local features of faces (Bentin & Deouell, 2000). In our study, the N400 (350–450 ms) in the frontal and temporal regions is highly visible in the control group and almost invisible in the AS group in the photograph task. Semantic features in words or sentences are known to modulate the N400, reflecting a post-lexical process of semantic integration (Rugg, 1990). The absence of the N400 during recognition of language stimuli is interpreted as an index of the failure to associate the components of a stimulus into a complete semantic structure (Colin, Zuinen, Bayard, & Leybaert, 2013).
In facial emotion recognition, the N400 can be interpreted as a process of searching for a link between a face and its semantic interpretation in terms of different emotion categories (angry, neutral and happy). In the control group, the ERP difference between the two tasks in the 350–450 ms interval is consistent with findings by others: the amygdala is more active to intact fearful faces or fearful faces containing only LSFs (Vuilleumier et al., 2003; Vuilleumier & Pourtois, 2007). As most of the LSF, or coarse, content was removed from the face stimulus in the line-drawing task, findings from the control group indicate that the N400 is much smaller in the occipital–parietal region and almost invisible in the temporal regions compared with the photograph task. Because information processing of line-drawing faces depends less on the non-conscious function of the amygdala, the AS participants show ERP patterns more comparable to those of the healthy controls in the later (350–450 ms) stages of emotional face recognition. According to the behavioral data, the N400 could partially determine the efficiency of associating and retrieving semantic links to faces during emotionality evaluation, rather than the response accuracy, because the AS group accomplished the emotionality evaluation task correctly without a visible N400 in the photograph task. How non-conscious motivation and affective information improve efficiency in the cognitive system is an intriguing question and remains to be answered (Milyavsky, Hassin, & Schul, 2012). Since the coarse part was removed from the line-drawing faces, the N400 amplitude is reduced in the control group such that the two groups have comparable ERPs.
While the LSFs mainly activate non-conscious structures such as the amygdala, pulvinar and superior colliculus in the perception of emotions (Tamietto & De Gelder, 2010), the reduced N400 suggests that information processing through the amygdala and its associated limbic structures could play a crucial role in modulating the amplitude of the N400 component.

4.2. Weaker delta/theta synchronization

ERSPs further indicate that the AS group has much weaker synchronization in delta/theta rhythms in the early and later stages of emotional face recognition. In the literature, both animal and human EEG experiments suggest that theta synchronization is associated with limbic-cortical connections (Molle, Marshall, Fehm, & Born, 2002; Pare, 2003; Pare & Collins, 2000; Seidenbecher, Laxmi, Stork, & Pape, 2003). Delta oscillation represents a motivational state also related to

48

Y.-L. Tseng et al. / Research in Autism Spectrum Disorders 13–14 (2015) 32–51

limbic-cortical connections (Knyazev & Slobodskoj-Plusnin, 2007; Knyazev, Slobodskoj-Plusnin, & Bocharov, 2009). In humans, the non-conscious representation of emotional and motivational significance in faces is mediated by the amygdala and orbitofrontal cortex (Balconi & Pozzoli, 2008, 2009; Whalen et al., 1998), in direct contrast to the conscious representation, which is pronounced in the anterior cingulate as well as the prefrontal and somatosensory cortices. Moreover, delta/theta synchronization is more associated with non-conscious than with conscious face recognition (Basar, Guntekin, & Oniz, 2006). Therefore, weaker delta/theta synchronization suggests deficits in non-conscious processing of emotional expressions and a failure of the limbic-cortical projection in the AS patients. Fig. 4 shows that delta/theta synchronization is slightly more pronounced in the midline frontal, midline central and midline occipital–parietal regions relative to other scalp regions in the AS group in the 350–450 ms interval in both tasks. These midline regions are closely related to the aforementioned cortical structures associated with conscious representation of emotional significance (Balconi, 2012). Because the cognitive or conscious pathway is still mediated by limbic structures such as the thalamus, we can hypothesize that the AS group counted on the conscious pathway more than the non-conscious pathway when responding to photograph and line-drawing faces. In the control group, the delta/theta power is strongest in the parietal–occipital regions time-locked to stimulus onset and increases in the frontal regions at a later stage in the photograph task. However, the spatial distribution of delta/theta power in the line-drawing task becomes closer to that of the AS group. We hypothesize that the control participants engaged the conscious and non-conscious pathways in the photograph task, and counted on the conscious pathway in the line-drawing task.
When comparing ERSPs between the two tasks, the control group additionally shows an additive effect on delta/theta synchronization during the 50–450 ms post-stimulus interval, independent of brain regions and of the mechanisms elicited by facial emotions. The coarse part of a face seems to place a constant load on the information flow, which can be easily bypassed through voluntary attention to details in a face stimulus, as suggested by the AS participants, who evaluated facial emotions successfully in the photograph task. Strong alpha and beta oscillations indicate functional processes in the neocortex associated with attention, semantic long-term memory, and cognitive estimations of stimuli (Anokhin & Vogel, 1996; Klimesch, 1999). In the face recognition task, alpha/beta desynchronization reflects the level of voluntary attention to visual stimuli and is associated with cognitive appraisal of facial emotions (Balconi, 2012; Knyazev et al., 2008, 2009). In our study, we find no evidence of a task or group effect in higher frequency oscillations (alpha and beta) except for regional differences, when comparing the parietal–occipital region with other regions. Alpha desynchronization reflects attention and a release from inhibited processes in complicated tasks (Klimesch, Sauseng, & Hanslmayr, 2007). Beta oscillation, however, is seldom observed in emotion-related tasks (Balconi, Brambilla, & Falbo, 2009; Knyazev & Slobodskoj-Plusnin, 2007). A previous EEG experiment reported stronger beta2 desynchronization in the later stage in AS participants compared with healthy controls (Yang et al., 2011). In Figs. 3.1–3.4, our results indicate that beta desynchronization in the AS group is generally stronger than in the control group in both tasks, but the differences are not significant. The ERSPs suggest that the AS group has much weaker delta/theta power, but stronger alpha/beta power, compared with the control group.
We hypothesize that AS participants could direct their attention to important details in faces through cognitive appraisal of visual stimuli to compensate for sensory and affective deficits.

4.3. Impaired distant connections between scalp regions

The AS group lacks distant connections among scalp regions in the photograph task, but has inter-hemispheric connections similar to the control group's in the line-drawing task. Control participants show inter-hemispheric connections and distant connections between the parietal region and all other regions in the photograph task. In emotional face recognition, the parietal region is responsible for comparing visual images with images in memory (O'Connor, Han, & Dobbins, 2010), and has functional connections with the prefrontal, temporal (left and right) and parietal regions as well as the limbic structures (Leung & Alain, 2011; Sestieri, Corbetta, Romani, & Shulman, 2011). The functional complexity of the amygdala is manifested in its vast connections with other brain structures; for instance, the amygdala projects to much of the neocortex, including the temporal gyrus, supra-marginal gyrus, inferior parietal lobule, temporo-parietal junction and precuneus (Adolphs, 2002). Several of these regions are linked to emotion processing and high-level social functions. Individuals with schizophrenia were found to have reduced connectivity from the amygdala to areas including the parietal lobe and precuneus during the processing of fearful facial expressions (Mukherjee et al., 2012). Reduced connectivity between the amygdala and parietal regions is suggested to contribute to abnormalities in social behavior.
Although the amygdala and its associated limbic structures participate in both conscious and non-conscious processes, a failure in the non-conscious processing of emotional faces could affect the distant connections between the parietal region and other scalp regions in the AS participants, especially in the photograph task. It is well known that inter-hemispheric interaction facilitates face processing (Compton, Wilson, & Wolf, 2004). According to many different findings, the configural processing of faces is mediated by the right hemisphere, while the analytical processing is mediated by the left hemisphere (Bourne, 2005; Parkin & Williamson, 1987; Rhodes, 1993; Ross & Turkewitz, 1981). Additionally, connectivity between the right and left hemispheres has been studied in patients with impaired emotional perception, such as schizophrenia and alexithymia (Aftanas, Varlamov, Reva, & Pavlov, 2003; Kano et al., 2003; Kohler, Walker, Martin, Healey, & Moberg, 2010; Schafer et al., 2007; Williams et al., 2007). A decrease in inter-hemispheric connections is reported in patients with disorders of emotion perception, despite their intact


cognitive ability (Chernigovskaya, 2009; Chernigovskaya & Deglin, 1986). Neuroimaging studies have found weak connectivity in ASD patients within the frontal, temporal, and occipital regions (Kana, Keller, Cherkassky, Minshew, & Just, 2009; Solomon et al., 2009), between the frontal and parietal regions (Just, Cherkassky, Keller, Kana, & Minshew, 2007; Kana, Keller, Cherkassky, Minshew, & Just, 2006; Liu, Cherkassky, Minshew, & Just, 2011), and in inter-hemispheric distant connections (Anderson et al., 2010). Unlike neuroimaging studies on ASD, our study demonstrates that AS participants have normal within-region connectivity in both photograph and line-drawing tasks, and normal inter-hemispheric connections in the line-drawing task. The long-range connection between hemispheres has been considered an index of integration processes in the associative cortex, including the superior parietal lobule, frontal insula and posterior lateral frontal lobe, where activities are strengthened during information integration (Anderson et al., 2010). The AS participants display weak inter-hemispheric connections in the photograph task, which may result from deficits in the integration of affective and cognitive information.

5. Conclusion

This study included a cohort of pretesting participants for planning the experiment and used a high-resolution EEG system to record evoked potential data. After careful correction of EEG artifacts, the preprocessed data were analyzed with different methods. The results of the behavioral data, ERPs, ERSPs and phase synchronization converge on the following findings: AS patients have deficits in affective control of emotion-related information, which impacts the response efficiency in emotionality evaluations, but not the accuracy. The AS patients counted on voluntary control of attention to details in faces to maintain behavioral responses equivalent to those of the healthy controls in the two tasks.
The amygdala and its associated limbic structures are the key to understanding emotional and social problems in AS patients, especially in AS children. The neurobiological explanation of consciousness is a most intriguing topic for modern researchers. Despite a long-standing interest, we are still unclear about what consciousness is precisely and whether it differs from other information processing in the brain, such as sensory perception or affective experience, in part because of a lack of perfect paradigms dissociating the conscious process from the non-conscious one. In the last decade, research interest has grown considerably, due partly to the continuing efforts of Crick and Koch (Crick & Koch, 1990, 1995, 1998). Based on our research findings, it is possible to associate consciousness with voluntary attention to external stimuli (or internal states), allowing for perception of a particular part of a stimulus while ignoring others; it also allows for subjective appraisal and re-appraisal of stimuli according to motivational changes. Furthermore, consciousness is related to the use of semantic estimations of stimuli when the sensory signals need to be interpreted by language categories. An important contribution of this study is to clarify the interplay between conscious and non-conscious controls in facial emotion recognition by comparing brain evoked potentials between AS adults and IQ-matched healthy controls. Except for neutral stimuli, the coarse part of a face is processed in the early stage, in the 50–150 ms interval, which could be the starting zone of the non-conscious processing of affective information. The cognitive and affective information are integrated in the later stage, in the 350–450 ms interval, for semantic interpretation of facial emotions. The non-conscious portion of the affective information only contributes an additive effect to the integrated information.
We hypothesize that the amygdala and its associated limbic structures might be the center of the non-conscious circuitry in facial emotion recognition that retrieves useful information from working and long-term memory such that emotionality evaluation can be accomplished more efficiently. We conclude this study by recommending an experimental paradigm that incorporates the unique features of AS children and adults for probing information processing in the brain at the conscious and non-conscious levels in emotional and language networks.

Acknowledgements

This research was supported by grants NSC101-2811-H-001-020 and NSC101-2410-H-001-032.

References

Adolphs, R. (2002). Neural systems for recognizing emotion. Current Opinion in Neurobiology, 12, 169–177.
Aftanas, L. I., Varlamov, A. A., Reva, N. V., & Pavlov, S. V. (2003). Disruption of early event-related theta synchronization of human EEG in alexithymics viewing affective pictures. Neuroscience Letters, 340, 57–60.
American Psychiatric Association (1994). Diagnostic and statistical manual of mental disorders (4th ed.). Arlington, VA: American Psychiatric Publishing.
American Psychiatric Association (2013). Diagnostic and statistical manual of mental disorders (5th ed.). Arlington, VA: American Psychiatric Publishing.
Anderson, J. S., Druzgal, T. J., Froehlich, A., DuBray, M. B., Lange, N., Alexander, A. L., et al. (2010). Decreased interhemispheric functional connectivity in autism. Cerebral Cortex, bhq190.
Anokhin, A., & Vogel, F. (1996). EEG alpha rhythm frequency and intelligence in normal adults. Intelligence, 23, 1–14.
Ashwin, C., Baron-Cohen, S., Wheelwright, S., O'Riordan, M., & Bullmore, E. T. (2007). Differential activation of the amygdala and the 'social brain' during fearful face-processing in Asperger syndrome. Neuropsychologia, 45, 2–14.
Aydore, S., Pantazis, D., & Leahy, R. M. (2013). A note on the phase locking value and its properties. NeuroImage, 74, 231–244.
Balconi, M.
(2012). Neuropsychology of facial expressions. The role of consciousness in processing emotional faces. Neuropsychological Trends, 11, 19–40.
Balconi, M., Brambilla, E., & Falbo, L. (2009). Appetitive vs. defensive responses to emotional cues, autonomic measures and brain oscillation modulation. Brain Research, 1296, 72–84.
Balconi, M., & Lucchiari, C. (2007). Consciousness and emotional facial expression recognition – Subliminal/supraliminal stimulation effect on N200 and P300 ERPs. Journal of Psychophysiology, 21, 100–108.


Balconi, M., & Pozzoli, U. (2008). Event-related oscillations (ERO) and event-related potentials (ERP) in emotional face recognition: A regression analysis. International Journal of Neuroscience, 118, 1412–1424.
Balconi, M., & Pozzoli, U. (2009). Arousal effect on emotional face comprehension: Frequency band changes in different time intervals. Physiology & Behavior, 97, 455–462.
Baron-Cohen, S., Lombardo, M. V., Auyeung, B., Ashwin, E., Chakrabarti, B., & Knickmeyer, R. (2011). Why are autism spectrum conditions more prevalent in males? PLoS Biology, 9, e1001081.
Basar-Eroglu, C., Kolev, V., Ritter, B., Aksu, F., & Basar, E. (1994). EEG, auditory evoked potentials and evoked rhythmicities in three-year-old children. International Journal of Neuroscience, 75, 239–255.
Basar, E., Guntekin, B., & Oniz, A. (2006). Principles of oscillatory brain dynamics and a treatise of recognition of faces and facial expressions. Event-Related Dynamics of Brain Oscillations, 159, 43–62.
Baskin, J. H., Sperber, M., & Price, B. H. (2006). Asperger syndrome revisited. Reviews in Neurological Diseases, 3, 1–7.
Behrmann, M., Avidan, G., Leonard, G. L., Kimchi, R., Luna, B., Humphreys, K., et al. (2006). Configural processing in autism and its relationship to face processing. Neuropsychologia, 44, 110–129.
Behrmann, M., Thomas, C., & Humphreys, K. (2006). Seeing it differently: Visual processing in autism. Trends in Cognitive Sciences, 10, 258–264.
Bentin, S., & Deouell, L. Y. (2000). Structural encoding and identification in face processing: ERP evidence for separate mechanisms. Cognitive Neuropsychology, 17, 35–55.
Bourne, V. J. (2005). Lateralised processing of positive facial emotion: Sex differences in strength of hemispheric dominance. Neuropsychologia, 43, 953–956.
Broad, K. D., Mimmack, M. L., & Kendrick, K. M. (2000). Is right hemisphere specialization for face discrimination specific to humans? European Journal of Neuroscience, 12, 731–741.
Cantor, D. S., Thatcher, R. W., Hrybyk, M., & Kaye, H. (1986). Computerized EEG analyses of autistic children. Journal of Autism and Developmental Disorders, 16, 169–187.
Chernigovskaya, T. (2009). From communication signals to human language and thought: Evolution or revolution? Neuroscience and Behavioral Physiology, 39, 785–792.
Chernigovskaya, T. V., & Deglin, V. L. (1986). Brain functional asymmetry and neural organization of linguistic competence. Brain and Language, 29, 141–153.
Chiang, S. K., Tam, W. C., Pan, N. C., Chang, C. C., Chen, Y. C., Pyng, L. Y., et al. (2007). The appropriateness of Blyler's and four subtests of the short form of the Wechsler Adult Intelligence Scale-III for chronic schizophrenia. Taiwanese Journal of Psychiatry, 21, 26–36.
Coben, R., Mohammad-Rezazadeh, I., & Cannon, R. L. (2014). Using quantitative and analytic EEG methods in the understanding of connectivity in autism spectrum disorders: A theory of mixed over- and under-connectivity. Frontiers in Human Neuroscience, 8.
Coben, R., & Myers, T. E. (2008). Connectivity theory of autism: Use of connectivity measures in assessing and treating autistic disorders. Journal of Neurotherapy, 12, 161–179.
Colin, C., Zuinen, T., Bayard, C., & Leybaert, J. (2013). Phonological processing of rhyme in spoken language and location in sign language by deaf and hearing participants: A neurophysiological study. Neurophysiologie Clinique/Clinical Neurophysiology, 43, 151–160.
Compton, R. J., Wilson, K., & Wolf, K. (2004). Mind the gap: Interhemispheric communication about emotional faces. Emotion, 4, 219.
Crick, F., & Koch, C. (1990). Towards a neurobiological theory of consciousness. In Seminars in the Neurosciences (Vol. 2, pp. 263–275). Saunders Scientific Publications.
Crick, F., & Koch, C. (1995). Are we aware of neural activity in primary visual cortex? Nature, 375, 121–123.
Crick, F., & Koch, C. (1998). Consciousness and neuroscience. Cerebral Cortex, 8, 97–107.
Curby, K. M., Schyns, P. G., Gosselin, F., & Gauthier, I. (2003). Face-selective fusiform activation in Asperger's syndrome: A matter of tuning to the right (spatial) frequency. Poster presented at Cognitive Neuroscience, New York.
Dakin, S., & Frith, U. (2005). Vagaries of visual perception in autism. Neuron, 48, 497–507.
Dawson, G., Carver, L., Meltzoff, A. N., Panagiotides, H., McPartland, J., & Webb, S. J. (2002). Neural correlates of face and object recognition in young children with autism spectrum disorder, developmental delay, and typical development. Child Development, 73, 700–717.
Dawson, G., Warrenburg, S., & Fuller, P. (1982). Cerebral lateralization in individuals diagnosed as autistic in early childhood. Brain and Language, 15, 353–368.
Dawson, G., Webb, S. J., & McPartland, J. (2005). Understanding the nature of face processing impairment in autism: Insights from behavioral and electrophysiological studies. Developmental Neuropsychology, 27, 403–424.
Deeley, Q., Daly, E. M., Surguladze, S., Page, L., Toal, F., Robertson, D., et al. (2007). An event related functional magnetic resonance imaging study of facial emotion processing in Asperger syndrome. Biological Psychiatry, 62, 207–217.
Delorme, A., & Makeig, S. (2004). EEGLAB: An open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. Journal of Neuroscience Methods, 134, 9–21.
Deruelle, C., Rondan, C., Gepner, B., & Tardif, C. (2004). Spatial frequency and face processing in children with autism and Asperger syndrome. Journal of Autism and Developmental Disorders, 34, 199–210.
Deruelle, C., Rondan, C., Salle-Collemiche, X., Bastard-Rosset, D., & Da Fonseca, D. (2008). Attention to low- and high-spatial frequencies in categorizing facial identities, emotions and gender in children with autism. Brain and Cognition, 66, 115–123.
Duverger, H., Da Fonseca, D., Bailly, D., & Deruelle, C. (2007). Theory of mind in Asperger syndrome. L'Encephale, 33, 592–597.
Ekman, P., & Friesen, W. V. (1976). Pictures of facial affect. Palo Alto: Consulting Psychologist Press.
Gainotti, G. (2007). Face familiarity feelings, the right temporal lobe and the possible underlying neural mechanisms. Brain Research Reviews, 56, 214–235.
Gillberg, C. (1991). Clinical and neurobiological aspects of Asperger's syndrome in six families studied. In Autism and Asperger's syndrome (pp. 122–146). Cambridge: Cambridge University Press.
Gross, T. F. (2004). The perception of four basic emotions in human and nonhuman faces by children with autism and other developmental disabilities. Journal of Abnormal Child Psychology, 32, 469–480.
Harms, M. B., Martin, A., & Wallace, G. L. (2010). Facial emotion recognition in autism spectrum disorders: A review of behavioral and neuroimaging studies. Neuropsychology Review, 20, 290–322.
Holroyd, S., & Baron-Cohen, S. (1993). Brief report: How far can people with autism go in developing a theory of mind? Journal of Autism and Developmental Disorders, 23, 379–385.
Iidaka, T., Yamashita, K., Kashikura, K., & Yonekura, Y. (2004). Spatial frequency of visual image modulates neural responses in the temporo-occipital lobe: An investigation with event-related fMRI. Cognitive Brain Research, 18, 196–204.
Ilyutchenok, R. Y. (1981). Emotions and conditioning mechanisms. Integrative Physiological and Behavioral Science, 16, 194–203.
Johansson, M. (1999). The Hilbert transform. Vaxjo University.
Johnson, M. H., Griffin, R., Csibra, G., Halit, H., Farroni, T., De Haan, M., et al. (2005). The emergence of the social brain network: Evidence from typical and atypical development. Development and Psychopathology, 17, 599–619.
Just, M. A., Cherkassky, V. L., Keller, T. A., Kana, R. K., & Minshew, N. J. (2007). Functional and anatomical cortical underconnectivity in autism: Evidence from an fMRI study of an executive function task and corpus callosum morphometry. Cerebral Cortex, 17, 951–961.
Kana, R. K., Keller, T. A., Cherkassky, V. L., Minshew, N. J., & Just, M. A. (2006). Sentence comprehension in autism: Thinking in pictures with decreased functional connectivity. Brain, 129, 2484–2493.
Kana, R. K., Keller, T. A., Cherkassky, V. L., Minshew, N. J., & Just, M. A. (2009). Atypical frontal-posterior synchronization of theory of mind regions in autism during mental state attribution. Social Neuroscience, 4, 135–152.
Kano, M., Fukudo, S., Gyoba, J., Kamachi, M., Tagawa, M., Mochizuki, H., et al. (2003). Specific brain processing of facial expressions in people with alexithymia: An H2 15O-PET study. Brain, 126, 1474–1484.
Kleinhans, N. M., Richards, T., Johnson, L. C., Weaver, K. E., Greenson, J., Dawson, G., et al. (2011). fMRI evidence of neural abnormalities in the subcortical face processing system in ASD. NeuroImage, 54, 697–704.


Klimesch, W. (1999). EEG alpha and theta oscillations reflect cognitive and memory performance: A review and analysis. Brain Research Reviews, 29, 169–195.
Klimesch, W., Sauseng, P., & Hanslmayr, S. (2007). EEG alpha oscillations: The inhibition-timing hypothesis. Brain Research Reviews, 53, 63–88.
Knyazev, G. G., Bocharov, A. V., Levin, E. A., Savostyanov, A. N., & Slobodskoj-Plusnin, J. Y. (2008). Anxiety and oscillatory responses to emotional facial expressions. Brain Research, 1227, 174–188.
Knyazev, G. G., & Slobodskoj-Plusnin, J. Y. (2007). Behavioural approach system as a moderator of emotional arousal elicited by reward and punishment cues. Personality and Individual Differences, 42, 49–59.
Knyazev, G. G., Slobodskoj-Plusnin, J. Y., & Bocharov, A. V. (2009). Event-related delta and theta synchronization during explicit and implicit emotion processing. Neuroscience, 164, 1588–1600.
Kohler, C. G., Walker, J. B., Martin, E. A., Healey, K. M., & Moberg, P. J. (2010). Facial emotion perception in schizophrenia: A meta-analytic review. Schizophrenia Bulletin, 36, 1009–1019.
Lachaux, J. P., Rodriguez, E., Martinerie, J., & Varela, F. J. (1999). Measuring phase synchrony in brain signals. Human Brain Mapping, 8, 194–208.
Lazarev, V. V., Pontes, A., & deAzevedo, L. C. (2009). EEG photic driving: Right-hemisphere reactivity deficit in childhood autism. A pilot study. International Journal of Psychophysiology, 71, 177–183.
LeDoux, J. (2003). The emotional brain, fear, and the amygdala. Cellular and Molecular Neurobiology, 23, 727–738.
Lee, D., Simos, P., Sawrie, S. M., Martin, R. C., & Knowlton, R. C. (2005). Dynamic brain activation patterns for face recognition: A magnetoencephalography study. Brain Topography, 18, 19–26.
Leung, A. W., & Alain, C. (2011). Working memory load modulates the auditory "What" and "Where" neural networks. NeuroImage, 55, 1260–1269.
Liou, M., Savostyanov, A. N., Simak, A. A., Wu, W.-C., Huang, C.-T., & Cheng, P. E. (2012). An information system in the brain: Evidence from fMRI BOLD responses. Chinese Journal of Psychology, 54, 1–26.
Liu, Y. N., Cherkassky, V. L., Minshew, N. J., & Just, M. A. (2011). Autonomy of lower-level perception from global processing in autism: Evidence from brain activation and functional connectivity. Neuropsychologia, 49, 2105–2111.
McPartland, J., & Klin, A. (2006). Asperger's syndrome. Adolescent Medicine Clinics, 17, 771–788.
Milyavsky, M., Hassin, R. R., & Schul, Y. (2012). Guess what? Implicit motivation boosts the influence of subliminal information on choice. Consciousness and Cognition, 21, 1232–1241.
Molle, M., Marshall, L., Fehm, H. L., & Born, J. (2002). EEG theta synchronization conjoined with alpha desynchronization indicate intentional encoding. European Journal of Neuroscience, 15, 923–928.
Mukherjee, P., Whalley, H. C., McKirdy, J. W., McIntosh, A. M., Johnstone, E. C., Lawrie, S. M., et al. (2012). Lower effective connectivity between amygdala and parietal regions in response to fearful faces in schizophrenia. Schizophrenia Research, 134, 118–124.
O'Connor, A. R., Han, S., & Dobbins, I. G. (2010). The inferior parietal lobule and recognition memory: Expectancy violation or successful retrieval? Journal of Neuroscience, 30, 2924–2934.
O'Connor, K., Hamm, J. P., & Kirk, I. J. (2005). The neurophysiological correlates of face processing in adults and children with Asperger's syndrome. Brain and Cognition, 59, 82–95.
O'Connor, K., Hamm, J. P., & Kirk, I. J. (2007). Neurophysiological responses to face, facial regions and objects in adults with Asperger's syndrome: An ERP investigation. International Journal of Psychophysiology, 63, 283–293.
Ogawa, T., Sugiyama, A., Ishiwa, S., Suzuki, M., Ishihara, T., & Sato, K. (1982). Ontogenic development of EEG-asymmetry in early infantile autism. Brain and Development, 4, 439–449.
Pare, D. (2003). Role of the basolateral amygdala in memory consolidation. Progress in Neurobiology, 70, 409–420.
Pare, D., & Collins, D. R. (2000). Neuronal correlates of fear in the lateral amygdala: Multiple extracellular recordings in conscious cats. Journal of Neuroscience, 20, 2701–2710.
Parkin, A. J., & Williamson, P. (1987). Cerebral lateralisation at different stages of facial processing. Cortex, 23, 99–110.
Phan, K. L., Wager, T., Taylor, S. F., & Liberzon, I. (2002). Functional neuroanatomy of emotion: A meta-analysis of emotion activation studies in PET and fMRI. NeuroImage, 16, 331–348.
Piggot, J., Kwon, H., Mobbs, D., Blasey, C., Lotspeich, L., Menon, V., et al. (2004). Emotional attribution in high-functioning individuals with autistic spectrum disorder: A functional imaging study. Journal of the American Academy of Child and Adolescent Psychiatry, 43, 473–480.
Rhodes, G. (1993). Configural coding, expertise, and the right hemisphere advantage for face recognition. Brain and Cognition, 22, 19–41.
Ross, P., & Turkewitz, G. (1981). Individual differences in cerebral asymmetries for facial recognition. Cortex, 17, 199–213.
Rousseeuw, P. J. (1987). Silhouettes: A graphical aid to the interpretation and validation of cluster analysis. Journal of Computational and Applied Mathematics, 20, 53–65.
Rugg, M. D. (1990). Event-related brain potentials dissociate repetition effects of high- and low-frequency words. Memory & Cognition, 18, 367–379.
Sauseng, P., Klimesch, W., Gruber, W. R., & Birbaumer, N. (2008). Cross-frequency phase synchronization: A brain mechanism of memory matching and attention. NeuroImage, 40, 308–317.
Schafer, R., Popp, K., Jorgens, S., Lindenberg, R., Franz, M., & Seitz, R. J. (2007). Alexithymia-like disorder in right anterior cingulate infarction. Neurocase, 13, 201–208.
Schultz, R. T. (2005). Developmental deficits in social perception in autism: The role of the amygdala and fusiform face area. International Journal of Developmental Neuroscience, 23, 125–141.
Seidenbecher, T., Laxmi, T. R., Stork, O., & Pape, H. C. (2003). Amygdalar and hippocampal theta rhythm synchronization during fear memory retrieval. Science, 301, 846–850.
Sestieri, C., Corbetta, M., Romani, G. L., & Shulman, G. L. (2011). Episodic memory retrieval, parietal cortex, and the default mode network: Functional and topographic analyses. Journal of Neuroscience, 31, 4407–4420.
Smith, C. D., Lori, N. F., Akbudak, E., Sorar, E., Gultepe, E., Shimony, J. S., et al. (2009). MRI diffusion tensor tracking of a new amygdalo-fusiform and hippocampo-fusiform pathway system in humans. Journal of Magnetic Resonance Imaging, 29, 1248–1261.
Solomon, M., Ozonoff, S. J., Ursu, S., Ravizza, S., Cummings, N., Ly, S., et al. (2009). The neural substrates of cognitive control deficits in autism spectrum disorders. Neuropsychologia, 47, 2515–2526.
Tabachnick, B. G., & Fidell, L. S. (1996). Using multivariate statistics. New York: HarperCollins College.
Tamietto, M., & De Gelder, B. (2010). Neural bases of the non-conscious perception of emotional signals. Nature Reviews Neuroscience, 11, 697–709.
Toivonen, M., & Rama, P. (2009). N400 during recognition of voice identity and vocal affect. NeuroReport, 20, 1245–1249.
Tsai, A. C., Savostyanov, A. N., Wu, A., Evans, J. P., Chien, V. S., Yang, H.-H., et al. (2013). Recognizing syntactic errors in Chinese and English sentences: Brain electrical activity in Asperger's syndrome. Research in Autism Spectrum Disorders, 7, 889–905.
Tseng, Y.-L., Ko, P.-Y., & Jaw, F.-S. (2012). Detection of the third and fourth heart sounds using Hilbert-Huang transform. Biomedical Engineering Online, 11, 1–13.
Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2003). Distinct spatial frequency sensitivities for processing faces and emotional expressions. Nature Neuroscience, 6, 624–631.
Vuilleumier, P., & Pourtois, G. (2007). Distributed and interactive brain mechanisms during emotion face perception: Evidence from functional neuroimaging. Neuropsychologia, 45, 174–194.
Whalen, P. J., Rauch, S. L., Etcoff, N. L., McInerney, S. C., Lee, M. B., & Jenike, M. A. (1998). Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. Journal of Neuroscience, 18, 411–418.
Williams, L. M., Das, P., Liddell, B. J., Olivieri, G., Peduto, A. S., David, A. S., et al. (2007). Fronto-limbic and autonomic disjunctions to negative emotion distinguish schizophrenia subtypes. Psychiatry Research – Neuroimaging, 155, 29–44.
World Health Organization (1994). International Classification of Diseases (Tenth Edition). Switzerland: World Health Organization Publishing.
Yang, H. H., Savostyanov, A. N., Tsai, A. C., & Liou, M. (2011). Face recognition in Asperger syndrome: A study on EEG spectral power changes. Neuroscience Letters, 492, 84–88.