Vision Research 119 (2016) 1–8
Visual working memory representations guide the detection of emotional faces: An ERP study

Lingxia Fan a, Cody Ding a,b, Renlu Guo c, Mengsi Xu a, Liuting Diao a, Dong Yang a,*

a Department of Psychology, Southwest University, Chongqing, China
b University of Missouri, St. Louis, USA
c Institute of Psychology and Behavior, WenZhou University, Wenzhou, China

Article info

Article history: Received 3 October 2014; Received in revised form 21 August 2015; Accepted 23 August 2015; Available online 18 January 2016.

Keywords: Working memory; Angry face; Visual search; N2pc

Abstract

We investigated the correlates of the influences exerted by visual working memory (VWM) on attentional selection of emotional faces using electrophysiological methods. Participants performed a search task to detect happy or angry faces among groups of neutral faces while simultaneously keeping in VWM a colour cue presented at the start of each trial. A visual working memory test was administered at the end of each trial to ensure that the cue had been maintained in VWM. Happy faces elicited a larger-amplitude N2pc ERP component when VWM features matched the target face (valid condition) and a smaller amplitude when VWM features matched a distractor face (invalid condition), compared with the neutral condition (wherein VWM features did not match any face in the search array). Additionally, angry faces elicited a greater N2pc amplitude in valid trials than in neutral and invalid trials. Although VWM could guide the attentional deployment of angry and happy faces, the guidance was subject to an anger superiority effect.

© 2015 Elsevier Ltd. All rights reserved.

1. Introduction

* Corresponding author at: School of Psychology, Southwest University, Beibei, Chongqing 400715, China. E-mail address: [email protected] (D. Yang). http://dx.doi.org/10.1016/j.visres.2015.08.021

Throughout our daily lives, we are surrounded by far more stimuli than can be attended to and processed further. To perform effectively, attention must be directed to the most relevant or valuable objects so that they receive further processing, a process termed attentional selection (Awh, Belopolsky, & Theeuwes, 2012; Shomstein, Lee, & Behrmann, 2010). Researchers have proposed two mechanisms to explain attentional selection: a stimulus-driven (or bottom-up) mechanism (Buschman & Miller, 2007; Connor, Egeth, & Yantis, 2004), whereby attention is attracted automatically by highly salient objects in the environment, such as suddenly appearing people or items with a unique colour; and a goal-directed (or top-down) mechanism (Anderson, 2013; Lien, Ruthruff, & Johnston, 2010), whereby attention is directed voluntarily according to task goals or internal representations, as when searching for a specific face in a crowd (Theeuwes, 2010). The guidance of visual attention by representations in visual working memory shares properties of top-down control (Awh, Vogel, & Oh, 2006; Huang & Pashler, 2007; Hutchinson & Turk-Browne, 2012; Soto, Heinke, Humphreys, & Blanco, 2005). The biased competition model predicts that features held in VWM are more easily attended to than features not held therein (Desimone & Duncan, 1995). An increasing body of evidence has shown that representations held in VWM can guide attention, biasing it towards memory-matching items during attentional selection (Carlisle & Woodman, 2011; Chen & Tsou, 2011; Soto, Hodsoll, Rotshtein, & Humphreys, 2008). For instance, Downing (2000) used a dual-task paradigm (a memory task and a dot-probe task) to investigate the interaction between VWM and selective attention. He found that search performance was better when the target appeared at the same location as items previously maintained in visual working memory. Further, Soto, Humphreys, and Heinke (2006) asked participants to respond to a tilted line target among three vertical lines while memorising a coloured shape at the beginning of each trial; each line in the search display was contained within a coloured shape. They found that performance was impeded when the coloured shape in VWM matched one of the shapes surrounding the distractor lines, but search was faster when the content of VWM reappeared as the shape surrounding the target line. All of the research mentioned supports the view that VWM contents can guide attentional selection of objects. However, few studies have explored how VWM may guide attention for special stimuli that may have processing advantages, such as


emotional faces (Grecucci, Soto, Rumiati, Humphreys, & Rotshtein, 2009). The detection advantage of angry faces has been demonstrated by many studies using visual search tasks containing facial expressions (Pinkham, Baron, Sasson, & Gur, 2010). For instance, Hansen and Hansen (1988) conducted the first study using photographs of real human faces and reported that people responded faster and more accurately to angry faces than to happy faces, a finding termed the 'anger superiority effect'. However, Hansen and Hansen's study was limited by potential visual confounds, such as hair style, skin colour, and visibility of teeth (Calvo & Nummenmaa, 2008; Purcell, Stewart, & Skov, 1996). Later studies used computer-drawn schematic faces instead of photographs, a reasonable way to eliminate interference from irrelevant low-level perceptual confounds (Öhman, Lundqvist, & Esteves, 2001). However, studies using schematic faces rather than human faces inevitably have lower ecological validity. Some investigations using schematic faces have also identified low-level features, rather than emotion, as an important factor underlying the face-in-the-crowd effect (Coelho, Cloete, & Wallis, 2011; Kennett & Wallis, 2013). Nevertheless, behavioural and event-related potential (ERP) experiments have provided evidence supporting the effectiveness of schematic faces in eliciting the face-in-the-crowd effect (Ashwin et al., 2012; Dickins & Lipp, 2014; Krombholz, Schaefer, & Boucsein, 2007). Dickins and Lipp (2014) examined the low-level visual feature account of the anger superiority effect and found that the detection advantage disappeared in displays containing only the critical elements of schematic faces. These elements consisted of inward-tilted brows and an upturned mouth, which suggests that inward-pointing lines are not sufficient to explain the detection advantage of angry schematic faces.
In addition, relatively early face-specific electrophysiological responses, such as the P1, N170, and EPN, have been observed to be preattentively modulated by schematic facial expressions, demonstrating that schematic faces do induce emotional processing (Lyyra, Hietanen, & Astikainen, 2014). From an evolutionary perspective, the preferential processing of angry faces may be automatic and not easily changed (Horstmann, 2007; Öhman, Flykt, & Esteves, 2001). However, some recent studies have demonstrated top-down effects on emotional processing, indicating that the detection advantage of angry faces may not be an automatic process (Van Dillen & Derks, 2012; Yao, Ding, Qi, & Yang, 2014). For example, the anger superiority effect was eliminated when a high reward was given for responding to happy faces and a low reward for angry faces (Yao et al., 2014). Therefore, it remains unclear whether the preferential processing of angry faces is automatic and fixed or subject to top-down modulation. Moriya, Koster, and De Raedt (2014) used photographs to probe the effect of VWM content on the detection advantage of angry faces. They asked participants first to remember a colour and then to detect a discrepant emotional face in a search display of variously coloured facial expressions. When the probability of matching trials was increased, participants tended to direct attention to the face whose colour matched the one held in visual working memory, regardless of facial expression. This suggests that VWM contents can suppress the anger superiority effect in a top-down manner. However, the anger superiority effect in that experiment was measured as the difference in RTs between angry and happy targets, with little consideration of the influence of VWM content on happy and angry faces individually. In addition, behavioural data can be insensitive to rapid attentional shifts.
Therefore, the present study investigated the guidance effect of VWM in the processing of emotional information using ERP technology, which has high temporal resolution.

Here, we used schematic faces to prevent perceptual confounds and took the ERP component N2pc as an electrophysiological indicator to explore whether and how the attentional selection of emotional faces is guided by visual working memory representations. The N2pc is sensitive to the deployment of spatial selective attention (Eimer, 1996; Kiss, Van Velzen, & Eimer, 2008). It is typically elicited at posterior electrodes contralateral to attended items between 180 and 320 ms after stimulus onset. The component reflects an enhanced negativity, and its amplitude indexes the extent to which attentional resources are allocated to task-relevant items (Luck & Hillyard, 1994; Qi, Zeng, Ding, & Li, 2013). Previous studies have found that a larger N2pc amplitude for angry versus happy faces reflects the allocation of more attention to the former (Weymar, Loew, Ohman, & Hamm, 2011; Yao et al., 2014).

2. Experiment 1

Experiment 1 used the same paradigm as previous studies that have examined VWM guidance of attention (Carlisle & Woodman, 2011; Moriya et al., 2014). We tested the guidance effect of VWM in the detection of emotional faces by manipulating the validity of VWM cues, and investigated whether the N2pc component elicited by target faces varied according to the validity of the VWM cues. We hypothesised that, relative to a neutral baseline, search would be more efficient in valid trials and less efficient in invalid trials, regardless of facial emotion, with the N2pc component varying according to the validity of the VWM cues. Furthermore, we hypothesised that angry faces would elicit a larger N2pc than happy faces, and that this effect might be modulated by the top-down influence of VWM.

2.1. Methods

2.1.1. Participants

Seventeen neurologically unimpaired undergraduate students (seven men and ten women, aged 18–24 years, mean age = 21.1 years) were paid to participate. All experimental procedures were in accordance with the Code of Ethics of the World Medical Association (Declaration of Helsinki).

2.1.2. Apparatus and stimuli

We used coloured squares subtending 0.9° of visual angle as VWM stimuli. Each square was drawn in one of five easily discriminable colours: red, blue, yellow, green, and purple (Carlisle & Woodman, 2011). The colours were matched in lightness and saturation. We used schematic faces as search items to control for idiosyncrasies in the physical features of real faces. The schematic angry, happy, and neutral faces were drawn in black on a white background (Öhman et al., 2001). All faces were displayed within coloured boxes and subtended 4.0° × 4.0° in the visual search task. Stimulus arrays were displayed on a 19-inch Dell monitor (Dell, Inc., Round Rock, Texas) with a 60 Hz refresh rate. Participants responded by pressing one of four designated buttons on a standard keyboard. Stimulus presentation and behavioural data collection were controlled with E-Prime software (Psychology Software Tools, Inc., Sharpsburg, PA).

2.1.3. Design and procedure

The sequence of a single trial for the three possible conditions is depicted in Fig. 1. At the start of each trial, a black fixation cross was shown for 1000 ms in the centre of the screen. Then, a coloured 1.7° × 1.7° shape was displayed centrally for 500 ms.


Fig. 1. Samples of stimulus displays in the three conditions of Experiment 1. (A) matching-target trials (valid trials), (B) non-matching trials (neutral trials), (C) matching-distractor trials (invalid trials).

Participants were required to memorise the colour of the shape for later recall. After an 800–1000 ms blank interval, the search array, consisting of four coloured boxes each containing a face, was presented for 3000 ms; only one face was emotional (angry or happy) and the other three were neutral. In half of the trials, the colour held in VWM matched the colour of the box surrounding either the target face (valid trials) or a distractor face (invalid trials). In the remaining trials, the colour held in VWM was absent from the search array (neutral trials). Participants were instructed to decide quickly, pressing 'C' if the unique emotional face in the search array was happy and 'V' if it was angry. A test array was then presented for 2000 ms; the test colour was equally likely to be identical to or different from the memorised colour. Participants pressed '1' if the colour was identical and '2' if it was different. All conditions were randomised across the experimental session. The formal experiment began after 20 practice trials. There were 600 experimental trials in total, organised as 10 blocks of 60 trials. The validity of the VWM cues (valid, neutral, invalid) and the target face expression (angry, happy) were manipulated factorially within participants. In valid trials, the colour held in VWM reappeared as the box surrounding the target face (Fig. 1A). In invalid trials, the colour held in VWM matched the colour of a box containing a distractor face (Fig. 1B). In the additional neutral condition, no colour in the search array matched the VWM cue (Fig. 1C).

2.1.4. ERP recording and analysis

Brain electrical activity was recorded at 64 scalp sites using tin electrodes mounted in an elastic cap (Brain Products, Munich, Germany). The left and right mastoids served as the reference, and a ground electrode was placed on the medial frontal aspect.
The vertical electrooculogram (EOG) was recorded supra- and infra-orbitally at the right eye, and horizontal eye movements were monitored by an electrode to the left of the right orbital rim. Impedance for all electrodes was kept below 5 kΩ. The electroencephalogram (EEG) and EOG were amplified with a 0.05–100 Hz bandpass and continuously digitised at 500 Hz per channel. Offline, the data were band-pass filtered with low and high cutoffs of 0.1 and 30 Hz. Eye-movement artefacts

(blinking and eye movements) were identified offline. ERP data from incorrect trials and from trials with artefacts or excessive eye movements were rejected. The analysis epoch was 500 ms: EEG signals were segmented from 100 ms pre-stimulus to 400 ms post-stimulus. Grand averages were calculated for each experimental condition. The ipsilateral waveform was computed as the average response of the left- and right-sided electrodes to left- and right-sided targets, respectively, whereas the contralateral waveform was computed as the average response of the left- and right-sided electrodes to right- and left-sided targets, respectively. The analyses of N2pc amplitude focused mainly on the lateral posterior electrodes PO7 and PO8 (Brisson, Robitaille, & Jolicœur, 2007; McCollough, Machizawa, & Vogel, 2007). The N2pc was quantified as the maximal negative amplitude observed in the 240–320 ms time window. Greenhouse–Geisser adjustments to the degrees of freedom were applied where appropriate.

2.2. Results

2.2.1. Behavioural results

Only correct responses were included in the analysis, and trials with RTs exceeding three SDs from the overall mean for each participant (<3% of trials) were rejected. The mean RTs were subjected to a 2 (emotional face: angry, happy) × 3 (cue validity: valid, neutral, invalid) repeated-measures ANOVA. The main effect of face type was significant (F(1, 16) = 22.67, p < .01, partial η² = .59): detection of happy faces (1241 ms) took longer than detection of angry faces (1164 ms), indicating an anger superiority effect. A significant main effect was also observed for the validity of VWM cues (F(2, 15) = 11.95, p < .01, partial η² = .62). Compared with the neutral condition (1195 ms), RTs in valid conditions (1155 ms) were faster and RTs in invalid conditions (1258 ms) were slower.
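The behavioural preprocessing just described — per-participant rejection of RTs beyond three SDs of the overall mean, followed by computation of the anger superiority index (mean happy-target RT minus mean angry-target RT) — can be sketched as follows. This is a minimal illustration with made-up RT values, not the reported data; details such as whether the population or sample SD was used are assumptions, since the paper does not specify them.

```python
import statistics

def trim_rts(rts, n_sd=3.0):
    """Drop RTs farther than n_sd standard deviations from a
    participant's overall mean (population SD assumed here)."""
    mean = statistics.mean(rts)
    sd = statistics.pstdev(rts)
    return [rt for rt in rts if abs(rt - mean) <= n_sd * sd]

def anger_superiority(rt_happy, rt_angry):
    """Anger superiority index: mean happy-target RT minus mean
    angry-target RT; positive values indicate an angry-face advantage."""
    return statistics.mean(rt_happy) - statistics.mean(rt_angry)

# Hypothetical per-condition RTs (ms), not the reported data:
happy = [1250, 1230, 1260, 1240]
angry = [1160, 1170, 1150, 1160]
effect = anger_superiority(happy, angry)  # 85 ms: an anger advantage
```

In the actual analysis this index was computed per participant and per cue-validity condition before the paired comparisons.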
Further, the interaction between face type and VWM cue validity was also significant (F(2, 15) = 4.41, p < .05, partial η² = .37). Simple effects tests showed that RTs in invalid trials were significantly longer than those in neutral trials, regardless of whether the target face was angry or happy (p < .05). However, responses in valid trials were faster (1175 ms)


than in neutral trials (1245 ms), but only for happy faces (p < .05; Fig. 2). In addition, RTs for the detection of angry faces were notably shorter than those for happy faces in all three VWM cue conditions, indicating an anger superiority effect under all cue conditions. We also calculated the anger superiority effect in valid, neutral, and invalid trials by subtracting RTs for angry targets from RTs for happy targets, and compared the effect across the three cue conditions. The anger superiority effect in valid conditions (40.95 ms) was significantly smaller than in neutral conditions (100.43 ms; t(16) = 2.78, p = .01). The effect in invalid conditions (90.09 ms) was also smaller than in neutral conditions, but this difference only approached significance (t(16) = 1.84, p = .08).

2.2.2. ERP results

Fig. 3 shows the ERPs elicited by happy and angry faces in all conditions at the PO7 and PO8 electrodes. The ERP amplitudes were analysed with a 2 (laterality: contralateral, ipsilateral to the target) × 2 (face type: happy, angry) × 3 (validity of VWM cue: valid, neutral, invalid) repeated-measures ANOVA for the 240–320 ms (late N2pc) analysis window. A significant main effect of laterality was found (F(1, 16) = 24.74, p < .001, partial η² = .61), reflecting the emergence of the N2pc. A reliable interaction between laterality and face type was found (F(1, 16) = 10.87, p < .01, partial η² = .40), demonstrating that angry faces elicited a larger N2pc than happy faces in visual search. Simple effects analysis further confirmed a reliable N2pc amplitude (Md = Mcontra − Mipsi) in response to angry faces (Md = −0.992 µV; p < .001) and happy faces (Md = −0.393 µV; p < .01). Importantly, there was also a significant interaction between laterality and validity of the VWM cue (F(2, 15) = 4.19, p = .036, partial η² = .35), indicating that the amplitude of the N2pc varied according to cue validity.
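The N2pc measure underlying these analyses — the contralateral-minus-ipsilateral difference at PO7/PO8, with the most negative value taken in the 240–320 ms window — can be sketched as below. Epochs are represented as plain dicts of per-sample voltages, and the sampling parameters (500 Hz, 100 ms pre-stimulus baseline) follow the recording description; the data structure itself is an assumption for illustration, not the authors' pipeline.

```python
def n2pc_peak(epochs, win_ms=(240, 320), srate=500, baseline_ms=100):
    """Peak N2pc: most negative contra-minus-ipsi value (Md = Mcontra - Mipsi)
    in the window, computed on waveforms averaged across epochs.

    PO7 (left) is contralateral to right-sided targets and PO8 (right)
    to left-sided targets; the opposite pairing gives the ipsilateral wave.
    """
    n = len(epochs[0]["PO7"])
    contra = [0.0] * n
    ipsi = [0.0] * n
    for ep in epochs:
        if ep["target_side"] == "left":
            co, ip = ep["PO8"], ep["PO7"]
        else:
            co, ip = ep["PO7"], ep["PO8"]
        for k in range(n):
            contra[k] += co[k] / len(epochs)
            ipsi[k] += ip[k] / len(epochs)
    step = 1000 // srate                      # 2 ms per sample at 500 Hz
    i0 = (win_ms[0] + baseline_ms) // step    # first sample in the window
    i1 = (win_ms[1] + baseline_ms) // step    # last sample in the window
    return min(contra[k] - ipsi[k] for k in range(i0, i1 + 1))

# Hypothetical single epoch (251 samples, -100 to 400 ms): flat signals
# except a -3 µV deflection at 300 ms on the contralateral channel.
_po8 = [0.0] * 251
_po8[200] = -3.0
demo = n2pc_peak([{"target_side": "left", "PO7": [0.0] * 251, "PO8": _po8}])
```

A negative return value corresponds to the enhanced contralateral negativity that defines the component.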
Simple effects tests revealed that the N2pc elicited by angry faces was reliable in the neutral and valid conditions, as indicated by significantly more negative signals at contralateral than at ipsilateral electrode sites in the 240–320 ms interval. In the invalid condition, however, the contralateral-ipsilateral amplitude difference for angry faces only approached significance (p = .055). In addition, when the target was a happy face, the N2pc was more negative at contralateral than at ipsilateral sites in the neutral and valid conditions, but not in the invalid condition. We also analysed the N2pc amplitude (Md = Mcontra − Mipsi) using a two-factor repeated-measures ANOVA with cue validity

and face type as factors. We found a reliable main effect of face type (F(1, 16) = 11.89, p = .003, partial η² = .43), reflecting a stronger N2pc for angry faces. Crucially, the interaction between the factors was not significant, suggesting that the N2pc amplitude difference between angry and happy faces was similar regardless of cue validity.

2.3. Discussion

Two main conclusions can be drawn from Experiment 1. First, we found a guidance effect of VWM on visual search for both angry and happy faces. Emotional target faces, whether angry or happy, attracted more attentional resources when the colour in VWM matched the colour surrounding the target face, as indicated by larger N2pc amplitudes. When the colour in VWM matched the colour surrounding a distractor face, the matched colour cue impeded visual search for happy faces but not for angry faces. The detection of angry faces appeared to be influenced by VWM content in the behavioural data; however, similar N2pc amplitudes in the invalid and neutral conditions indicated the same degree of attentional allocation to these threatening faces. Second, and contrary to our expectations, the anger superiority effect was not eliminated under the guidance of VWM, as indicated by the larger N2pc elicited by angry faces than by happy faces in all three cue conditions.

3. Experiment 2: control experiment

Several memory systems can guide attention, for instance VWM and priming (Hutchinson & Turk-Browne, 2012). Can the results of Experiment 1 therefore be attributed entirely to visual working memory representations? Items that merely appeared previously, without being memorised, might also influence subsequent attentional processing in visual search, so priming and VWM effects may be confounded in Experiment 1. Previous research has considered it necessary to distinguish the effects of VWM and priming on attentional selection (Downing, 2000).
Therefore, we conducted Experiment 2 as a control. Here, participants only needed to attend to the coloured boxes that had served as VWM cues in Experiment 1; no memorisation was required.

3.1. Method

3.1.1. Participants

Sixteen new participants (seven men and ten women, aged 18–24 years, mean age = 21.1 years) were recruited and received monetary compensation after the experiment. All experimental procedures were in accordance with the Code of Ethics of the World Medical Association (Declaration of Helsinki).

Fig. 2. Mean RTs for the detection of angry and happy faces in invalid, neutral, and valid trials in Experiment 1. Error bars represent standard errors.

3.1.2. Procedure

The procedure was similar to that of Experiment 1, except that participants only needed to attend to the cue boxes presented at the start of each trial rather than remember them. These items were presented for 500 ms. Then, after a fixation cross appeared for 800–1000 ms, the search array was displayed for 3000 ms. Participants responded quickly to the unique emotional face by pressing 'C' for happy faces and 'V' for angry faces. Each trial ended with a 2000 ms blank interval before the next trial began. Twenty practice trials were completed before the experiment proper. Each condition comprised 100 trials, for a total of 600 trials per participant.
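The factorial design shared by both experiments (2 target expressions × 3 cue validities, 100 trials per cell, randomised and split into blocks) can be generated with a sketch like the following. The seeding and block logic are illustrative assumptions, not the authors' E-Prime implementation.

```python
import random

def build_trials(trials_per_cell=100, block_size=60, seed=1):
    """Generate a randomised 2 (face) x 3 (cue validity) within-subject
    design: 600 trials split into blocks (10 blocks of 60 in Experiment 1)."""
    conditions = [(face, validity)
                  for face in ("angry", "happy")
                  for validity in ("valid", "neutral", "invalid")]
    trials = conditions * trials_per_cell          # 6 cells x 100 = 600 trials
    random.Random(seed).shuffle(trials)            # seeded for reproducibility
    return [trials[i:i + block_size]
            for i in range(0, len(trials), block_size)]

blocks = build_trials()
```

Each of the six face-by-validity cells appears exactly 100 times across the session, matching the trial counts reported for both experiments.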


Fig. 3. Grand-averaged ERPs at posterior electrodes PO7/8 contralateral and ipsilateral to the location of the target in three experimental conditions. (A) N2pc elicited by angry faces in invalid trials. (B) N2pc elicited by angry faces in neutral trials. (C) N2pc elicited by angry faces in valid trials. (D) N2pc elicited by happy faces in invalid trials. (E) N2pc elicited by happy faces in neutral trials. (F) N2pc elicited by happy faces in valid trials.

3.2. Results

3.2.1. Behavioural results

Error trials and trials with RTs more than ±3 SDs from each subject's overall mean RT (<2.5% of trials) were discarded. A repeated-measures ANOVA was conducted on RTs with the factors validity of priming cue (invalid, neutral, valid) × face type (angry, happy). A reliable main effect of face type was found (F(1, 15) = 40.48, p < .01, partial η² = .73): RTs for the detection of angry faces (1054 ms) were significantly faster than those for happy faces (1139 ms, p < .001), indicating an anger superiority effect (Fig. 4). No other effects were significant, indicating that detection of happy and angry faces was largely independent of cue validity. Furthermore, the anger superiority effect in valid, neutral, and invalid trials was computed in each case by subtracting RTs for angry targets from RTs for happy targets, and compared across the three cue conditions. There was no significant difference in the anger superiority effect between the valid (73.84 ms), invalid (82.12 ms), and neutral (93.22 ms) conditions, suggesting that priming cues did not guide the search advantage for angry faces.

3.2.2. ERP results

Fig. 5 shows the ERPs elicited by happy and angry faces in all perceptual cue conditions at the PO7 and PO8 electrodes. A repeated-measures ANOVA was performed on N2pc values with the factors laterality (contralateral, ipsilateral to the target), face type (angry, happy), and validity of perceptual cues (valid, neutral, invalid). There was a reliable main effect of laterality (F(1, 16) = 8.03, p < .05, partial η² = .33), indicating the presence of an N2pc induced by search targets. The interaction between laterality and face type was significant (F(1, 16) = 6.27, p < .05, partial η² = .28), demonstrating that angry

Fig. 4. Mean RTs for the detection of angry and happy faces in invalid, neutral, and valid trials in Experiment 2. Error bars represent standard errors.

faces elicited a larger N2pc than happy faces in visual search. No other effects were significant. A repeated-measures ANOVA on N2pc amplitude (Md = Mcontra − Mipsi) with cue validity and face type as factors likewise yielded only a significant main effect of face type (F(1, 16) = 6.27, p < .05, partial η² = .28).

3.3. Discussion

This experiment shows that search efficiency for angry and happy target faces remained stable across the three perceptual cue conditions, as did the anger superiority effect.


Fig. 5. Grand-averaged ERPs at posterior electrodes PO7/8 contralateral and ipsilateral to the location of the target in three experimental conditions. (A1) N2pc elicited by angry faces in invalid cue trials. (B1) N2pc elicited by angry faces in neutral cue trials. (C1) N2pc elicited by angry faces in valid cue trials. (D1) N2pc elicited by happy faces in invalid cue trials. (E1) N2pc elicited by happy faces in neutral cue trials. (F1) N2pc elicited by happy faces in valid cue trials.

The N2pc elicited by angry faces was always larger than that elicited by happy faces, and the amplitude of the N2pc triggered by face targets was not modulated by the validity of the perceptual cues. These results therefore support the findings of Experiment 1: the guidance effect of VWM on search for emotional faces cannot be explained by mere exposure to the cues.

4. General discussion

We used ERPs to investigate the guidance effect of VWM on the detection of emotional faces and on the anger superiority effect by manipulating the validity of VWM cues. Behavioural and ERP results revealed a guidance effect of VWM on attention allocation for both angry and happy faces, although there was a processing advantage for angry faces. We also found that VWM representations modulated the anger superiority effect to a certain extent. The control experiment ruled out the possibility that a priming effect alone was sufficient to direct attention toward faces associated with the cued colour (Downing, 2000). As such, the effects observed in our experiments can be attributed to VWM representations rather than perceptual priming. Our experiments thus provide evidence that the detection of emotional faces is susceptible to VWM effects, but that VWM representations are not powerful enough to abolish the attentional advantage for angry faces. For the detection of happy faces, performance was markedly impeded when the colour in VWM matched the colour surrounding one of the distractor faces (invalid condition) and facilitated when the colour surrounding the target face matched the colour of the object in VWM (valid condition). The ERPs accorded with the behavioural results, as reflected by the N2pc amplitudes in the three conditions. An N2pc elicited by happy faces was found when the VWM colour was absent from the search array (neutral condition), indicating attention allocation to the target

faces. When the VWM colour matched the colour of the box surrounding the target face (valid condition), the N2pc amplitude was larger, suggesting that more attentional resources were deployed toward target items. However, the N2pc elicited by happy faces disappeared when the VWM cue colour matched one of the distractor items, which may be explained by colours matching VWM content capturing attention early in search processing. This is consistent with a previous study, which found stronger N2pc amplitudes induced by the target when VWM items reappeared on the same side as the target and decreased N2pc amplitudes when the items reappeared on the contralateral side (Telling, Kumar, Meyer, & Humphreys, 2010). The modulation of N2pc amplitude by VWM provides further evidence for the guiding role of VWM in detecting happy faces. For the detection of angry faces, however, the behavioural results showed some discrepancies with the ERP data. As may be seen from the RTs, responses to angry faces were slower when the VWM content was associated with distractor faces (invalid condition) than in the neutral condition. Nevertheless, we still observed large N2pc amplitudes elicited by angry faces in invalid trials, similar to those observed in neutral trials; that is, the N2pc did not reflect the slowing of responses in the invalid condition. We assume that although the reappearance of VWM content on a distractor interrupted search for the target face, angry faces nonetheless attracted attention automatically (Maratos, Mogg, & Bradley, 2008). Interestingly, RTs to angry faces in valid trials were similar to those in neutral trials, yet the larger N2pc amplitude for angry faces in the valid cue condition suggests that angry face targets attracted more attention in visual search.
A previous study that used the N2pc to investigate the VWM guidance effect also found a stronger N2pc when an object in VWM reappeared in the same visual field as the target, and a smaller N2pc amplitude when the object in VWM reappeared contralateral to the target (Kumar, Soto, &


Humphreys, 2009). A possible reason for this is that both angry faces and matched VWM features captured attention, so more attentional resources were allocated to their conjunction. Regarding the anger superiority effect, VWM content modulated responses to angry and happy faces but did not overcome the effect: the anger superiority effect remained in the valid, neutral, and invalid conditions. However, the effect varied among the three conditions, decreasing in the valid condition relative to the invalid and neutral conditions. These results suggest that the processing advantage of angry faces is robust and not easily overcome, while the decrease in the anger superiority effect in valid trials provides evidence that the effect may be subject to top-down influence via VWM. Moriya et al. (2014) obtained the same result, namely that VWM guidance is inefficient in eliminating the anger superiority effect unless the top-down influence of VWM is enhanced by increasing the probability of valid trials. Biased competition theory proposes that maintaining a representation in working memory increases the activity of the cells in the visual system that code for the features of the template (Desimone & Duncan, 1995). Objects in the search display compete for limited attentional resources during processing, and the extent to which an object is attended depends largely on top-down biasing mechanisms, which increase the probability that stimuli matching the content of VWM win the competition among neural representations (Moriya et al., 2014). Therefore, more attentional resources were allocated to happy and angry faces in matching-target cue trials. In the matching-distractor cue condition, the target face and memory-matching features compete for limited attentional resources, leaving less selective attention available for the target face.
However, threatening stimuli such as angry faces activate neural representations in a bottom-up fashion (Lindquist, Wager, Kober, et al., 2012), and the VWM effect did not overcome this bottom-up emotional processing (Moriya et al., 2014). Thus, angry faces captured attention even when the VWM features reappeared together with a distractor, as indicated by the similar N2pc amplitudes in the neutral and invalid cue conditions.

Previous studies provide evidence that the contents of VWM automatically guide attentional selection (Soto et al., 2005, 2008). In our study, holding a colour in VWM produced a guidance effect on the search for happy and angry faces even when the colour was irrelevant to the search task, consistent with a powerful and automatic VWM guidance effect. The present study therefore extends previous work on the VWM effect (Moriya et al., 2014) by using ERP methodology, with its high temporal resolution, to verify the guidance effect of VWM even for complex stimuli. We also suggest that top-down control from visual working memory interacts with the bottom-up processing of threatening faces.

Some limitations of this study should be acknowledged. First, previous studies have demonstrated that contralateral delay activity (CDA) is an effective measure of the status of the VWM representations used to guide attention (Woodman, Carlisle, & Reinhart, 2013); however, we did not record the CDA. Further research using this index to ensure that cue objects are maintained in VWM would be valuable for investigating the VWM guidance effect for emotional faces. Second, one study has shown that the anger superiority effect can be suppressed by enhanced top-down VWM control (Moriya et al., 2014); further research might usefully seek electrophysiological evidence of this phenomenon using the N2pc.
Third, although our results showed that the reappearance of VWM features captured attention in a search process, it is unclear whether attention is directed first toward the memory-matching feature or to the target object when the search items initially appear. This specific competition process for attentional resources needs further exploration.
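For readers less familiar with the component on which these conclusions rest: the N2pc is conventionally quantified as the difference between ERP amplitude at posterior electrodes contralateral versus ipsilateral to the attended item, typically at PO7/PO8 in roughly the 200–300 ms post-stimulus window. The following is a minimal illustrative sketch of that computation on simulated data; the electrode pair, time window, and array layout are generic conventions, not this study's exact analysis pipeline.

```python
import numpy as np

def n2pc_amplitude(epochs_left, epochs_right, times, window=(0.2, 0.3)):
    """Mean N2pc amplitude: contralateral minus ipsilateral.

    epochs_left / epochs_right: arrays of shape (n_trials, 2, n_samples)
    holding single-trial ERPs at [PO7, PO8] for trials in which the
    target appeared in the left or right hemifield, respectively.
    """
    mask = (times >= window[0]) & (times <= window[1])
    # Target in the left hemifield: PO8 (index 1) is contralateral.
    contra_left = epochs_left[:, 1, :][:, mask].mean()
    ipsi_left = epochs_left[:, 0, :][:, mask].mean()
    # Target in the right hemifield: PO7 (index 0) is contralateral.
    contra_right = epochs_right[:, 0, :][:, mask].mean()
    ipsi_right = epochs_right[:, 1, :][:, mask].mean()
    contra = (contra_left + contra_right) / 2
    ipsi = (ipsi_left + ipsi_right) / 2
    return contra - ipsi  # negative values indicate an N2pc

# Simulated example: 100 trials per hemifield, 500 samples, -0.1 to 0.4 s.
rng = np.random.default_rng(0)
times = np.linspace(-0.1, 0.4, 500)
left = rng.normal(0.0, 1.0, (100, 2, 500))
right = rng.normal(0.0, 1.0, (100, 2, 500))
# Inject a negative deflection at the contralateral site in the window.
win = (times >= 0.2) & (times <= 0.3)
left[:, 1, win] -= 2.0
right[:, 0, win] -= 2.0
print(n2pc_amplitude(left, right, times))  # ≈ -2 µV by construction
```

A larger (more negative) difference in one condition than another, as reported above for valid versus invalid trials, is then interpreted as stronger attentional allocation to the target side.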


5. Conclusion

The present study demonstrates that the content of VWM can guide the detection of angry or happy faces, as indicated by N2pc amplitudes. However, the anger superiority effect is not eliminated under the guidance of VWM representations. Findings from the present study indicate that top-down guidance of attention from VWM is effective when processing complex items that convey social information (i.e., faces), but is not powerful enough to modify the anger superiority effect.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (71472156) and the Humanities and Social Science Research Funds of the Ministry of Education (11XJC190003).

References

Anderson, B. A. (2013). A value-driven mechanism of attentional selection. Journal of Vision, 13(3), 1–16.
Ashwin, C., Holas, P., Broadhurst, S., Kokoszka, A., Georgiou, G. A., & Fox, E. (2012). Enhanced anger superiority effect in generalized anxiety disorder and panic disorder. Journal of Anxiety Disorders, 26(2), 329–336.
Awh, E., Belopolsky, A. V., & Theeuwes, J. (2012). Top-down versus bottom-up attentional control: A failed theoretical dichotomy. Trends in Cognitive Sciences, 16(8), 437–443.
Awh, E., Vogel, E. K., & Oh, S. H. (2006). Interactions between attention and working memory. Neuroscience, 139(1), 201–208.
Brisson, B., Robitaille, N., & Jolicœur, P. (2007). Stimulus intensity affects the latency but not the amplitude of the N2pc. NeuroReport, 18(15), 1627–1630.
Buschman, T. J., & Miller, E. K. (2007). Top-down versus bottom-up control of attention in the prefrontal and posterior parietal cortices. Science, 315(5820), 1860–1862.
Calvo, M. G., & Nummenmaa, L. (2008). Detection of emotional faces: Salient physical features guide effective visual search. Journal of Experimental Psychology: General, 137(3), 471–494.
Carlisle, N. B., & Woodman, G. F. (2011). Automatic and strategic effects in the guidance of attention by working memory representations. Acta Psychologica, 137(2), 217–225.
Chen, Z., & Tsou, B. H. (2011). Task-based working memory guidance of visual attention. Attention, Perception, & Psychophysics, 73(4), 1082–1095.
Coelho, C. M., Cloete, S., & Wallis, G. (2011). The face-in-the-crowd effect: When angry faces are just cross(es). Journal of Vision, 10(1), 7.1–7.14.
Connor, C. E., Egeth, H. E., & Yantis, S. (2004). Visual attention: Bottom-up versus top-down. Current Biology, 14(19), R850–R852.
Desimone, R., & Duncan, J. (1995). Neural mechanisms of selective visual attention. Annual Review of Neuroscience, 18(1), 193–222.
Dickins, D. S., & Lipp, O. V. (2014). Visual search for schematic emotional faces: Angry faces are more than crosses. Cognition and Emotion, 28(1), 98–114.
Downing, P. E. (2000). Interactions between visual working memory and selective attention. Psychological Science, 11(6), 467–473.
Eimer, M. (1996). The N2pc component as an indicator of attentional selectivity. Electroencephalography and Clinical Neurophysiology, 99(3), 225–234.
Grecucci, A., Soto, D., Rumiati, R. I., Humphreys, G. W., & Rotshtein, P. (2009). The interrelations between verbal working memory and visual selection of emotional faces. Journal of Cognitive Neuroscience, 22(6), 1189–1200.
Hansen, C. H., & Hansen, R. D. (1988). Finding the face in the crowd: An anger superiority effect. Journal of Personality and Social Psychology, 54(6), 917–924.
Horstmann, G. (2007). Preattentive face processing: What do visual search experiments with schematic faces tell us? Visual Cognition, 15(7), 799–833.
Huang, L., & Pashler, H. (2007). Working memory and the guidance of visual attention: Consonance-driven orienting. Psychonomic Bulletin & Review, 14(1), 148–153.
Hutchinson, J. B., & Turk-Browne, N. B. (2012). Memory-guided attention: Control from multiple memory systems. Trends in Cognitive Sciences, 16(12), 576–579.
Kennett, M., & Wallis, G. (2013). Investigating low-level explanations for the angry schematic-face search advantage. Journal of Vision, 13(9), 304.
Kiss, M., Van Velzen, J., & Eimer, M. (2008). The N2pc component and its links to attention shifts and spatially selective visual processing. Psychophysiology, 45(2), 240–249.
Krombholz, A., Schaefer, F., & Boucsein, W. (2007). Modification of N170 by different emotional expression of schematic faces. Biological Psychology, 76(3), 156–162.
Kumar, S., Soto, D., & Humphreys, G. W. (2009). Electrophysiological evidence for attentional guidance by the contents of working memory. European Journal of Neuroscience, 30(2), 307–317.
Lien, M. C., Ruthruff, E., & Johnston, J. C. (2010). Attentional capture with rapidly changing attentional control settings. Journal of Experimental Psychology: Human Perception and Performance, 36(1), 1–16.


Lindquist, K. A., Wager, T. D., Kober, H., et al. (2012). The brain basis of emotion: A meta-analytic review. Behavioral and Brain Sciences, 35(3), 121–143.
Luck, S. J., & Hillyard, S. A. (1994). Electrophysiological correlates of feature analysis during visual search. Psychophysiology, 31(3), 291–308.
Lyyra, P., Hietanen, J. K., & Astikainen, P. (2014). Anger superiority effect for change detection and change blindness. Consciousness and Cognition, 30, 1–12.
Maratos, F. A., Mogg, K., & Bradley, B. P. (2008). Identification of angry faces in the attentional blink. Cognition and Emotion, 22(7), 1340–1352.
McCollough, A. W., Machizawa, M. G., & Vogel, E. K. (2007). Electrophysiological measures of maintaining representations in visual working memory. Cortex, 43(1), 77–94.
Moriya, J., Koster, E. H. W., & De Raedt, R. (2014). The influence of working memory on the anger superiority effect. Cognition and Emotion, 28(8), 1449–1464.
Öhman, A., Flykt, A., & Esteves, F. (2001). Emotion drives attention: Detecting the snake in the grass. Journal of Experimental Psychology: General, 130(3), 466–478.
Öhman, A., Lundqvist, D., & Esteves, F. (2001). The face in the crowd revisited: A threat advantage with schematic stimuli. Journal of Personality and Social Psychology, 80(3), 381–396.
Pinkham, M. G., Baron, R., Sasson, N. J., & Gur, R. C. (2010). The face in the crowd effect: Anger superiority when using real faces and multiple identities. Emotion, 10(1), 141–146.
Purcell, D. G., Stewart, A. L., & Skov, R. B. (1996). It takes a confounded face to pop out of a crowd. Perception, 25, 1091–1120.
Qi, S., Zeng, Q., Ding, C., & Li, H. (2013). Neural correlates of reward-driven attentional capture in visual search. Brain Research, 1532, 32–43.

Shomstein, S., Lee, J., & Behrmann, M. (2010). Top-down and bottom-up attentional guidance: Investigating the role of the dorsal and ventral parietal cortices. Experimental Brain Research, 206(2), 197–208.
Soto, D., Heinke, D., Humphreys, G. W., & Blanco, M. J. (2005). Early, involuntary top-down guidance of attention from working memory. Journal of Experimental Psychology: Human Perception and Performance, 31(2), 248–261.
Soto, D., Hodsoll, J., Rotshtein, P., & Humphreys, G. W. (2008). Automatic guidance of attention from working memory. Trends in Cognitive Sciences, 12(9), 342–348.
Soto, D., Humphreys, G. W., & Heinke, D. (2006). Working memory can guide pop-out search. Vision Research, 46(6–7), 1010–1018.
Telling, A. L., Kumar, S., Meyer, A. S., & Humphreys, G. W. (2010). Electrophysiological evidence of semantic interference in visual search. Journal of Cognitive Neuroscience, 22(10), 2212–2225.
Theeuwes, J. (2010). Top-down and bottom-up control of visual selection. Acta Psychologica, 135(2), 77–99.
Van Dillen, L. F., & Derks, B. (2012). Working memory load reduces facilitated processing of threatening faces: An ERP study. Emotion, 12(6), 1340–1349.
Weymar, M., Loew, A., Ohman, A., & Hamm, A. O. (2011). The face is more than its parts – Brain dynamics of enhanced spatial attention to schematic threat. NeuroImage, 58(3), 946–954.
Woodman, G. F., Carlisle, N. B., & Reinhart, R. M. (2013). Where do we store the memory representations that guide attention? Journal of Vision, 13(3), 1–17.
Yao, S., Ding, C., Qi, S., & Yang, D. (2014). Value associations of emotional faces can modify the anger superiority effect: Behavioral and electrophysiological evidence. Social Cognitive and Affective Neuroscience, 9(6), 849–856.