Effect of emotions on temporal attention
Maruti V. Mishra1, Sonia B. Ray, Narayanan Srinivasan
Centre of Behavioural and Cognitive Sciences, University of Allahabad, Allahabad, Uttar Pradesh, India
1Corresponding author: Tel.: +91-532-2460738, e-mail address: [email protected]; [email protected]
Abstract
Emotions play a significant role in guiding everyday actions and strongly interact with attention. The processing of emotional information over time and the influence of attention on such processing have been studied through the phenomenon of attentional blink using rapid serial visual presentation (RSVP) tasks. This chapter discusses the interaction between temporal attention and the type of emotional information (words, scenes, and facial expressions) presented during or before the RSVP stream. The findings show that the affective content and the arousal value of emotional stimuli presented as the first target, the second target, or both affect the magnitude and the duration of the blink window. In addition, modulation of the emotional context or presentation of emotions in the RSVP stream as task-irrelevant distractors also influences the attentional blink. Further, this chapter discusses different models and theories of the attentional blink and attempts to explain the emotional effects. The chapter concludes with possible directions for future studies.
Keywords
Attentional blink, Emotion, Capacity limitations, Emotion-induced blindness, Attentional control, Perceptual episodes, Valence, Arousal
1 INTRODUCTION
As we open our eyes, it appears that a smooth and coherent visual world is readily available to us without much effort. However, behind this apparent ease of perception lies a dynamic interaction of cognitive processes that integrates information over time, giving rise to a coherent percept. The integration of information over the first half second is influenced by selective attention and cognitive control, as well as by the nature of the perceptual content (Breitmeyer and Öğmen, 2006; Dux and Marois, 2009; Martens and Wyble, 2010). Selective attention has been conceptualized to involve
selection of relevant information and/or ignoring the irrelevant information (Wright and Ward, 2008) and has been studied in terms of selection over space and time (Nobre and Coull, 2013; Posner, 1980; Raymond et al., 1992). Attentional selection over time has been studied extensively using multiple methods, including rapid serial visual presentation (RSVP) tasks. As the name suggests, RSVP paradigms typically involve the presentation of multiple stimuli, with a single target, two targets, or multiple targets embedded among distractors, presented in rapid succession for a brief time (15–100 ms), one after the other at the same spatial location (Broadbent and Broadbent, 1987; Eriksen and Spencer, 1969; Raymond et al., 1992; Reeves and Sperling, 1986). Initial studies requiring the report of two targets in an RSVP design (15 ms or 6–20 items/s) showed a reduction in the ability to identify the second target (T2) following correct identification of the first target (T1), when T2 was presented in close temporal proximity (100–400 ms, Lag 2–4) to the first target (Broadbent and Broadbent, 1987). This reduction in T2 performance has been termed the "attentional blink" (AB). Moreover, no reduction in T2 performance was observed when T2 immediately succeeded T1 without any intervening distractors, an effect termed "Lag 1 sparing." This paradigm and its variants have also been used to study the time course of information processing in terms of attentional dwell time (Duncan et al., 1994), as well as to investigate the attentional resources needed for a target to reach awareness (Anderson, 2005; Martens and Wyble, 2010).

The reduction in T2 performance following identification of T1 has been argued to be due to the nonavailability of attentional resources (Raymond et al., 1992) or capacity limitations (Chun and Potter, 1995). Initial explanations were based on a gating theory (inhibition model), wherein presentation of T1 triggers an attentional window and, to prevent feature confusions from upcoming items, a suppressive mechanism is initiated until T1 is identified (Raymond et al., 1992). However, later theories argued that an even stronger AB is observed when the features are similar (Chun and Potter, 1995). Following this, an influential two-stage central capacity-limited model (bottleneck theory) of AB was proposed (Chun and Potter, 1995). According to this theory, there are two stages of processing: the first stage corresponds to perceptual/semantic categorization (a temporary and fragile stage), and the second stage is a capacity-limited consolidation stage in working memory. During an RSVP task, all targets are encoded and processed up to stage 1. It is during stage 2, the consolidation of T1, that attentional resources are unavailable for T2 identification. The duration required for T1 consolidation is suggested to be around 200–500 ms; if T2 appears within this duration, it is not consolidated and hence cannot be consciously reported or represented.

Though highly influential, the capacity-limitation model failed to explain various novel observations in AB, such as the influence of task instructions (Jolicoeur, 1998), differences between whole and partial reports of the RSVP stream (Nieuwenstein and Potter, 2006; Potter et al., 2008), the influence of various types of distractors following T1 and T2, and the effects of cueing the temporal stream or the targets (Nieuwenhuis et al., 2005; Olivers and Meeter, 2008). Another prominent observation in AB tasks is the phenomenon of backward blink (de Jong and Martens, 2007).
Backward blink (also referred to as T1 cost) refers to cases where identification of T2 leads to a reduction in T1 performance. While backward blink has not been emphasized in the literature, a few recent studies have started highlighting its role in postresponse-related effects during reporting (Ní Choisdealbha et al., 2017; Schönenberg and Abdelrahman, 2013). In order to account for such differences, various models have proposed modifications of the central capacity limitation-based approaches (Chun and Potter, 1995; for review, see Dux and Marois, 2009; Martens and Wyble, 2010), while others have argued for the role of attentional episodes and cognitive control (Wyble et al., 2009) or perceptual episodes (see chapter "Perceptual episodes, temporal attention, and the role of cognitive control: Lessons from the attentional blink" by Snir and Yeshurun). In this chapter, we primarily focus on the effects of emotional content on AB and their implications for theories of AB.
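For concreteness, the blink in such tasks is typically quantified from lag-wise conditional accuracies: T2 report accuracy is computed only on trials in which T1 was reported correctly (T2|T1), and AB magnitude is taken as the drop in T2|T1 accuracy at short lags relative to long lags. The following minimal sketch (plain Python, with an illustrative trial format and made-up accuracy values that are not taken from any particular study) shows one way such an analysis could look.

    from collections import defaultdict

    def blink_curve(trials):
        # trials: list of dicts with keys 'lag', 't1_correct', 't2_correct' (illustrative format)
        hits = defaultdict(int)
        counts = defaultdict(int)
        for t in trials:
            if t["t1_correct"]:                 # condition T2 accuracy on a correct T1 report (T2|T1)
                counts[t["lag"]] += 1
                hits[t["lag"]] += int(t["t2_correct"])
        return {lag: hits[lag] / counts[lag] for lag in sorted(counts)}

    # Made-up accuracies illustrating the typical pattern: a dip at short lags
    # (the blink) and recovery at long lags.
    example = (
        [{"lag": 1, "t1_correct": True, "t2_correct": i < 85} for i in range(100)]
        + [{"lag": 3, "t1_correct": True, "t2_correct": i < 45} for i in range(100)]
        + [{"lag": 8, "t1_correct": True, "t2_correct": i < 90} for i in range(100)]
    )
    curve = blink_curve(example)
    ab_magnitude = curve[8] - curve[3]   # long-lag minus short-lag T2|T1 accuracy
    print(curve, ab_magnitude)

A backward blink (T1 cost) would be assessed analogously, by computing T1 accuracy conditional on a correct T2 report as a function of lag.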
2 EMOTIONS AND AB
Emotions play an important role in guiding social communication and behavior. In general, emotions are elicited by an external (environmental) or an internal state and influence our behavior by modulating decisions and actions. The literature suggests that emotional expressions and emotional scenes form a special class of visual stimuli that are prioritized for perceptual processing over nonemotional stimuli and subsequently influence various cognitive processes such as perception, memory, and decision-making (Adolphs, 2002; Brosch et al., 2013; Engelmann and Pessoa, 2014; Öhman et al., 2001). Such influences of emotion are known to occur even for briefly presented emotional stimuli (Bocanegra and Zeelenberg, 2011; Phelps et al., 2006).

Currently, two prominent theoretical accounts guide emotion–cognition research: the categorical and the dimensional accounts (Ekman et al., 1972; Posner et al., 2005; Young et al., 1997). The categorical account emphasizes that basic emotions are universal and fall into discrete categories such as happy, sad, angry, fear, surprise, disgust, and neutral, while the dimensional account suggests that all emotion categories are characterized by two fundamental dimensions, valence and activation (arousal), that together constitute an emotional space. Valence refers to the degree of pleasantness or unpleasantness (positive to negative), while activation (or arousal) refers to the intensity of the affect (calm to intense).

Studies investigating the effects of emotional content on AB have been conducted using different kinds of emotional stimuli or contexts (Anderson, 2005; McHugo et al., 2013; Olivers and Nieuwenhuis, 2006). In these studies, the emotional stimuli used are emotionally loaded words (e.g., "happy," "angry," "flowers," "snake"), scenes containing emotional content (children playing, a gunshot wound, etc.), or schematic and real faces depicting different emotional expressions. Moreover, the valence and arousal components of emotion have been shown to play distinct roles during AB. The dependence of AB on valence and/or arousal has been dichotomized in support of two alternative accounts of emotion processing, namely, the "anger superiority" and "happy superiority" effects. The former emphasizes that negative valence (angry, fearful, and sad) stimuli are processed faster than positive valence (happy)
stimuli (Hansen and Hansen, 1988; Horstmann and Bauland, 2006; Öhman et al., 2001). The latter proposes just the opposite, i.e., that positive valence stimuli are processed faster than negative valence stimuli (Becker et al., 2011; Juth et al., 2005; Mack et al., 2002).

Another important aspect is the reciprocal interaction between the scope of attention (the width of the attentional window) and emotion processing. Many studies suggest that positive emotions broaden the scope of attention and negative emotions narrow it down; conversely, a broader spatial scope of attention (distributed attention) leads to better identification of positive emotions, whereas a narrow spatial scope of attention (focused attention) leads to better identification of negative stimuli (Johnson et al., 2010; Srinivasan and Hanif, 2010; Srivastava and Srinivasan, 2010).

The subsequent sections focus on the role of the valence and arousal of emotional content and their influence on AB. The results are organized by the location of the emotional targets in the RSVP stream, i.e., (a) when only the second target (T2) is emotional, (b) when only the first target (T1) is emotional, and (c) when both targets (T1 and T2) are emotional. We also discuss studies (d) wherein the emotional context is manipulated with nonemotional targets in an RSVP task.
2.1 BLINK WITH EMOTIONAL T2
Initial studies investigating the role of emotional content in AB using RSVP tasks mostly used negative emotional words (Anderson, 2005; Anderson and Phelps, 2001; Keil and Ihssen, 2004; Ogawa and Suzuki, 2004). Here, a high arousal negative emotional word as T2 (with distractors and T1 being neutral words) showed a significant enhancement in T2 detection, or an attenuation (reduction) of blink magnitude, at shorter T1–T2 lags (Lag 1–130 ms to Lag 4–400 ms) in comparison to neutral words as T2 (Anderson and Phelps, 2001). Performance for emotional and neutral T2 was comparable at later lags. These results showed that negative emotions can attenuate the blink.

Later, a more comprehensive study investigated the effect of emotions by varying multiple aspects such as valence, arousal, and distinctiveness (Anderson, 2005). The experiments showed that not only negative words (angry, fear) but also positive words (happy) lead to an attenuation of blink magnitude. This reduction was highly dependent on the arousal value of the emotional stimulus: highly arousing T2 words showed less AB than low arousing or neutral T2 words. There was also a significant reduction in T1 performance at earlier lags (backward blink) when T2 was a high arousal negative word. Lag 1 sparing was also observed for neutral and low arousal T2 words, but only when the features of T1 and T2 matched (green colored targets). When this feature was altered by changing the color of the T1 stimulus and making it a monosyllable target (e.g., "NNNNNN"/"OOOOO", Exp 2), there was no Lag 1 sparing. Further, T2 error analysis showed that participants reported "neutral" emotion for all the trials in which T2 was missed (Lags 2–3). Similar results were found for emotional verbs (Keil and Ihssen, 2004), where high arousal pleasant and unpleasant verbs as T2 led to an attenuated blink in comparison to low arousal verbs.
Along similar lines, the role of emotions in modulating blink magnitude has also been studied using schematic faces. When schematic faces displaying angry, happy, or neutral expressions were used as T2 (with a neutral face as T1 and scrambled faces as distractors), an attenuated blink was observed for angry faces compared to happy or neutral faces at Lag 1 (Maratos et al., 2008). Also, in comparison to later lags, there was a significant reduction at Lag 1 for angry faces. However, another study (with flower symbols as T1 and face-like outlines filled with nonsense symbols as distractors) showed that happy faces as T2 attenuated the blink more strongly than angry or neutral faces (Miyazawa and Iwasaki, 2010). Similarly, happy faces were identified better than neutral faces at all lags (with red objects as T1 and black objects as distractors) (Mack et al., 2002). While emotional content influenced the magnitude of AB compared to nonemotional content, the results using schematic faces are not conclusive about the effect of specific emotional content (positive or negative valence) on AB.

Recent studies have used more ecologically valid real emotional faces as stimuli instead of schematic faces. Faces displaying fearful, happy, or neutral expressions as T2 also resulted in an attenuated blink (with green-tinted faces as T1 and scrambled faces as distractors), which was most pronounced for fearful faces, followed by happy faces, and then neutral faces (Milders et al., 2006). Further, T1 (neutral face) performance was better at Lag 2 when T2 was a happy rather than a fearful face, suggesting a backward blink. In another experiment from the same study (Milders et al., 2006), when neutral T2 faces were aversively conditioned (with shock), there was an enhancement in T2 detection (attenuated blink magnitude). Alternatively, in a restricted sample of high and low socially anxious women (de Jong et al., 2009), both happy and angry real faces as T2 (with letters as T1 and inverted neutral faces as distractors) resulted in an attenuated blink, and the blink duration (or dwell time) was shorter for emotional T2 faces than for neutral T2 at close T1–T2 temporal distances. Moreover, in this study, no Lag 1 sparing or backward blink was observed. In another study, using an attentional dwell time paradigm (T1 and T2 presented at different spatial locations without distractors), happy T2 faces were identified more accurately than sad T2 faces at short temporal lags with a neutral T1 (letters) (Srivastava and Srinivasan, 2010). The better performance for happy faces was explained using differences in the scope of attention associated with emotions (Fredrickson and Branigan, 2005; Wadlinger and Isaacowitz, 2006). In general, positive emotions are associated with a broad scope of attention (distributed attention) and negative emotions with a narrow scope of attention (Srivastava and Srinivasan, 2010); hence, with a broader scope of attention, T2 detection was better for happy than for sad faces.

Various studies have used event-related potentials (ERPs) to understand the underlying mechanisms of emotional processing over time. Early sensory facilitation has been observed (120–270 ms post T2 onset) in posterior brain areas for emotional verbs as T2 following a neutral T1 stimulus (Keil et al., 2006). In line with the two-stage account of AB (Chun and Potter, 1995), the authors suggest
preferential selection of emotional information at the early stage 1, leading to subsequent facilitation of consolidation and report at stage 2. They also showed that correct T2 identification led to a significantly enhanced amplitude for T2 in comparison to T1, while incorrect T2 identification led to a reduced T2 amplitude and an enhanced T1 amplitude, suggesting sharing of attentional resources between targets depending on their successful identification.

In another study (Luo et al., 2010), observers performed a dual task with a house stimulus as T1 and a face stimulus (happy, fearful, or neutral) as T2, with inverted neutral faces as distractors. Behaviorally, they showed that during the blink period detection of fearful faces was more accurate than detection of happy or neutral faces. They proposed a three-stage model of emotion processing in time. The first stage (early P1 and N100 components) reflects facilitation (automatic processing) for threatening information such as fear; the second stage, marked by the vertex positive potential (VPP) and N170 components at frontocentral and parietooccipital sites (around 240 ms), represents the distinction between emotional and nonemotional faces; and the third stage (marked by the P300 and N300 ERP components) represents differences between emotion categories (happy vs fear). They suggest that the lack of available attentional resources during the blink period results in reduced amplitudes and shorter latencies at the second and third stages, but not at the first stage, of emotion processing.

While, in general, emotional stimuli as T2 show less blink, the effects of specific emotions on AB magnitude have been mixed. Some studies have shown that negative faces are better identified (Milders et al., 2006); others have shown that positive faces are better identified (Srivastava and Srinivasan, 2010); a few others have found no effect of valence (de Jong et al., 2009). Studies that use emotional words point strongly toward arousal-based modulation rather than purely valence-based explanations. No such specific arousal-based modulation has been reported for schematic or real faces in the above studies, which could be one reason for the conflicting findings for positive and negative emotions. The discrepancies in results across the three types of stimuli could also be due to differences in task instructions, emotional arousal, type of T1 targets, types of distractors, and the sample populations used (de Jong et al., 2009; Milders et al., 2006).

Various explanations have been proposed for the observed emotional effects. Some suggest that the emotional effects are driven by preferential access of emotional information to the limited capacity system (Keil and Ihssen, 2004). This preferential access is due to the arousal value of the emotion (Anderson, 2005; Milders et al., 2006), leading to lower attentional demands or lower activation thresholds (de Jong et al., 2009; Shapiro et al., 1997) for consolidation in working memory. Others have suggested that perceptual saliency could lead to preferential bottom-up perceptual processing of emotional stimuli rather than top-down modulation (Anderson, 2005; de Jong et al., 2009; Mack et al., 2002; Miyazawa and Iwasaki, 2010). A few others suggest that differences in emotional valence result in changes in the scope of attention (Srivastava and Srinivasan, 2010), longer persistence of emotional information during the consolidation stage, or automatic processing (of threat cues) that requires fewer attentional resources (Maratos et al., 2008).
2.2 BLINK WITH EMOTIONAL T1
2.2.1 Blink When an Emotional T1 Has to Be Identified
Initial studies manipulated the emotional content of only T2 to study whether emotions modulate the blink duration or magnitude in the RSVP task. However, the effect of emotional modulation on the perceptual representation of subsequent target items has also been studied by manipulating the emotional content of T1. Such studies have used either a dual-task design (reporting both T1 and T2) or a specific type of single-task design in which an emotional distractor is presented before the target (emotion-induced blindness).

With respect to emotional words, taboo and/or sexual words with high arousal value as T1 have been shown to increase blink magnitude (a greater reduction in T2 performance) at short T1–T2 temporal lags compared to negative, neutral, or positive T1 words (Arnell et al., 2007; Mathewson et al., 2008). Further, when T1 was either a happy or an angry schematic face and T2 was a neutral stimulus, the magnitude of the blink increased (de Jong et al., 2010); moreover, an angry T1 increased the blink significantly more than a happy or neutral T1. These results parallel those obtained with emotional words as T1 targets. Another study (with scrambled scenes as distractors) showed that fearful faces as T1 led to a greater reduction in neutral T2 identification (a picture task) than a neutral face as T1 (Stein et al., 2009). A few studies have also addressed the effects of an emotional T1 on the duration of the blink along with its magnitude (Maratos, 2011). The results revealed that although angry faces as T1 showed a greater blink magnitude than other emotional faces, recovery from the blink (dwell time) was earliest for these faces, indicating that angry faces do capture attention but are easier to disengage from (Maratos, 2011). Similar to the findings with words and schematic faces, when real emotional faces were used as T1 in an attentional dwell time paradigm, a neutral T2 following a happy T1 was detected better than one following a sad T1 at short temporal lags (Srivastava and Srinivasan, 2010); thus, the blink magnitude following happy faces was reduced. The paradigm used by the authors differed slightly from the standard RSVP task in that they presented only two targets, each followed by a mask, at different spatial locations. The results have been explained in terms of the attentional demands and scope of attention (distributed vs focused) associated with the different emotional stimuli used; that is, happy faces might require fewer attentional resources than sad faces, leaving resources available for neutral T2 identification (Srivastava and Srinivasan, 2010). In terms of scope of attention, distributed attention enabled better processing of a neutral target presented along with a happy face compared to a sad face.

Thus, when T1 is emotional (rather than T2) and has to be identified, it leads to a greater reduction in T2 performance; when T2 is emotional, it leads to enhanced detection or an attenuation of the blink. The effects of emotional T1 have been explained using the two-stage model of attentional resource utilization (Chun and Potter, 1995), which suggests that an emotional T1 captures more attention and/or is preferentially encoded, leading to poorer identification of the subsequent T2. A possible explanation has been an enhanced stage 1 representation of emotional stimuli (especially high
arousal taboo T1 stimuli) that might lead to sustained attention during stage 2, leaving fewer attentional resources for T2 processing (Arnell et al., 2007; Mathewson et al., 2008), or an attraction of attentional resources by specific emotional information such as threat, which takes longer to disengage from (Stein et al., 2009).
2.2.2 Blink When an Emotional T1 Does Not Have to Be Identified
A number of studies have been conducted in the past decade in which an emotional T1 acted as a distractor rather than a to-be-reported target, and its influence on the subsequently presented neutral target stimulus was examined. Examples include studies on the effect of an irrelevant, highly arousing, emotionally charged distractor on subsequent target processing in a single-task RSVP design (McHugo et al., 2013; Most et al., 2005). Participants were asked to report the orientation of the subsequently presented neutral target picture (an image rotated left or right). The mere presentation of emotional stimuli produced a deficit in awareness of the subsequent neutral target stimuli at short temporal lags, similar to that observed in AB studies. This reduction has been termed emotion-induced blindness (EIB). The effects of EIB seem similar to those of the emotional AB (EAB), except that the emotional stimuli are task-irrelevant distractors.

Results from ERP studies using such paradigms have suggested that EIB effects can be explained using capacity limitation theory (Kennedy et al., 2014; MacLeod et al., 2017). High arousal emotional pictures as distractors in EIB studies (Kennedy et al., 2014) were associated with a trade-off in the amplitudes of the N2 and P3b components. The N2 component is known to reflect attentional orienting toward the target, while the P3 is known to reflect consolidation in working memory (Kennedy et al., 2014; Pincham and Szucs, 2012; Reiss et al., 2008; Sergent et al., 2005; Shapiro et al., 2006). At shorter lags, an enhanced P3b amplitude is observed for the emotional distractor and, consequently, a reduced P3b during the blink period when the target is missed, whereas in trials where the target is reported its P3b is enhanced relative to that of the distractor. At longer lags, when the target is reported, no difference is found in the P3b component between the emotional distractor and the target. Similar trade-offs are shown for the N2 component.

In contrast, some studies using emotional faces as distractors do not show a reduction in subsequent target identification. Fearful faces presented as an emotional distractor preceding the target (Exp 3) did not cause a reduction in target identification (Stein et al., 2009). Further, in a limited sample of socially anxious women, happy and angry faces as irrelevant emotional distractors also did not cause a reduction in target performance at shorter temporal lags in comparison to neutral or inverted faces as distractors (amidst distractors that were matched in similarity to the targets) (de Jong et al., 2014); in fact, there was enhanced performance at short temporal lags relative to longer lags (de Jong et al., 2014). Another study showed that if the preceding distractor was a highly arousing negative stimulus while the upcoming target was associated with reward, target identification was immune to deterioration (Yokoyama et al., 2015). Multiple explanations have been offered for these contrasting findings. The first is that if the emotional distractor (preceding
the target) is processed semantically (rather than perceptually) and both the distractor and the target belong to similar stimulus categories (e.g., pictures), then there will be a deterioration in target performance; further, if the distractor is highly arousing, it might lead to an even greater reduction in target performance (Most et al., 2005, 2007; Smith et al., 2006). However, when the distractor and the target belong to different stimulus categories (e.g., the distractor being a face and the target a picture), there is no reduction in target performance (Arnell et al., 2007; Huang et al., 2008). It has also been suggested that the effect of an emotional distractor on subsequent target processing may depend on attentional control and task set (Stein et al., 2009).

In comparison to the standard dual-task design requiring report of both T1 and T2, the EIB paradigm shows that even when T1 does not have to be reported, it results in blinks of durations similar to those observed in AB studies. One explanation has been that emotional stimuli automatically capture attention and hence initiate a bottom-up saliency map, so that more resources are involved in their consolidation, leaving fewer resources available for processing the neutral target. Similarities and differences between AB with emotional targets and EIB have been suggested on the basis of behavioral and ERP measures. A suggested difference between standard EAB and EIB has been the allocation of attention: EAB is suggested to involve goal-oriented (top-down) attention, while EIB is shown to involve stimulus-driven (bottom-up) attention (MacLeod et al., 2017; McHugo et al., 2013).

In a novel study, Ní Choisdealbha et al. (2017) used a combination of AB with emotional stimuli and EIB to study whether the mechanisms underlying the two effects are similar or different. In their design, T1 was a rotated landscape/architectural picture and T2 was an emotional stimulus depicting people with gruesome, neutral, or erotic content, embedded among upright neutral distractor pictures depicting landscape/architectural scenes. The main idea was to see whether AB or EIB effects would be similar if the T1 and T2 stimuli belonged to different stimulus categories (pictures and emotional stimuli). The first experiment was similar to typical EIB studies: an emotional distractor was followed by a target after two stimuli (Lag 2) or after eight stimuli (Lag 8), and participants had to report the orientation of the target (a rotated landscape, left or right). Here, they observed the standard EIB effect; that is, there was a reduction in target orientation report at Lag 2, and the effect was larger when the emotional distractor was an erotic stimulus. The second experiment was a standard AB dual task in which the authors measured T2 performance as well as the retroactive blink (also called backward blink), that is, the reduction in T1 performance due to T2 identification. Here, T1 was a rotated landscape and T2 was an emotional target, and participants had to identify both. No blink was found for T2 when T1 had to be reported. A backward blink (T1 cost) was observed at Lag 2 (rather than at Lag 1) when T2 was either erotic or gruesome. One reason for this extended backward blink could be that T1 and the distractors were matched for low-level properties. This parallels observations of a greater blink when the features of T2 match those of the distractors, suggesting that similarity with the distractors could contribute to either the backward blink or the normal forward blink.
The differential effect of no blink for T2 (when both T1 and T2 have to be reported), while the EIB effect is still present, has been attributed to the different stimulus categories used for T1 and T2, suggesting that there is no single central capacity limitation but rather different sets of capacities, and hence parallel processing, for different types of stimuli. Other effects have been explained in terms of the valence (erotic vs gruesome vs neutral) and arousal of the emotions used, the location of the targets, and the overall nature of the task (single vs dual).

Similarly, using an ERP paradigm, a recent study (MacLeod et al., 2017) reported neurophysiological similarities between AB and EIB when T1 is emotional. In both conditions (AB and EIB), taboo and sexual words (in comparison to other emotion categories) led to larger blinks at short T1–T2 lags, associated with an enhanced early posterior negativity (EPN, 200 ms) and an enhanced late positive potential (LPP, 300–800 ms at central electrode sites). The EPN reflects early selection and semantic processing for entry into the working memory stage, while the LPP is similar to the P3 component, reflecting consolidation in working memory (Kennedy and Most, 2015; Schupp et al., 2006; Vogel and Luck, 2002). They also report that such enhancement results from a trade-off between T1 and T2 for correct T2 identification and is strongly correlated with the arousal value of the emotional T1. They explain their findings with the central capacity-limited model (Chun and Potter, 1995), in which the arousal value of T1 and its task relevance (in AB or EIB) lead to impaired processing of a neutral T2.

The above studies highlight multiple findings when emotional stimuli are used as T1 but are not to-be-reported targets. First, in the majority of studies, when T1 is a high arousal distracting emotional picture, it hampers identification of the subsequent neutral picture. In general, the effects in EIB tasks have been explained using enhanced representation and increased competition of emotional stimuli at stage 1, rather than a central capacity-limited stage 2 bottleneck (McHugo et al., 2013; Wang et al., 2012). When emotional faces are used instead of pictures, no such reduction in T2 performance is observed; such faces may even lead to enhanced detection of T2 at short temporal lags relative to longer lags. It has also been suggested that different attentional sets may exist for different types of stimuli, which may lead to either bottom-up capture of attention or differential allocation of attentional resources.
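To make the single-task EIB design concrete, the sketch below assembles the stimulus list for one hypothetical trial: a stream of upright neutral scenes with a task-irrelevant emotional distractor inserted a fixed number of items (the lag) before a rotated target, whose orientation is the only response required. The 100 ms item duration, the stream length, and the category labels are illustrative assumptions rather than parameters of any specific study.

    import random

    def build_eib_trial(stream_length=15, distractor_pos=5, lag=2, soa_ms=100):
        # Returns a list of (item_category, onset_ms) pairs for one illustrative trial.
        target_pos = distractor_pos + lag                    # target follows the distractor by 'lag' items
        target_side = random.choice(["left", "right"])
        items = []
        for i in range(stream_length):
            if i == distractor_pos:
                category = "emotional_distractor"            # task-irrelevant emotional picture
            elif i == target_pos:
                category = "target_rotated_" + target_side   # only this item's orientation is reported
            else:
                category = "upright_filler"                  # neutral upright scene
            items.append((category, i * soa_ms))
        return items

    # A lag-2 trial: with a 100 ms SOA, the target appears 200 ms after the distractor.
    for category, onset_ms in build_eib_trial(lag=2):
        print(onset_ms, category)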
2.3 BLINK WHEN BOTH T1 AND T2 ARE EMOTIONAL
Given that separately manipulating the content and nature of the emotional information in T1 or T2 has differential effects on the temporal representation of subsequent stimuli, some studies have investigated AB by simultaneously manipulating the emotional and arousal content of both T1 and T2. One such study, conducted on a limited sample of socially anxious women (de Jong and Martens, 2007), manipulated the emotional content of both T1 and T2 simultaneously using happy or angry emotional faces (with neutral inverted faces as distractors). The combinations of T1 and T2 emotions could be grouped into two conditions: congruent (happy T1 with happy T2; angry T1 with angry T2) and incongruent (happy T1 with angry T2; angry T1 with happy T2). There were three novel
observations. First, an angry T2 was detected more accurately than a happy T2, suggesting an anger superiority effect. Second, T2 performance was better in the congruent condition (both targets happy or both angry) than in the incongruent condition. Finally, an angry T2 hampered identification of a happy T1 at short temporal lags (the phenomenon termed "backward blink"). In another study using words (Ihssen and Keil, 2009), the two emotional target words (T1 and T2), which varied in valence (pleasant and unpleasant), were matched for arousal (in an RSVP stream of neutral words). With arousal controlled, there were no differences between the congruent and incongruent conditions, in contrast to the previous study (de Jong and Martens, 2007), and no backward blink was observed. The decrement in T2 performance when it was preceded by an unpleasant relative to a pleasant T1 could be taken as support for an anger superiority effect, suggesting that negative valence stimuli are prioritized over positive valence stimuli. Taken together, these two studies suggest that the arousal parameter plays an important role in the dynamics of AB when both T1 and T2 are emotional. Alternatively, the differences between the two studies in T2 performance and in the observation of a backward blink could be attributed to the participant pools: socially anxious women (de Jong and Martens, 2007) versus normal participants (Ihssen and Keil, 2009); socially anxious people are known to have a bias toward negative valence stimuli (for review, see Mathews and MacLeod, 2002). Also, in a recent study, it was demonstrated that the magnitude of the blink was significantly greater when both targets were negative valence (aversive) words than when T2 was a negative valence (aversive) word and T1 was a neutral word (Schwabe and Wolf, 2010).

Further, a recent fMRI study investigated whether similar networks and brain areas are involved when both T1 and T2 are emotional and when one of them is emotional and the other neutral (Schwabe et al., 2011). The results demonstrated that an emotional target (either T1 or T2), in comparison to a neutral one, led to greater activation in the amygdala and orbitofrontal cortex. Moreover, an emotional T1 that led to reduced accuracy for an emotional T2 was associated with greater activity in the anterior cingulate cortex (ACC), insula, and orbitofrontal cortex, whereas enhanced detection of an emotional T2 (when T1 was neutral) was associated with enhanced activation in the amygdala. These results are in line with previous studies showing that damage to the amygdala impairs the benefits of emotion processing (Anderson and Phelps, 2001). In line with the two-stage model of AB, it is suggested that different neural structures may mediate stage 1 and stage 2 processing (Chun and Potter, 1995); the authors associate the amygdala with the capture of attention (stage 1) and the other areas with the holding of attention (stage 2).

The fundamental result of a deterioration in T2 performance when both targets are emotional stimuli can be explained by both the inhibition model (Raymond et al., 1992) and the two-stage model (Chun and Potter, 1995). Both models suggest that T2 cannot be processed until T1 processing is complete, which leads to a decrement in T2 performance for a finite time period corresponding to the duration of AB. Other observations, such as the dependence of AB magnitude on the valence and arousal of the emotional stimuli and the presence of the backward blink, cannot be explained by the
inhibition model (Raymond et al., 1992); the two-stage model (Chun and Potter, 1995) is more suitable for explaining these results. The two-stage model proposes that stage 1 (target identification) is fast, whereas stage 2 (target consolidation) is relatively slow and acts as the rate-determining stage; the magnitude and duration of AB therefore depend on how long stage 2 lasts. For emotional perception, the consolidation stage (stage 2) would consist of mapping target features (identified in stage 1) onto different emotional categories (behavioral responses) and making them available in working memory. Given that the stimuli in Ihssen and Keil (2009) were matched for arousal, the absence of an overall difference in AB magnitude between the congruent and incongruent conditions suggests that stage 2 processing could be independent of the valence of the two target stimuli. On the other hand, the better performance for congruent compared to incongruent conditions in de Jong and Martens (2007), whose stimuli were not matched for arousal, indicates that arousal may influence stage 2 processing. Arousal can be considered additional information that either facilitates (congruent condition) or impedes (incongruent condition) consolidation of the T1 stimulus into a percept, thereby affecting T2 performance. The relatively greater magnitude of AB when T1 was a negative compared to a positive valence stimulus (Ihssen and Keil, 2009) suggests that more attentional resources are expended when T1 is of negative valence, affecting T2 processing more severely than when T1 is of positive valence.

Thus, when both T1 and T2 are emotional, the results can be explained in terms of a resource depletion account (Chun and Potter, 1995). If an emotional T1 holds attention, fewer resources are available to process an emotional T2, whereas if T1 is neutral, more attentional resources are available to process the emotional T2 (reduced blink) (Schwabe and Wolf, 2010). An alternative explanation is that angry faces have a low detection threshold and prioritized access to the limited cognitive resources within a particular time window (de Jong and Martens, 2007). Thus, the above studies suggest that the nature of the T1 content is important in modulating AB as well as in capturing and holding attention (Schwabe and Wolf, 2010).
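One way to see how a capacity-limited second stage could produce such arousal-dependent patterns is with a deliberately simplified toy simulation in the spirit of the two-stage account (not the published model itself): T2's fragile stage 1 trace must survive until stage 2 has finished consolidating T1, and arousal is assumed, purely for illustration, to prolong that trace. All parameter values below are arbitrary.

    import random

    def t2_reported(lag, soa_ms=100, t1_consolidation_ms=500,
                    t2_trace_ms=200, arousal_boost_ms=0):
        # Toy bottleneck: T2's fragile stage-1 trace must outlast the time it spends
        # waiting while stage 2 consolidates T1; "arousal" prolongs the trace.
        t2_onset = lag * soa_ms                        # T2 onset relative to T1
        wait = max(0, t1_consolidation_ms - t2_onset)  # time T2 is queued in stage 1
        lifetime = t2_trace_ms + arousal_boost_ms + random.gauss(0, 100)
        return wait < lifetime                         # True -> T2 is consolidated and reported

    def t2_accuracy(lag, arousal_boost_ms, n=5000):
        return sum(t2_reported(lag, arousal_boost_ms=arousal_boost_ms) for _ in range(n)) / n

    for lag in (1, 2, 3, 5, 8):
        print(lag,
              round(t2_accuracy(lag, 0), 2),     # neutral / low-arousal T2: deep blink
              round(t2_accuracy(lag, 200), 2))   # high-arousal T2: attenuated blink

With these arbitrary settings, the simulated T2|T1 accuracy dips at short lags and recovers by about Lag 5, and the dip is shallower for the "high-arousal" T2, which is the qualitative pattern the resource depletion account is meant to capture.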
2.4 RSVP IN AN EMOTIONAL CONTEXT
The studies reviewed above showed that, depending on whether an emotional stimulus is used as T1 and/or T2, processing of T2 is enhanced or reduced during the blink window. These results were explained predominantly using attentional theories based on capacity limitations; that is, emotional stimuli attract attention, leaving fewer attentional resources available for other targets. This is consistent with capacity limitation theories, which predict a reduction in T2 performance if attentional resources are exhausted by T1. However, various studies using contextual emotional manipulations, such as listening to music or thinking of a positive event (Olivers and Nieuwenhuis, 2005) while doing the AB task, or doing a concurrent secondary task, do not reveal such effects. Rather, the results showed that such contextual secondary
tasks led to better T2 performance rather than worsening it. In another study (Olivers and Nieuwenhuis, 2006), positive, negative, and neutral pictures were presented between RSVP trials. The results showed that T2 performance was enhanced for trials interspersed with positive stimuli relative to those interspersed with negative or neutral stimuli; moreover, positive images also led to better T1 performance. Similarly, other studies report that overall positive mood/affect increases T2 performance, while overall negative mood/affect reduces T2 performance (increases the blink) (Vermeulen, 2010). Further, presentation of fearful faces before the AB task increased the magnitude of AB, while disgust faces attenuated the blink magnitude (Vermeulen et al., 2009). The authors argue that fearful faces enhance the allocation of attentional resources to the RSVP stream, while disgust faces reduce the allocation of attention. Also, a few of the above studies report the presence of Lag 1 sparing and backward blink at shorter lags (Olivers and Nieuwenhuis, 2006; Vermeulen, 2010).

These results have been explained using a combination of recent AB theories and the psychology of positive affect. According to the overinvestment hypothesis of AB, a greater AB arises from overinvestment of attentional resources in the given task and from the distribution of attentional focus (Olivers and Nieuwenhuis, 2005). Therefore, performing a concurrent secondary task results in less interference from irrelevant distractors during the consolidation stage and enhances T2 performance. According to the positive affect hypothesis (Johnson et al., 2010; Olivers and Nieuwenhuis, 2006), positive affect could broaden the scope of attention and increase cognitive flexibility, widening the window of attention and increasing short-term memory storage. Thus, the overinvestment and positive affect hypotheses together suggest that when attention is diffused, an attenuated blink is observed, whereas when attention is narrowly focused on the contents of the task, more resources are available to process distractors, leading to interference during consolidation in working memory (MacLean et al., 2010; Olivers and Nieuwenhuis, 2005, 2006; Srivastava and Srinivasan, 2010). Alternatively, such results have also been explained using the boost and bounce theory (Olivers and Meeter, 2008): negative affect causes a larger bounce (narrowed attention and greater inhibition of distractors), leading to a greater blink than positive affect (Vermeulen et al., 2009), or positive affect reduces the interference from distractors, leading to better T2 performance than negative affect (Vermeulen, 2010).
3 EXPLANATIONS FOR EMOTIONAL EFFECTS: CAPACITY LIMITATIONS, ATTENTIONAL SET, OR PERCEPTUAL EPISODES?
The above studies show that emotions influence the magnitude of AB as well as T1 performance (backward blink), depending on where the emotional stimuli are presented in the RSVP stream and on the temporal distance between the targets. Studies in general show that the magnitude of AB is attenuated when T2 is an emotional and T1 a neutral stimulus (Anderson, 2005; Keil and Ihssen, 2004; Maratos et al., 2008). When T1 is emotional, a further reduction in T2 performance is observed that is dependent on the task set
(whether T1 has to be identified or not). On the other hand, when both targets are emotional stimuli, the magnitude of AB depends on the congruency between T1 and T2 and their associated arousal values (de Jong and Martens, 2007; Ihssen and Keil, 2009). The results for emotional valence are mixed, with some studies suggesting a happy superiority effect and others an anger/threat superiority effect. A few studies observe a reduction in T1 performance that depends on the nature of the T2 stimulus, while Lag 1 sparing is rarely observed in EAB (Anderson, 2005). Further, when T1 and T2 are drawn from different stimulus sets (faces and pictures), no reduction in T2 performance is observed.

These results have been predominantly explained with the two-stage AB model (Chun and Potter, 1995). First, the attenuated AB when T2 is emotional and T1 is a neutral stimulus could be attributed to fewer processing resources being utilized by T1: a neutral T1 has minimal or no arousal or valence and might therefore require fewer resources in the second, consolidation stage, leaving them available for the emotional T2 even at shorter lags. Second, the saliency of emotional stimuli could give them prioritized access to processing resources, disrupting the consolidation stage of T1 processing; this would lead to an attenuated AB but would also cause a backward blink, since T1 processing is disrupted. The majority of studies using emotional stimuli as T2 show no Lag 1 sparing, except when the features of T1 and T2 are matched (green targets, Anderson, 2005). This suggests that Lag 1 sparing depends on the perceptual similarity between the two targets, at least in that context. Such effects can be explained using the earlier inhibition model (Raymond et al., 1992), where the characteristics of the T1 + 1 item (rather than the temporal distance between the targets) determine Lag 1 sparing. However, with respect to emotional faces, schematic or real, no sparing has been observed.

An alternative to the two-stage model (Chun and Potter, 1995) is attentional control theory, which attempts to explain various novel observations that are not well explained by capacity limitation theories (Martens and Wyble, 2010). It suggests that T1 in a given task triggers an attentional set, and when T2 is presented close in time, attentional suppression arising from T1 consolidation in working memory leads to poor identification of T2. This has been described using the episodic simultaneous type/serial token (eSTST) model, which proposes that AB arises from the dynamic interplay of temporal attention and working memory consolidation (Wyble et al., 2009). The basis of the model lies in three components: types, tokens, and temporal attention. Types are representations of target features, which are serially associated with tokens; tokens represent the episodic registration of the temporal order of target presentation in working memory. Presentation of the first target (T1) triggers a brief transient attentional window that leads to attentional enhancement of the target. AB results from the ongoing encoding of targets in that attentional episode and the consequent suppression/inhibition of subsequent items. The computational model accounts for Lag 1 sparing as well as order errors and temporal conjunctions.
Under this model, an emotional stimulus presented as T2 could, owing to its salience and attention-capture properties, lead to a release from suppression (when T1 is neutral), thus reducing blink magnitude. Further, some
studies where T2 is emotional (especially high arousal) show a backward blink. This can be explained within the model: the higher arousal of an emotional T2 attracts attention (stronger episodic representation) and is able to break through the attentional suppression (arising from T1 consolidation), hampering consolidation of the preceding target (T1) (Kandemir et al., 2017). Even though this model can explain Lag 1 sparing and the T1 cost for emotional stimuli, it is difficult to incorporate the nature of the emotional information (valence vs arousal) into it.

Another recent theory that could possibly explain the different observations is the limited snapshots theory (see chapter "Perceptual episodes, temporal attention, and the role of cognitive control: Lessons from the attentional blink" by Snir and Yeshurun), which argues that processing of rapidly presented targets in the temporal domain can be explained in terms of perceptual episodes and cognitive control mechanisms (proactive and reactive) that depend on the nature of the stimuli and the task demands, rather than in terms of attentional limitations. They argue that grouped episodic representations are automatically created when multiple stimuli are presented close in time, while the role of attention is to create a "snapshot" of these representations that helps in the individuation and appropriate reporting of targets. When emotional information is used as T1, T2, or a distractor, various cognitive control mechanisms can come into play; in this sense, certain emotions could be thought to expand the temporal window, while others could reduce it. The T1 cost/backward blink is accounted for by the fact that snapshots created over time degrade as new snapshots are created, which increases the probability that T1 cannot be extracted from the earlier snapshot.

Very few studies using emotional stimuli as targets have reported the kinds of errors participants make when T2 is reported inaccurately (Anderson, 2005; Raymond et al., 1992). These can be divided into inversion errors (reporting the T1 content as T2), reporting no awareness of T2, or, if a third option is given, reporting T2 as neutral rather than as any emotion. It has been reported that most of the T2 errors made by participants consist of reporting T2 as neutral (Anderson, 2005); in that study, T1 and the distractors were neutral words of different lengths, so the errors could reflect either inversion errors or distractor interference during the blink period. If the limited snapshots theory is considered, the T2 errors could be explained as resulting from binding within the same attentional episode (Snir and Yeshurun, under review).

It has also been suggested that AB does not result from a unitary process but arises out of the interplay of multiple processes (Dux and Marois, 2009). These processes involve saliency (a bottom-up factor), the task instruction set (top-down attention), the triggering of an attentional episode, and grouping within the same attentional window (Dux and Marois, 2009). They suggest that all stimuli initially undergo full processing up to conceptual stages, but the strength of processing depends on the saliency of the target (bottom-up) and its similarity to the distractors; greater similarity leads to greater masking and thus weaker representations. With the onset of the first target, an attentional episode is triggered by task instructions.
This attentional episode results in enhancement of the T1 + 1 target/distractor due to the temporal dynamics of attentional deployment (Lag 1 sparing). Further, stimuli in the same attentional window compete for representation at higher stages of processing, where the winner undergoes episodic registration, consolidation, selection, and response.
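The gating and episodic accounts discussed above can likewise be illustrated with a minimal toy, which is not an implementation of eSTST or of the limited snapshots theory but only a caricature of transient attentional gating: a target opens a brief attentional gate, the gate is then suppressed while the first target is encoded, and an item is registered only if its bottom-up strength plus the current gate state clears a threshold. A salience term stands in for bottom-up boosts such as high emotional arousal; all values are arbitrary.

    def gate_level(t_ms, boost_ms=150, encode_ms=400):
        # Toy attentional gate: briefly boosted by T1 onset (t = 0),
        # then suppressed while T1 is being encoded, then back to baseline.
        if 0 <= t_ms < boost_ms:
            return 1.0      # transient enhancement -> Lag 1 sparing
        if boost_ms <= t_ms < encode_ms:
            return -0.5     # suppression during T1 encoding -> the blink
        return 0.0

    def t2_encoded(lag, salience, soa_ms=100, base_strength=0.5, threshold=0.3):
        # An item is registered only if its strength plus the gate state clears a threshold;
        # 'salience' stands in for bottom-up boosts such as high emotional arousal.
        return base_strength + salience + gate_level(lag * soa_ms) >= threshold

    for lag in (1, 2, 3, 5):
        print(lag,
              t2_encoded(lag, salience=0.0),   # neutral T2: lost during suppression
              t2_encoded(lag, salience=0.4))   # salient emotional T2: breaks through

Run as is, the neutral item is registered at Lag 1 (sparing) and at long lags but lost during the suppression window, whereas the salient item clears the threshold at every lag, mirroring the attenuated blink reported for arousing emotional T2s.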
4 FUTURE SCOPE
Emotional stimuli are shown to modulate the time course of the temporal representation of targets. These modulations depend on the valence and the arousal content of the emotions, with arousal dominating the observed effects. Resource depletion accounts have been used to explain the majority of results, including those from ERP and neuroimaging studies (Keil et al., 2006; Kennedy et al., 2014; McHugo et al., 2013), rather than accounts based on temporal windows or perceptual episodes; evaluating the latter should be an important aspect of future studies.

However, many questions remain unresolved. First, most studies using an emotional T1 or T2 do not show Lag 1 sparing (except for a few, e.g., Anderson, 2005), and the reasons are not clear. Second, extended sparing, that is, correctly identifying three targets following T1 (Di Lollo et al., 2005; Olivers et al., 2007), has not yet been studied using emotional stimuli. Also, recent studies highlight the significance of individual differences in AB (Slagter and Georgopoulou, 2013; Willems and Martens, 2016), and such differences with respect to emotion still need to be thoroughly investigated (Anderson, 2005; Sussman et al., 2013). Future experiments could examine individual differences and conduct further systematic studies in patients to better understand the mechanisms involved in EAB. Further studies could also focus on the role of emotional information in understanding temporal episodes in clinical and patient populations (Boraston et al., 2007; Corden et al., 2008; Gaigg and Bowler, 2009; Grynberg et al., 2014; Olatunji et al., 2013; Schönenberg and Abdelrahman, 2013; Yerys et al., 2013).

Current studies focus on the link between perception and the biology of perceptual modulation by examining the role of neuromodulators and gene-related alterations that underlie the differences between standard AB and EAB mechanisms. It has been suggested that neuromodulatory changes due to alterations in neurotransmitters may underlie the decreased processing of T2 targets in both standard AB and EAB tasks; for example, dopamine has been highlighted for general AB modulation (Colzato et al., 2008, 2011) and norepinephrine specifically for emotion processing during the blink window (De Martino et al., 2008; Nieuwenhuis et al., 2005). Some studies also suggest that the AB for emotional words in dual-target RSVP tasks could involve genetic factors (Todd et al., 2013); a deletion variant of the gene ADRA2B (known to regulate the neurotransmitter norepinephrine in the amygdala) may be responsible for enhanced detection of emotional events. A novel model, biased attention via norepinephrine, has been proposed to account for neuromodulatory and genetic contributions to emotion-specific modulations that affect various cognitive processes, including EAB (Markovic et al., 2014).
The model incorporates findings from various imaging, behavioral, and neural studies and suggests a contribution of the LC–NE (locus coeruleus–norepinephrine) system to the facilitatory role of emotion in attention and memory processes. Future studies could specifically address the utility of this model in linking EAB, EIB, and standard AB findings.
REFERENCES Adolphs, R., 2002. Recognizing emotion from facial expressions: psychological and neurological mechanisms. Behav. Cogn. Neurosci. Rev. 1 (1), 21–62. SAGE Publications. https:// doi.org/10.1177/1534582302001001003. Anderson, A.K., 2005. Affective influences on the attentional dynamics supporting awareness. J. Exp. Psychol. Gen. 134 (2), 258–281. https://doi.org/10.1037/0096-3445.134. 2.258. Anderson, A.K., Phelps, E.A., 2001. Lesions of the human amygdala impair enhanced perception of emotionally salient events. Nature 411 (6835), 305–309. Nature Publishing Group. https://doi.org/10.1038/35077083. Arnell, K.M., Killman, K.V., Fijavz, D., 2007. Blinded by emotion: target misses follow attention capture by arousing distractors in RSVP. Emotion 7 (3), 465–477. https://doi.org/ 10.1037/1528-3542.7.3.465. Becker, D.V., Anderson, U.S., Mortensen, C.R., Neufeld, S.L., Neel, R., 2011. The face in the crowd effect unconfounded: happy faces, not angry faces, are more efficiently detected in single- and multiple-target visual search tasks. J. Exp. Psychol. Gen. 140 (4), 637–659. https://doi.org/10.1037/a0024060. Bocanegra, B.R., Zeelenberg, R., 2011. Emotional cues enhance the attentional effects on spatial and temporal resolution. Psychon. Bull. Rev. 18 (6), 1071–1076. Springer. https://doi. org/10.3758/s13423-011-0156-z. Boraston, Z., Blakemore, S.-J., Chilvers, R., Skuse, D., 2007. Impaired sadness recognition is linked to social interaction deficit in autism. Neuropsychologia 45 (7), 1501–1510. https://doi.org/10.1016/j.neuropsychologia.2006.11.010. € H., 2006. Visual Masking: Time Slices Through Conscious and Breitmeyer, B.G., Oğmen, Unconscious Vision. Oxford University Press, New York, USA. Broadbent, D.E., Broadbent, M.H.P., 1987. From detection to identification: response to multiple targets in rapid serial visual presentation. Percept. Psychophys. 42 (2), 105–113. Springer-Verlag. https://doi.org/10.3758/BF03210498. Brosch, T., Scherer, K.R., Grandjean, D., Sander, D., 2013. The impact of emotion on perception, attention, memory, and decision-making. Swiss Med. Wkly. 143, w13786. https:// doi.org/10.4414/smw.2013.13786. Chun, M.M., Potter, M.C., 1995. A two-stage model for multiple target detection in rapid serial visual presentation. J. Exp. Psychol. Hum. Percept. Perform. 21 (1), 109–127. https://doi. org/10.1037/0096-1523.21.1.109. Colzato, L.S., Slagter, H.A., Spape, M.M.A., Hommel, B., 2008. Blinks of the eye predict blinks of the mind. Neuropsychologia 46 (13), 3179–3183. https://doi.org/10.1016/ j.neuropsychologia.2008.07.006. Colzato, L.S., Slagter, H.A., de Rover, M., Hommel, B., 2011. Dopamine and the management of attentional resources: genetic markers of striatal D2 dopamine predict individual
Colzato, L.S., Slagter, H.A., de Rover, M., Hommel, B., 2011. Dopamine and the management of attentional resources: genetic markers of striatal D2 dopamine predict individual differences in the attentional blink. J. Cogn. Neurosci. 23 (11), 3576–3585. https://doi.org/10.1162/jocn_a_00049.
Corden, B., Chilvers, R., Skuse, D., 2008. Emotional modulation of perception in Asperger's syndrome. J. Autism Dev. Disord. 38 (6), 1072–1080. https://doi.org/10.1007/s10803-007-0485-y.
de Jong, P.J., Martens, S., 2007. Detection of emotional expressions in rapidly changing facial displays in high- and low-socially anxious women. Behav. Res. Ther. 45 (6), 1285–1294. https://doi.org/10.1016/j.brat.2006.10.003.
de Jong, P.J., Koster, E.H.W., van Wees, R., Martens, S., 2009. Emotional facial expressions and the attentional blink: attenuated blink for angry and happy faces irrespective of social anxiety. Cognit. Emot. 23 (8), 1640–1652. https://doi.org/10.1080/02699930802490227.
de Jong, P.J., Koster, E.H.W., van Wees, R., Martens, S., 2010. Angry facial expressions hamper subsequent target identification. Emotion 10 (5), 727–732. https://doi.org/10.1037/a0019353.
de Jong, P.J., Koster, E.H.W., Wessel, I., Martens, S., 2014. Distinct temporal processing of task-irrelevant emotional facial expressions. Emotion 14 (1), 12–16. https://doi.org/10.1037/a0034630.
De Martino, B., Strange, B.A., Dolan, R.J., 2008. Noradrenergic neuromodulation of human attention for emotional and neutral stimuli. Psychopharmacology 197 (1), 127–136. https://doi.org/10.1007/s00213-007-1015-5.
Di Lollo, V., Kawahara, J., Shahab Ghorashi, S.M., Enns, J.T., 2005. The attentional blink: resource depletion or temporary loss of control? Psychol. Res. 69 (3), 191–200. https://doi.org/10.1007/s00426-004-0173-x.
Duncan, J., Ward, R., Shapiro, K., 1994. Direct measurement of attentional dwell time in human vision. Nature 369 (6478), 313–315. https://doi.org/10.1038/369313a0.
Dux, P.E., Marois, R., 2009. The attentional blink: a review of data and theory. Atten. Percept. Psychophys. 71 (8), 1683–1700. https://doi.org/10.3758/APP.71.8.1683.
Ekman, P., Friesen, W.V., Ellsworth, P., 1972. Emotion in the Human Face: Guidelines for Research and an Integration of Findings. Pergamon Press, New York, USA.
Engelmann, J.B., Pessoa, L., 2014. Motivation sharpens exogenous spatial attention. Motiv. Sci. 1 (S), 64–72. https://doi.org/10.1037/2333-8113.1.S.64.
Eriksen, C.W., Spencer, T., 1969. Rate of information processing in visual perception: some results and methodological considerations. J. Exp. Psychol. 79 (2), 1–16.
Fredrickson, B.L., Branigan, C., 2005. Positive emotions broaden the scope of attention and thought–action repertoires. Cognit. Emot. 19 (3), 313–332. https://doi.org/10.1080/02699930441000238.
Gaigg, S.B., Bowler, D.M., 2009. Brief report: attenuated emotional suppression of the attentional blink in autism spectrum disorder: another non-social abnormality? J. Autism Dev. Disord. 39 (8), 1211–1217. https://doi.org/10.1007/s10803-009-0719-2.
Grynberg, D., Vermeulen, N., Luminet, O., 2014. Amplification of attentional blink by distress-related facial expressions: relationships with alexithymia and affectivity. Int. J. Psychol. 49 (5), 371–380. https://doi.org/10.1002/ijop.12006.
Hansen, C.H., Hansen, R.D., 1988. Finding the face in the crowd: an anger superiority effect. J. Pers. Soc. Psychol. 54 (6), 917–924. https://doi.org/10.1037/0022-3514.54.6.917.
Horstmann, G., Bauland, A., 2006. Search asymmetries with real faces: testing the anger-superiority effect. Emotion 6 (2), 193–207. https://doi.org/10.1037/1528-3542.6.2.193.
Huang, Y.-M., Baddeley, A., Young, A.W., 2008. Attentional capture by emotional stimuli is modulated by semantic processing. J. Exp. Psychol. Hum. Percept. Perform. 34 (2), 328–339. https://doi.org/10.1037/0096-1523.34.2.328.
Ihssen, N., Keil, A., 2009. The costs and benefits of processing emotional stimuli during rapid serial visual presentation. Cognit. Emot. 23 (2), 296–326. https://doi.org/10.1080/02699930801987504.
Johnson, K.J., Waugh, C.E., Fredrickson, B.L., 2010. Smile to see the forest: facially expressed positive emotions broaden cognition. Cognit. Emot. 24 (2), 299–321. https://doi.org/10.1080/02699930903384667.
Jolicoeur, P., 1998. Modulation of the attentional blink by on-line response selection: evidence from speeded and unspeeded Task1 decisions. Mem. Cognit. 26 (5), 1014–1032. https://doi.org/10.3758/BF03201180.
Juth, P., Lundqvist, D., Karlsson, A., Öhman, A., 2005. Looking for foes and friends: perceptual and emotional factors when finding a face in the crowd. Emotion 5 (4), 379–395. https://doi.org/10.1037/1528-3542.5.4.379.
Kandemir, G., Akyürek, E.G., Nieuwenstein, M.R., Notebaert, L., Iverson, G., Quilodran, R., 2017. Retro-active emotion: do negative emotional stimuli disrupt consolidation in working memory? PLoS One 12 (1), e0169927. https://doi.org/10.1371/journal.pone.0169927.
Keil, A., Ihssen, N., 2004. Identification facilitation for emotionally arousing verbs during the attentional blink. Emotion 4 (1), 23–35. https://doi.org/10.1037/1528-3542.4.1.23.
Keil, A., Ihssen, N., Heim, S., 2006. Early cortical facilitation for emotionally arousing targets during the attentional blink. BMC Biol. 4, 23. https://doi.org/10.1186/1741-7007-4-23.
Kennedy, B.L., Most, S.B., 2015. The rapid perceptual impact of emotional distractors. PLoS One 10 (6), e0129320. https://doi.org/10.1371/journal.pone.0129320.
Kennedy, B.L., Rawding, J., Most, S.B., Hoffman, J.E., 2014. Emotion-induced blindness reflects competition at early and late processing stages: an ERP study. Cogn. Affect. Behav. Neurosci. 14 (4), 1485–1498. https://doi.org/10.3758/s13415-014-0303-x.
Luo, W., Feng, W., He, W., Wang, N.-Y., Luo, Y.-J., 2010. Three stages of facial expression processing: ERP study with rapid serial visual presentation. Neuroimage 49 (2), 1857–1867. https://doi.org/10.1016/j.neuroimage.2009.09.018.
Mack, A., Pappas, Z., Silverman, M., Gay, R., 2002. What we see: inattention and the capture of attention by meaning. Conscious. Cogn. 11 (4), 488–506. https://doi.org/10.1016/S1053-8100(02)00028-4.
MacLean, M.H., Arnell, K.M., Busseri, M.A., 2010. Dispositional affect predicts temporal attention costs in the attentional blink paradigm. Cognit. Emot. 24 (8), 1431–1438. https://doi.org/10.1080/02699930903417897.
MacLeod, J., Stewart, B.M., Newman, A.J., Arnell, K.M., 2017. Do emotion-induced blindness and the attentional blink share underlying mechanisms? An event-related potential study of emotionally-arousing words. Cogn. Affect. Behav. Neurosci. 17 (3), 592–611. https://doi.org/10.3758/s13415-017-0499-7.
Maratos, F.A., 2011. Temporal processing of emotional stimuli: the capture and release of attention by angry faces. Emotion 11 (5), 1242–1247. https://doi.org/10.1037/a0024279.
Maratos, F.A., Mogg, K., Bradley, B.P., 2008. Identification of angry faces in the attentional blink. Cognit. Emot. 22 (7), 1340–1352. https://doi.org/10.1080/02699930701774218.
Markovic, J., Anderson, A.K., Todd, R.M., 2014. Tuning to the significant: neural and genetic processes underlying affective enhancement of visual perception and memory. Behav. Brain Res. 259, 229–241. https://doi.org/10.1016/j.bbr.2013.11.018.
Martens, S., Wyble, B., 2010. The attentional blink: past, present, and future of a blind spot in perceptual awareness. Neurosci. Biobehav. Rev. 34 (6), 947–957. https://doi.org/10.1016/j.neubiorev.2009.12.005.
Mathews, A., MacLeod, C., 2002. Induced processing biases have causal effects on anxiety. Cognit. Emot. 16 (3), 331–354.
Mathewson, K.J., Arnell, K.M., Mansfield, C.A., 2008. Capturing and holding attention: the impact of emotional words in rapid serial visual presentation. Mem. Cogn. 36 (1), 182–200. https://doi.org/10.3758/MC.36.1.182.
McHugo, M., Olatunji, B.O., Zald, D.H., 2013. The emotional attentional blink: what we know so far. Front. Hum. Neurosci. 7, 151. https://doi.org/10.3389/fnhum.2013.00151.
Milders, M., Sahraie, A., Logan, S., Donnellon, N., 2006. Awareness of faces is modulated by their emotional meaning. Emotion 6 (1), 10–17. https://doi.org/10.1037/1528-3542.6.1.10.
Miyazawa, S., Iwasaki, S., 2010. Do happy faces capture attention? The happiness superiority effect in attentional blink. Emotion 10 (5), 712–716. https://doi.org/10.1037/a0019348.
Most, S.B., Chun, M.M., Widders, D.M., Zald, D.H., 2005. Attentional rubbernecking: cognitive control and personality in emotion-induced blindness. Psychon. Bull. Rev. 12 (4), 654–661. https://doi.org/10.3758/BF03196754.
Most, S.B., Smith, S.D., Cooter, A.B., Levy, B.N., Zald, D.H., 2007. The naked truth: positive, arousing distractors impair rapid target perception. Cognit. Emot. 21 (5), 964–981. https://doi.org/10.1080/02699930600959340.
Ní Choisdealbha, Á., Piech, R.M., Fuller, J.K., Zald, D.H., 2017. Reaching back: the relative strength of the retroactive emotional attentional blink. Sci. Rep. 7, 43645. https://doi.org/10.1038/srep43645.
Nieuwenhuis, S., Gilzenrat, M.S., Holmes, B.D., Cohen, J.D., 2005. The role of the locus coeruleus in mediating the attentional blink: a neurocomputational theory. J. Exp. Psychol. Gen. 134 (3), 291–307. https://doi.org/10.1037/0096-3445.134.3.291.
Nieuwenstein, M.R., Potter, M.C., 2006. Temporal limits of selection and memory encoding. Psychol. Sci. 17 (6), 471–475. https://doi.org/10.1111/j.1467-9280.2006.01730.x.
Nobre, K., Coull, J.T., 2013. Attention and Time. Oxford University Press, New York, USA.
Ogawa, T., Suzuki, N., 2004. On the saliency of negative stimuli: evidence from attentional blink. Jpn. Psychol. Res. 46 (1), 20–30. https://doi.org/10.1111/j.1468-5884.2004.00233.x.
Öhman, A., Flykt, A., Esteves, F., 2001. Emotion drives attention: detecting the snake in the grass. J. Exp. Psychol. Gen. 130 (3), 466–478. https://doi.org/10.1037/0096-3445.130.3.466.
Olatunji, B.O., Armstrong, T., McHugo, M., Zald, D.H., 2013. Heightened attentional capture by threat in veterans with PTSD. J. Abnorm. Psychol. 122 (2), 397–405. https://doi.org/10.1037/a0030440.
Olivers, C.N.L., Meeter, M., 2008. A boost and bounce theory of temporal attention. Psychol. Rev. 115 (4), 836–863. https://doi.org/10.1037/a0013395.
Olivers, C.N.L., Nieuwenhuis, S., 2005. The beneficial effect of concurrent task-irrelevant mental activity on temporal attention. Psychol. Sci. 16 (4), 265–269. https://doi.org/10.1111/j.0956-7976.2005.01526.x.
Olivers, C.N.L., Nieuwenhuis, S., 2006. The beneficial effects of additional task load, positive affect, and instruction on the attentional blink. J. Exp. Psychol. Hum. Percept. Perform. 32 (2), 364–379. https://doi.org/10.1037/0096-1523.32.2.364.
Olivers, C.N.L., van der Stigchel, S., Hulleman, J., 2007. Spreading the sparing: against a limited-capacity account of the attentional blink. Psychol. Res. 71 (2), 126–139. https://doi.org/10.1007/s00426-005-0029-z.
Phelps, E.A., Ling, S., Carrasco, M., 2006. Emotion facilitates perception and potentiates the perceptual benefits of attention. Psychol. Sci. 17 (4), 292–299. https://doi.org/10.1111/j.1467-9280.2006.01701.x.
Pincham, H.L., Szucs, D., 2012. Intentional subitizing: exploring the role of automaticity in enumeration. Cognition 124 (2), 107–116. https://doi.org/10.1016/j.cognition.2012.05.010.
Posner, M.I., 1980. Orienting of attention. Q. J. Exp. Psychol. 32 (1), 3–25. https://doi.org/10.1080/00335558008248231.
Posner, J., Russell, J.A., Peterson, B.S., 2005. The circumplex model of affect: an integrative approach to affective neuroscience, cognitive development, and psychopathology. Dev. Psychopathol. 17 (3), 715. https://doi.org/10.1017/s0954579405050340.
Potter, M.C., Nieuwenstein, M., Strohminger, N., 2008. Whole report versus partial report in RSVP sentences. J. Mem. Lang. 58 (4), 907–915. https://doi.org/10.1016/j.jml.2007.12.002.
Raymond, J.E., Shapiro, K.L., Arnell, K.M., 1992. Temporary suppression of visual processing in an RSVP task: an attentional blink? J. Exp. Psychol. Hum. Percept. Perform. 18 (3), 849–860. https://doi.org/10.1037/0096-1523.18.3.849.
Reeves, A., Sperling, G., 1986. Attention gating in short-term visual memory. Psychol. Rev. 93 (2), 180–206. https://doi.org/10.1037/0033-295X.93.2.180.
Reiss, J.E., Hoffman, J.E., Heyward, F.D., Doran, M.M., Most, S.B., 2008. ERP evidence for temporary loss of control during the attentional blink. J. Vis. 8 (6), 12. https://doi.org/10.1167/8.6.12.
Schönenberg, M., Abdelrahman, T., 2013. In the face of danger: exploring the attentional blink to emotional facial expressions in PTSD. Psychiatry Res. 209 (2), 180–185. https://doi.org/10.1016/j.psychres.2012.11.011.
Schupp, H.T., Flaisch, T., Stockburger, J., Junghöfer, M., 2006. Emotion and attention: event-related brain potential studies. Prog. Brain Res. 156, 31–51. https://doi.org/10.1016/S0079-6123(06)56002-9.
Schwabe, L., Wolf, O.T., 2010. Emotional modulation of the attentional blink: is there an effect of stress? Emotion 10 (2), 283–288. https://doi.org/10.1037/a0017751.
Schwabe, L., Merz, C.J., Walter, B., Vaitl, D., Wolf, O.T., Stark, R., 2011. Emotional modulation of the attentional blink: the neural structures involved in capturing and holding attention. Neuropsychologia 49 (3), 416–425. https://doi.org/10.1016/j.neuropsychologia.2010.12.037.
Sergent, C., Baillet, S., Dehaene, S., 2005. Timing of the brain events underlying access to consciousness during the attentional blink. Nat. Neurosci. 8 (10), 1391–1400. https://doi.org/10.1038/nn1549.
Shapiro, K.L., Caldwell, J., Sorensen, R.E., 1997. Personal names and the attentional blink: a visual "cocktail party" effect. J. Exp. Psychol. Hum. Percept. Perform. 23 (2), 504–514.
Shapiro, K., Schmitz, F., Martens, S., Hommel, B., Schnitzler, A., 2006. Resource sharing in the attentional blink. Neuroreport 17 (2), 163–166.
Slagter, H.A., Georgopoulou, K., 2013. Distractor inhibition predicts individual differences in recovery from the attentional blink. PLoS One 8 (5). https://doi.org/10.1371/journal.pone.0064681.
Smith, S.D., Most, S.B., Newsome, L.A., Zald, D.H., 2006. An emotion-induced attentional blink elicited by aversively conditioned stimuli. Emotion 6 (3), 523–527. https://doi.org/10.1037/1528-3542.6.3.523.
Srinivasan, N., Hanif, A., 2010. Global-happy and local-sad: perceptual processing affects emotion identification. Cognit. Emot. 24 (6), 1062–1069. https://doi.org/10.1080/02699930903101103.
Srivastava, P., Srinivasan, N., 2010. Time course of visual attention with emotional faces. Atten. Percept. Psychophys. 72 (2), 369–377. https://doi.org/10.3758/APP.72.2.369.
Stein, T., Zwickel, J., Ritter, J., Kitzmantel, M., Schneider, W.X., 2009. The effect of fearful faces on the attentional blink is task dependent. Psychon. Bull. Rev. 16 (1), 104–109. https://doi.org/10.3758/PBR.16.1.104.
Sussman, T.J., Heller, W., Miller, G.A., Mohanty, A., 2013. Emotional distractors can enhance attention. Psychol. Sci. 24 (11), 2322. https://doi.org/10.1177/0956797613492774.
Todd, R.M., Müller, D.J., Lee, D.H., Robertson, A., Eaton, T., Freeman, N., Palombo, D.J., Levine, B., Anderson, A.K., 2013. Genes for emotion-enhanced remembering are linked to enhanced perceiving. Psychol. Sci. 24 (11), 2244–2253. https://doi.org/10.1177/0956797613492423.
Vermeulen, N., 2010. Current positive and negative affective states modulate attention: an attentional blink study. Personal. Individ. Differ. 49 (5), 542–545. https://doi.org/10.1016/j.paid.2010.04.003.
Vermeulen, N., Godefroid, J., Mermillod, M., 2009. Emotional modulation of attention: fear increases but disgust reduces the attentional blink. PLoS One 4 (11), e7924. https://doi.org/10.1371/journal.pone.0007924.
Vogel, E.K., Luck, S.J., 2002. Delayed working memory consolidation during the attentional blink. Psychon. Bull. Rev. 9 (4), 739–743. https://doi.org/10.3758/BF03196329.
Wadlinger, H.A., Isaacowitz, D.M., 2006. Positive mood broadens visual attention to positive stimuli. Motiv. Emot. 30 (1), 87–99. https://doi.org/10.1007/s11031-006-9021-1.
Wang, L., Kennedy, B.L., Most, S.B., 2012. When emotion blinds: a spatiotemporal competition account of emotion-induced blindness. Front. Psychol. 3, 438. https://doi.org/10.3389/fpsyg.2012.00438.
Willems, C., Martens, S., 2016. Time to see the bigger picture: individual differences in the attentional blink. Psychon. Bull. Rev. 23 (5), 1289–1299. https://doi.org/10.3758/s13423-015-0977-2.
Wright, R.D., Ward, L.M., 2008. Orienting of Attention. Oxford University Press, New York, USA.
Wyble, B., Bowman, H., Nieuwenstein, M., 2009. The attentional blink provides episodic distinctiveness: sparing at a cost. J. Exp. Psychol. Hum. Percept. Perform. 35 (3), 787–807. https://doi.org/10.1037/a0013902.
Yerys, B.E., Ruiz, E., Strang, J., Sokoloff, J., Kenworthy, L., Vaidya, C.J., 2013. Modulation of attentional blink with emotional faces in typical development and in autism spectrum disorders. J. Child Psychol. Psychiatry 54 (6), 636–643. https://doi.org/10.1111/jcpp.12013.
Yokoyama, T., Padmala, S., Pessoa, L., 2015. Reward learning and negative emotion during rapid attentional competition. Front. Psychol. 6, 269. https://doi.org/10.3389/fpsyg.2015.00269.
Young, A.W., Rowland, D., Calder, A.J., Etcoff, N.L., Seth, A., Perrett, D.I., 1997. Facial expression megamix: tests of dimensional and category accounts of emotion recognition. Cognition 63 (3), 271–313. https://doi.org/10.1016/S0010-0277(97)00003-6.