SIMULTANEOUS VERBAL AND AFFECTIVE LATERALITY EFFECTS

M. B. BULMAN-FLEMING* and M. P. BRYDEN

University of Waterloo, Waterloo, Ontario, Canada

(Received 1 November 1993; accepted 31 January 1994)

Abstract: By analyzing the error scores of normal participants asked to identify a specific word spoken in a specific tone of voice (for example, the word "tower" spoken in a happy tone of voice), we have been able to demonstrate concurrent verbal and affective cerebral laterality effects in a dichotic listening task. The targets comprised the 16 possible combinations of four two-syllable words spoken in four different tones of voice. There were 128 participants, equally divided between left- and right-handers, with equal numbers of each sex within each handedness group. Each participant responded to 144 trials on the dichotic task and filled in the 32-item Waterloo Handedness Questionnaire. Analysis of false positive responses on the dichotic task (responding "yes" when only the verbal or only the affective component of the target was present, or when both components were present but at opposite ears) indicated that significantly more errors were made when the verbal aspect of the target appeared at the right ear (left hemisphere) and the emotional aspect was at the left ear (right hemisphere) than when the reverse was the case. A single task has generated both effects, so that differences in participants' strategies or in the way attention is biased cannot account for the results. While the majority of participants showed a right-ear advantage for verbal material and a left-ear advantage for nonverbal material, these two effects were not correlated, suggesting that independent mechanisms probably underlie the establishment of verbal and affective processing. We found no significant sex or handedness effects, though left-handers were much more variable than were right-handers. There were no significant correlations between degree of handedness as measured on the handedness questionnaire and extent of lateralization of verbal or affective processing on the dichotic task. We believe that this general technique may be able to provide information as to the nature and extent of interhemispheric integration of information, and, since it is easily adaptable to other modalities, it holds great promise for future research.

INTRODUCTION

The investigation of laterality effects in normal participants has consistently shown a left hemisphere superiority for the perception and processing of verbal material and a right hemisphere superiority for the processing of nonverbal and emotional, or affective, material [1, 17]. However, relatively few studies have assessed both left and right hemisphere function in the same individuals, and even fewer have employed comparable tasks to assess the functions of the two hemispheres. Thus, while the usual interpretation of such laterality effects is in terms of the manner in which different functions are represented in the two hemispheres, other positions remain tenable. For example, KINSBOURNE [10] has proposed an attentional interpretation, arguing that the left hemisphere is preferentially activated

*Address all correspondence to: Dr M. Barbara Bulman-Fleming, Department of Psychology, University of Waterloo, Waterloo, Ontario, Canada N2L 3G1.

when the participant is expecting to carry out a verbal task, while the right hemisphere is activated when the participant anticipates a nonverbal or emotional task. Several studies have attempted to refute Kinsbourne's position by asking participants to perform both verbal and nonverbal tasks simultaneously. Our laboratory has, for a number of years, used dichotic listening procedures to perform such experiments. In dichotic listening, participants hear different stimuli simultaneously at the left and the right ears and then indicate what they heard. With verbal material, such as words or nonsense syllables, a right ear advantage (REA) is routinely obtained [3]. A large REA for verbal material is a good predictor of the side of cerebral speech lateralization [7, 20]. In contrast, a left ear advantage (LEA) is obtained with musical passages, tonal sequences or emotional tones of voice [3]. To test the KINSBOURNE [10] hypothesis, LEY and BRYDEN [11] had participants listen to sentences spoken in different affective tones and presented dichotically. After hearing each sentence, participants were asked to indicate the appropriate tone of voice by circling the appropriate alternative, and the verbal content by selecting several of the words in the sentence from a set of semantically or phonologically similar foils. They found an REA for the verbal component of the task, and an LEA for the emotional component. Although LEY and BRYDEN [11] found no effect of the order in which the two tasks were performed, this remains essentially a dual-task experiment, in which one must first deal with the verbal aspect of the task and then with the emotional aspect (or vice versa). Furthermore, LEY and BRYDEN [11] did not analyze performance on a trial-by-trial basis, so we do not know how performance on one task affected performance on the other. Thus, it is possible that participants engaged in quite different processes to perform the two parts of the task.

A rather similar procedure was employed by GOODGLASS and CALDERON [8]. They required participants to listen to a spoken sequence of digits in one ear while hearing a three-note sequence of tones played on a piano in the other ear. Subjects were then instructed to report both sequences in a predetermined order. Subjects were more accurate in reporting the digits when they had been presented to the right ear, and more accurate in reporting the tonal sequences when they had been presented to the left ear. In one condition, analogous to that used in the present study, a trained singer sang the digits in different melodic sequences: the participants, all music students, heard two competing sung sequences, one at each ear, and were required to sing the responses. Again, accuracy was greater at the right ear for identifying the entire 3-digit sequence correctly and at the left ear for singing the 3-note musical sequence correctly. While this latter condition does provide evidence against KINSBOURNE'S [10] attentional hypothesis, it requires that the participants be able to perform a singing task, and the scoring is complex. Furthermore, the serial nature of the response opens the task to order effects, as was the case in the LEY and BRYDEN [11] study. Several studies have demonstrated that one can obtain different laterality effects in dichotic listening with the same material, depending on the instructions. For example, SPELLACY and BLUMSTEIN [16] found an REA for vowels presented in the context of other verbal material, but an LEA for the same material when it appeared in a nonverbal context. In a more recent study, BRYDEN and MACRAE [4] presented pairs of words dichotically, with the two members of the pair also differing in affective tone. When participants were required to signal the presence or absence of a particular word target, performance was better at the right ear than at the left.

Conversely, when participants were asked to indicate the presence of any word spoken in a specific affective tone, they were more accurate at the left ear than at the right. Such a finding clearly indicates that


dichotic laterality effects are determined by the nature of the processing carried out, rather than by the acoustic properties of the stimuli. However, an interpretation along the lines offered by KINSBOURNE [10] remains tenable. The present study employed the same stimulus material as used by BRYDEN and MACRAE [4], but altered the instructions to require participants to respond to specific emotion/word combinations. By requiring participants to make a single response to a specific token, the possibility of their employing different strategies to deal with the verbal and emotional aspects of the task was eliminated. Furthermore, since the bulk of neuropsychological evidence indicates that left-handed participants are less consistently lateralized than are right-handers [1], we tested both right-handed and left-handed participants.

METHOD

Stimulus materials

The stimulus material for this study was identical to that employed by BRYDEN and MACRAE [4]. Four different two-syllable words (bower, dower, tower and power) were employed. Each word was spoken by a male speaker in four different affective tones (happy, angry, sad and neutral), providing a total of 16 different tokens. When appropriate tokens had been selected, each token was digitized on a modified PDP-11/40 computer, edited to a common duration of 500 msec, equalized in intensity, and stored. Each item was then paired dichotically with every other item that differed in both affective tone and verbal content, to produce 144 different stimulus pairs with aligned onset times. These pairs were recorded on an audio cassette in a random sequence for presentation through earphones at an average intensity of 75 dB. There was a 3 sec interstimulus interval between the presentation of consecutive stimulus pairs, and a 10 sec interval after each group of 18 trials.

Procedure

In the BRYDEN and MACRAE [4] study, participants were given either a target word or a target emotion and asked to signal when it was present. In the present study, participants were given a specific token (e.g. the word "bower" spoken in an angry tone of voice) as their target, and instructed to circle "yes" on a sheet of paper on each trial on which their target was present and to circle "no" if it was not. Each participant was tested individually in a sound-proof room. In each dichotic session, the participant was first told what a dichotic listening test consisted of and what his or her target item was to be, and then heard each of the 16 word/affect combinations once, presented binaurally. No response was required at this time. The experimenter then stopped the tape recorder and asked the participant whether he or she had identified the target item. If a participant had not heard his or her target, the 16 binaurally-presented items were played again. In very few cases was this necessary.
The experimenter then started the tape recorder again, and left the participant to respond to the 144 dichotic trials. For all combinations of target, sex and handedness, half of the participants were tested with the earphones in one orientation, and half with the reverse orientation, in order to control for minor differences between the two channels. Any given target combination appeared on the left ear on nine trials and on the right ear on nine trials; the remaining 126 trials can be grouped into four types (Table 1):

(1) Trials on which a nontarget word spoken in the target affect was presented to one ear concurrent with the target word spoken in a nontarget affect being presented to the other ear. Thus, both components of the target were present, but not paired in the same token. For example, if the target was "bower, spoken in a happy tone of voice", then "happy tower" appeared at the right ear concurrent with "sad bower" at the left ear on one trial of this type. There were nine trials in which the target affect was at the left ear and the target word at the right ear, and nine trials in which the reverse was the case.

(2) Trials in which only the target word was present, at either the left or the right ear. There were 18 such trials in which the target word was at the left ear, and 18 trials in which it was at the right ear. Thus, on such trials the target affect was not present at either ear.

(3) Trials in which only the target affect was present. There were 18 such trials in which the target affect was at the left ear, and 18 trials in which it was at the right ear.

(4) Trials in which neither the target word nor the target affect was present. There were 36 such trials.
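As a check on the design, the trial taxonomy above can be reproduced in a few lines. The sketch below is our illustration, not code from the original study: it pairs every token with every other token differing in both word and affect, and classifies each of the 144 resulting left/right trials against a sample target.

```python
from collections import Counter
from itertools import product

WORDS = ["bower", "dower", "tower", "power"]
AFFECTS = ["happy", "angry", "sad", "neutral"]
TOKENS = list(product(WORDS, AFFECTS))  # the 16 word/affect tokens

# Dichotic trials: every ordered (left ear, right ear) pairing of two
# tokens that differ in BOTH word and affect -> 144 trials.
pairs = [(l, r) for l, r in product(TOKENS, TOKENS)
         if l[0] != r[0] and l[1] != r[1]]

def trial_type(pair, target):
    """Classify a (left, right) trial against a (word, affect) target."""
    left, right = pair
    tw, ta = target
    if target in pair:
        return "target present"
    word_ear = "L" if left[0] == tw else ("R" if right[0] == tw else None)
    affect_ear = "L" if left[1] == ta else ("R" if right[1] == ta else None)
    if word_ear and affect_ear:
        return "blend: word %s, affect %s" % (word_ear, affect_ear)
    if word_ear:
        return "word only (%s)" % word_ear
    if affect_ear:
        return "affect only (%s)" % affect_ear
    return "neither"

counts = Counter(trial_type(p, ("bower", "happy")) for p in pairs)
```

Running this recovers the counts in the list above: 18 target trials, 9 blend trials in each direction, 36 word-only, 36 affect-only and 36 neither trials.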

One hundred and twenty-eight young adult participants were tested. Thirty-two males and 32 females were right-handed by self-profession and by the Waterloo Handedness Questionnaire [18], and 32 of each sex were left-handed by the same criteria. Two men and two women from each handedness group were assigned each one of the 16 different tokens as their target item. Thus, eight people were tested with each of the specific targets. Table 2 shows the performance of the participants on the Waterloo Handedness Questionnaire. This Table also shows the mean performance on the 12 questions loading most heavily on STEENHUIS and BRYDEN'S [18] skilled handedness factor, and on the eight questions loading most heavily on their unskilled handedness factor. For ease of

Table 1. Stimulus pairings employed

                            Target affect present       Target affect
                            Left ear     Right ear      absent
Target word present
    Left ear                    9*            9             18
    Right ear                   9             9*            18
Target word absent             18            18             36

*Trials on which the complete target item is present.

interpretation, all questionnaire items were scored from -2 (always use left hand) to +2 (always use right hand), and Table 2 shows the item means. As can be seen, the left-handed participants are less strongly handed than are the right-handers. t-Tests comparing the right-handers' scores to the left-handers' scores with the sign changed are highly significant for the total score [t(84)=8.46, P<0.001] and for both the skilled [t(73)=7.90, P<0.001] and unskilled [t(113)=5.24, P<0.001] factor scores.
Table 2. Mean (±S.D.) item scores on the Waterloo Handedness Questionnaire

                              Right-handers (n=64)    Left-handers (n=64)
Total score (32 items)           1.20 (±0.31)           -0.34 (±0.75)
Skilled factor (12 items)        1.68 (±0.24)           -0.83 (±0.88)
Unskilled factor (8 items)       0.80 (±0.49)           -0.25 (±0.69)
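The degrees of freedom reported for these t-tests (84, 73 and 113 with 64 participants per group) indicate an unequal-variance (Welch) test with the Satterthwaite approximation rather than a pooled-variance test. A sketch reconstructing the unskilled-factor comparison from the Table 2 summary statistics (our inference about the computation, not the authors' code):

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t statistic and Satterthwaite df from summary statistics
    (group means, standard deviations and sizes)."""
    v1, v2 = s1 ** 2 / n1, s2 ** 2 / n2
    t = (m1 - m2) / math.sqrt(v1 + v2)
    df = (v1 + v2) ** 2 / (v1 ** 2 / (n1 - 1) + v2 ** 2 / (n2 - 1))
    return t, df

# Unskilled-factor item means from Table 2: right-handers 0.80 (SD 0.49)
# vs. left-handers' scores with the sign changed, 0.25 (SD 0.69); n = 64 each.
t, df = welch_t(0.80, 0.49, 64, 0.25, 0.69, 64)  # reported: t(113) = 5.24
```

The computed statistic (t about 5.2, df about 114) agrees with the reported value to within rounding of the tabled means.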

RESULTS

Overall, participants correctly detected their target on 66% of the trials, and gave false alarms on 13% of the trials in which the specific target was not present. The incidence of false alarms as a function of trial type, pooled over sex and handedness, is shown in Table 3. Earphone orientation had no effect in any of the analyses.

Table 3. Accuracy and false alarms: probability of a "yes" response*

Type of response                         Left ear    Right ear
Correct responses                          0.65         0.66
False alarms
    Affect right, word left                    0.27
    Affect left, word right                    0.37
    Affect alone present                   0.18         0.14
    Word alone present                     0.09         0.12
    Neither word nor affect present            0.03

*The probability of a subject's indicating that he or she had heard the correct target item.


One can ask several questions of these data. First of all, correct responses can be examined for laterality effects. More illuminating, however, are analyses of the three trial types described above (types 1, 2 and 3 in the Procedure section) on which false positive responses (false alarms) were given when only one component of the target was present, or when both were present but at different ears. On these trials, some information concerning the target was detected and led to a positive response. Such trials thus provided us with information about when and how the affective and verbal components of the target were detected. Furthermore, one can also ask whether certain words, affects or word/affect combinations were more easily detected than others. Finally, by computing separate measures of the extent of verbal and of affective lateralization for each participant, one can address questions relating to the independence of the underlying mechanisms. Correct responses and false alarms were analyzed using several analyses of variance, each addressed to a specific question concerning the data. The alpha level was set at 0.01.

Laterality, sex and handedness effects

Analyses of variance, using handedness and sex of participant as between-subjects variables and ear of presentation as a within-subject variable, indicated a highly significant ear effect for false alarms made when only the affective component was present [F(1, 124)=10.78, P<0.01]; the remaining effects did not reach significance (P>0.05). "Real" false alarms (when neither component of the target was present at either ear) were analyzed with the two between-subjects factors only, because it was not possible to ascertain where the participants thought the target had appeared when, in fact, no component of the target was present. Here again, there were no significant effects (P>0.05). These data were also analyzed for laterality effects by determining the number of participants who made more false alarms at the right ear than at the left ear (and those for whom the reverse was the case) for word only, affect only, and blend errors, and then carrying out binomial tests on these frequency counts. The data are summarized in Table 4, presented for left- and right-handed participants separately as well as for the two groups pooled. The binomial tests were conducted on the pooled data, ignoring those participants whose scores were equal for the two ears. All three binomial tests were significant, although the effect was marginal for word only errors, stronger for affect only errors and strongest for the blend errors. This last effect was large because of the large discrepancy between the number of right-handed participants making more affect left/word right errors than affect right/word left errors.


Table 4. F’requencies of left- and right-handcrs producing equal or uneqtul alarms at the left and the right ears* Type of false aliirm

Right-handcrs

Left-handcra

numbers of false

Totals

Word only false alarms WR>WL WL=WL WR
21 30 13

21 31 12

42 hl 25

Alkt only falx AR>AL AR=AL AR
I8 16 30

I7 20 27

35 36 51

42 IO I’

32 IO 32

74 20 34

Binomial test

5 =

P-

I .Y5 0.05

alarm\

“Blend” false alarm.s AL,‘WR >AR,‘WL AL.WR -AR.WL AL’WR
;=2.19 P < 0.05

z13.75 P < 0.00 I

*Abbreviations: WL word false alarms at left car; WRpword f&c alarms at right car: AL-all’ect f&e alarm at left ear: AR-affect false alarms at right ear; AL:WR~pmtarget afkct at left ear and target uord at right: AR:WL -target affect at right ear and target word at left. Entries 1n boldface are the figures used for the binomial tests.
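The z values in Table 4 are consistent with a sign test on the pooled unequal-ear counts, using the normal approximation with a continuity correction. A sketch of that computation (our reconstruction, not code from the study):

```python
import math

def binomial_z(k, n):
    """Normal approximation to a sign test of k successes out of n
    against p = 0.5, with a 0.5 continuity correction."""
    return (abs(k - n / 2) - 0.5) / (0.5 * math.sqrt(n))

# Pooled counts from Table 4, participants with equal ear counts excluded
z_word = binomial_z(42, 42 + 25)    # word only false alarms
z_affect = binomial_z(57, 35 + 57)  # affect only false alarms
z_blend = binomial_z(74, 74 + 34)   # "blend" false alarms
```

The three values round to 1.95, 2.19 and 3.75, matching the table.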

One question that these analyses fail to address is whether there were differences in frequency among the three types of false positive responses. As can be seen from Table 3, participants were more likely to make false positive responses when the affect only was present than when only the word appeared, and also more likely to respond "yes" when both components of their target were present, but on different ears, than when only a single component of the target was present. These results were statistically significant when the data were cast into an ANOVA with two within-subjects variables (ear of presentation and type of false alarm) and with sex and handedness pooled [F(2, 126)=161, P<0.001 for type of false alarm].

Further analyses of correct responses and false alarms were carried out in order to investigate possible effects due to specific aspects of the target tokens, but with data pooled across sex and handedness in view of the failure to find significant effects of these variables in the above analyses. Again, separate analyses of variance were performed on correct responses and on each type of false alarm, this time with affect and word as between-subjects variables and, as above, with ear of presentation as the within-subjects variable. Table 5 contains the means and S.D.s of the proportion of correct responses and false alarms at each ear, pooled across sex and handedness; for ease of interpretation, Table 5 shows the data for the four different words collapsed across affects, and for the four different affects collapsed across words. Analysis of "word only" errors revealed, in addition to the expected main effect of ear [F(1, 112)=19.69, P<0.001], a main effect of the affect of the target token as well as word by affect and ear by affect interactions.
[Table 5. Correct responses and false alarms (proportions, mean ± S.D.) at each ear, pooled across sex and handedness: the four words collapsed across affects, and the four affects collapsed across words.]

Since on such trials participants had correctly identified their target word but mistook the emotion paired with that word for their target emotion when it was not, this indicates that happy and sad tokens were actually more easily detected than were angry or neutral ones. The word by affect interaction, indicating small differences in the extent of the main effect of affect across different words, was difficult to interpret, and is probably only indicative of the idiosyncratic nature of the tokens. The ear by affect interaction was a result of the large difference between false alarms to the right and left ears (more on the right ear) for participants with a neutral tone as their target, as compared to very much smaller right-ear advantages for the other target tones of voice. This makes intuitive sense, as participants instructed to detect a neutral target do not have the same set for affect as do those who are instructed to detect happy, sad or angry targets. "Affect only" false alarms occurred when participants correctly indicated that their target emotion was present, but incorrectly indicated that it was paired with their target word. ANOVA results indicated main effects of affect [F(3, 112)=8.37, P<0.001].
Independence of verbal and affective processing

The data from this study can also provide a test of the hypothesis that the processes involved in the determination of the lateralization of verbal and affective functions are independent, as BRYDEN [2] and JASON et al. [9] have suggested. A measure of the extent of language lateralization was constructed by obtaining the natural logarithm of the ratio of the word only errors at the right ear to those at the left ear, and a similar measure was constructed for affect only errors. Unity was added to both the numerator and denominator to eliminate zero divisors. Thus:

    ln [(word only errors right ear + 1) / (word only errors left ear + 1)]

Since this laterality index is indeterminate if no errors are made, only data from the 37 right-handed and 35 left-handed participants who made both word only and affect only errors were used in the following analyses. If the extent to which language is lateralized to the left hemisphere were positively correlated with the extent to which affective processing resides in the right hemisphere, then one would expect a negative correlation between these two measures. Conversely, the notion of "hemisphericity" (an individual preferring to use one hemisphere for all, or almost all, processing) would imply a positive correlation. If neither of these two scenarios represents the true state of affairs, one would expect no correlation. Pearson correlation coefficients were 0.30 and -0.09 for right- and left-handers, respectively (both P>0.05). These data therefore provide evidence for the independence of the lateralization of language and affective processing.

Degree of handedness and dichotic listening

In order to investigate the possibility of a relationship between scores on the Waterloo Handedness Questionnaire and performance on the dichotic listening task, the three handedness variables shown in Table 2 were derived for each of the participants meeting the above criteria. These handedness variables were correlated, separately for each handedness group, with the measures of degree of verbal and emotional lateralization mentioned above, as well as with a similar measure of the ratio of the two kinds of blend error. The only correlation that even approached statistical significance was that between the unskilled (Factor 2) handedness score and the degree of emotional lateralization in right-handers (r = -0.34, P < 0.05); participants who were more strongly right-handed showed a weaker effect than the less strongly right-handed participants. The interpretation of this is not immediately obvious, and it may well be a result of sampling error, considering the number of correlations performed.
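For reference, the smoothed log-ratio laterality index used in these correlational analyses can be sketched as follows; the error counts in the example are hypothetical, not data from the study:

```python
import math

def laterality_index(errors_right, errors_left):
    """Log ratio of right- to left-ear error counts. Unity is added to
    numerator and denominator to avoid zero divisors. Positive values
    indicate more right-ear (left-hemisphere) errors."""
    return math.log((errors_right + 1) / (errors_left + 1))

# Hypothetical participant: 6 word-only errors at the right ear, 2 at the left
idx = laterality_index(6, 2)  # ln(7/3), about 0.85
```

A participant with equal error counts at the two ears scores exactly zero, so the index is signed and roughly symmetric about no lateralization.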

DISCUSSION

Consistently in all analyses, the data indicate that the verbal component of the target produces significantly more false positive responses when presented to the right ear (left hemisphere) than to the left ear, and that the affective component of the target produces more false positives when presented to the left ear (right hemisphere) than to the right ear. Since there is only a single task, and the participants did not know what type of trial was going to occur next, their attention and expectation did not differ systematically with trial type. Thus, the study demonstrates, at the same time, both a superior sensitivity of the left hemisphere for verbal material and a right hemisphere superiority for the detection of the affective dimension of stimulus material, and clearly demonstrates that factors other than attentional biases [10, 12] must determine auditory laterality effects. It remains possible that participants were selectively attending to one component of the target item in preference to the other. This is supported by the observation that targets on which participants made many errors when the word alone was present were also targets on which they made relatively few errors when the affect alone was present, while targets attracting many errors when the affect alone was present attracted relatively few when the word alone was present [r = -0.95 (N=16) for the correlation between total word only errors and total affect only errors by target]. However, this effect appears to be primarily due to the relative salience of word and affect in the specific tokens, since this correlation is clearly nonsignificant (r = -0.09, N=128) when computed across participants rather than across targets. While


some participants may have used selective attention strategies, this does not detract from the fact that both an REA for verbal material and an LEA for affect can be observed at the same time. The dissociation of verbal and affective effects was seen in the majority of participants, regardless of whether they made more affect errors or more word errors. Thus, attending primarily to the word does not simply enhance left hemisphere processing, nor does attending to the affect enhance right hemisphere processing, as would be expected from KINSBOURNE'S [10] model. It is also noteworthy that the false alarm rates for those trials on which both components were present but not paired are considerably higher than would be predicted from the false alarm rates for the individual components, indicating that there must be interhemispheric integration of information. For example, when the affect component is at the left ear and the word component at the right ear, the false alarm rate is 0.37. If participants were producing false alarms to a single component of the target, one would expect the false alarm rate for the joint stimuli to be the sum of the probability of producing a false alarm when the affect was at the left ear and that of producing a false alarm when the word was at the right ear, minus their joint probability, that is, 0.18 + 0.12 - (0.18 x 0.12) = 0.28. The fact that the observed false alarm rate is considerably higher implies that there is some summation of information from the two ears. A comparable computation for the condition in which the affect is on the right and the word on the left gives a predicted value of 0.22, which is closer to the observed value of 0.27. This suggests the possibility that interhemispheric integration is greater when the information initially arrives at the appropriate hemisphere than when the reverse is the case.
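The independence prediction worked through above is simply inclusion-exclusion over the two single-component false alarm rates. A small sketch using the Table 3 values:

```python
def independent_prediction(p_affect, p_word):
    """False alarm rate expected if the two components trigger 'yes'
    responses independently: P(A or B) = P(A) + P(B) - P(A)P(B)."""
    return p_affect + p_word - p_affect * p_word

# Single-component false alarm rates from Table 3
pred_affect_L_word_R = independent_prediction(0.18, 0.12)  # observed: 0.37
pred_affect_R_word_L = independent_prediction(0.14, 0.09)  # observed: 0.27
```

Both predictions (0.28 and 0.22) fall below the observed blend rates, which is the basis for the inference that information from the two ears is being combined.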
There exists in the literature on the lateralization of affect considerable disagreement as to whether the right hemisphere subserves all emotions [6], or whether there is a hemisphere by valence interaction, with the left hemisphere being more involved in positive affect [14]. The present data, like those of BRYDEN and MACRAE [4], provide little support for the latter notion. All affects showed an LEA, and those for positive (happy) and negative (sad, angry) emotions were of similar magnitude. It is important to realize, however, that the perception of emotional information may well involve different neural substrates from those concerned with emotional expression and experience [5], and that participants may use different cognitive strategies for the detection of emotions of differing valence. This latter point was suggested by REUTER-LORENZ et al. [15] as a possible explanation of their finding of a valence by hemisphere interaction in reaction times to the detection of happy and sad faces. As in most other studies of the lateralization of affect, we employed only a single speaker and a single digitized token to represent each word-affect combination. One might therefore argue that these data are not representative of speakers in general. While we plan to investigate "speaker" as a variable by producing parallel forms of this test with different speakers, we are reassured by the observation that the effects of affect in the present study are consistent across words. It is notable that there were no significant differences between left-handers and right-handers in any of the performance measures. In most previous research, left-handers have been shown to have a reduced REA for dichotically-presented verbal material [1]. In the few studies involving nonverbal material, left-handers again show a reduced laterality effect (e.g. Ref. [13]).
While handedness effects failed to reach significance in the present study, it is worth pointing out that both the REA for the verbal component and the LEA for the affective component were reduced in our left-handers as compared with our right-handers. The increased variance found in left-handers prevented any of the comparisons from reaching

statistical significance. This only serves to emphasize how small the neuropsychological differences are between left-handers and right-handers. We believe that this technique of having participants respond to a joint target (that is, one comprising a component thought to be processed more efficiently by the right hemisphere as well as one believed to be better processed by the left hemisphere) and then analyzing the false positive responses holds great promise for future research. The procedure can be generalized to other tasks, both auditory and visual, and can possibly provide insights into the nature of the integration of information between the two hemispheres.

Acknowledgements: This research was supported by a grant from the Natural Sciences and Engineering Research Council of Canada to M.P.B. Portions of these data were presented at the 1993 meeting of the International Neuropsychological Society and at the 1993 joint meeting of the Canadian Society for Brain, Behaviour, and Cognitive Science and the Experimental Psychology Society. The authors would like to thank Todd Mondor for discussions that led to this study, Gina Grimshaw and G. E. MacKinnon for valuable comments on the manuscript, and Tracy Cocivera and Karen Clarkson for assistance in testing the participants and analyzing the data. We also thank two anonymous reviewers, both of whom provided very helpful comments on the first version of this manuscript.

REFERENCES

1. BRYDEN, M. P. Laterality: Functional Asymmetry in the Intact Brain. Academic Press, New York, 1982.
2. BRYDEN, M. P. The nature of complementary specialization. In Two Hemispheres-One Brain: Functions of the Corpus Callosum, F. LEPORE, M. PTITO and H. H. JASPER (Editors), pp. 463-469. Alan R. Liss, New York, 1986.
3. BRYDEN, M. P. Dichotic studies of the lateralization of affect in normal subjects. In Handbook of Dichotic Listening, K. HUGDAHL (Editor), pp. 1-44. John Wiley, Chichester, 1988.
4. BRYDEN, M. P. and MACRAE, L. Dichotic laterality effects obtained with emotional words. Neuropsychiatry, Neuropsychology, and Behavioral Neurology 1, 171-176, 1988.
5. DAVIDSON, R. J. Cerebral asymmetry and emotion: Conceptual and methodological conundrums. Cognition and Emotion 7, 115-138, 1993.
13. VIXH, M. Hemisphere specialization and the perception of emotion: Evidence from right-handers and from inverted and non-inverted left-handers. Neuropsychologia 21, 687-692, 1983.
16. SPELLACY, F. and BLUMSTEIN, S. The influence of language set on ear preference in phoneme recognition. Cortex 6, 430-439, 1970.
17. SPRINGER, S. P. and DEUTSCH, G. Left Brain, Right Brain, 3rd Edn. W. H. Freeman, San Francisco, 1990.
18. STEENHUIS, R. E. and BRYDEN, M. P. Different dimensions of hand preference that relate to skilled and unskilled activities. Cortex 25, 289-304, 1989.
19. ZATORRE, R. J. Perceptual asymmetry on the dichotic fused words test and cerebral speech lateralization determined by the carotid sodium amytal test. Neuropsychologia 27, 1207-1219, 1989.