Journal of Affective Disorders, 30 (1994) 61-71 © 1994 Elsevier Science B.V. All rights reserved
The classification of facial emotions: a computer-based taxonomic approach *

Issy Pilowsky and Mary Katsikitis

Department of Psychiatry, University of Adelaide, Adelaide, South Australia, Australia

* Corresponding author.

(Received 3 June 1993) (Revision received 1 August 1993) (Accepted 18 August 1993)
Summary

This study investigated whether the six 'fundamental' expressions of emotion each have configurational properties which would result in their being grouped into classes by a classification program. Twenty-three actors posed the six 'fundamental' emotions of happiness, surprise, fear, disgust, anger and sadness, together with a neutral expression. Still images of these videotaped expressions were digitised and distance measures between facial landmark points were obtained. These measures were subjected to a numerical taxonomy procedure which generated five classes. Class 1 contained almost 70% of the happiness expressions. In Class 2 the majority of expressions were of surprise. Classes 3, 4 and 5 each consisted of mixtures of emotions. Class 5, however, was distinguished from all other classes by the complete absence of happiness expressions. The typical facial appearance of members of each class is described (based on distance measures). These findings support the salience of happiness among emotional expressions and may have implications for our understanding of the brain's function in the early development of the human infant as a social organism.
Key words: Facial expression; Classification; Taxonomy
Introduction

In recent years there has been a discernible increase of interest in facial expressions as a subject for psychiatric research. Such research into the ability of psychiatric patients to transmit (encode) and recognise (decode) facial emotions has obvious relevance to
the appraisal of their affective state as well as their capacity for interpersonal relationships. The history of modern ‘scientific’ facial expression research is usually considered to have begun with the publication of Darwin’s ‘The Expression of the Emotions in Man and Animals’ in 1872, in which he proposed the existence of a number of facial expressions which were universal, the result of an evolutionary process and therefore evidence for the notion of a ‘common stock’. In exploring the field, Darwin faced the same problem as all
other researchers: how to capture the facial expressions he wished to study. In his case he used the 'new' technology of photography to obtain expressions posed by actors, and one series of a child crying which had fortuitously been taken by a professional photographer in the course of a sitting for a portrait. Photographs were also used to study facial expression in psychiatric illness by Dr. Hugh Diamond, who presented an account of his work, entitled 'On the application of photography to the physiognomic and mental phenomena of insanity', to the Royal Society in 1856 (Gilman, 1976). Since those early days, developments in technology have continued to influence facial expression research. The principal advances have been the availability of cinematography, video recording, video printers (to produce a hard copy) and, at the same time, microcomputers able to process the large amounts of data which can be generated from facial expressions.

Pre-eminent among those who have developed facial expression measurement systems to produce quantitative data suitable for further analysis are the University of California, San Francisco psychological researchers headed by Paul Ekman. Their approach has been an 'anatomical' one which involves recognising and scoring all the muscle actions which underlie facial movements. While this method has many advantages by virtue of its comprehensiveness, it has some logistic drawbacks. Of these the most salient are (1) the time-consuming nature of the scoring process, and (2) the need for scorers to undergo thorough training.

More recently, Thornton and Pilowsky (1982) reported the use of a mathematical model of the face which could be used as a basis for quantifying facial movements. This model was further refined and developed to produce a more aesthetic outline of the face as well as twelve measures representing the distances between readily locatable facial landmarks, such as the outer canthus of the eye and the corner of the mouth (Pilowsky et al., 1985, 1986). A more detailed description of the procedure for obtaining facial measures is provided below. The advantages of this approach are that (1) information is retrieved from faces relatively rapidly using a digitising board; (2) very little training is required to learn how to digitise the photographs of faces taken from videotapes or other sources; and (3) the computer programme produces (a) an outline drawing of the face which comprises the essential information required for making judgements about facial expression, while conferring complete anonymity on the subject with no clues as to age, sex, social class or culture, and (b) 12 measures of the distances between key landmarks. The validity of the facial outlines has been demonstrated (Katsikitis et al., 1990) and the system has been applied to the investigation of facial expressions in Parkinson's disease and depression (Katsikitis and Pilowsky, 1988, 1991).

Thus far, the Facial Expression Measurement (FAC.E.M.) programme has concentrated on the smiling expression, i.e., the emotion of joy or happiness. This seemed a natural starting point for a number of reasons, viz. it is the commonest expression to be seen in everyday life; it is the most reasonable expression to evoke from subjects (at least from an ethical perspective); and it is an important expression in clinical psychiatry, since its presence, or absence and quality, is often used as an indicator of depression, although rarely quantified. In light of these considerations it seemed reasonable to consider smiling as a phenomenon worthy of study in its own right at basic and clinical levels, in the expectation that such a focus should produce useful information about affect (in particular depression) and its display.

This study seeks to continue the validation process by extending the application of the FAC.E.M. programme beyond smiling, to explore the relationship of the twelve measures to all the emotional expressions which Ekman et al. (1969) regard as 'fundamental', universal and recognised pan-culturally. These are the expressions of happiness, surprise, fear, disgust, anger and sadness. Their study will be discussed in greater detail below, in relation to the findings of the present study.

Method

Subjects

The expressions analysed in this study were posed by 20 first-year drama students and three
psychology graduates. All subjects were told that the study was concerned with the classification of emotion based on the facial expression of 6 'fundamental' emotions.

Procedure

Subjects were seen individually and asked to adopt a comfortable position facing a portable video camera on a tripod. The camera was focused on the face and the 'actors' were asked to pose each of the following expressions in turn: happiness, surprise, fear, disgust, anger, sadness and also a neutral expression, with the face in repose. No instructions were given as to the facial appearance of any emotion. All expressions were video-recorded. Two independent judges, with no training in the judgement of facial expression, observed the videotaped expressions on a monitor and selected the peak of an expression of each emotion for each actor by pressing the 'freeze-frame' button. A black and white hard copy was made of the agreed expression from the still image of the face on the screen. A total of 161 expressions (23 actors × 7 expressions) were thus obtained.

The computer programme for the quantification of facial expression (FAC.E.M.) has previously been described and is based on a mathematical model of the face (Thornton and Pilowsky, 1982; Pilowsky et al., 1985, 1986; Katsikitis et al., 1990). The quantification of a facial expression involves the digitisation of 62 specified points directly from the still image of the face. This coordinate information is fed into the model and a line drawing of the mouth, nose, eyes, eyebrows and facial outline is then produced. In this form the face is anonymous and provides only the essential information required for a judgement about expression. In addition, twelve scores are generated which represent the distances between facial landmarks. The distance measures refer to numerical transformations of these distances, and are expressed as a percentage of an existing range (unique for each measure), reflecting the maximal excursion of those points on the face (Katsikitis et al., 1990). The twelve distance measures are labelled as follows: 1. End-Lip Measure, 2. Mouth-Width Measure, 3. Mouth-Opening Measure, 4. Mid-Top-Lip Measure, 5. Mid-Lower-Lip Measure, 6. Top-Lip Thickness Measure, 7. Lower-Lip Thickness Measure, 8. Eye-Opening Measure, 9. Top-Eyelid/Iris Intersect Measure, 10. Lower-Eyelid/Iris Intersect Measure, 11. Inner-Eyebrow Separation Measure, 12. Mid-Eyebrow Measure. All horizontal measures are divided by the distance between the outer canthi of the eyes, and the vertical measures by the length of the nose, in order to produce standardised measures which allow for face size or movement. Each facial expression is described by these 12 scores (Figs. 1 and 2).

Fig. 1. Vertical facial measures.

Fig. 2. Horizontal facial measures.
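As a rough illustration (not part of the original FAC.E.M. programme), the following sketch shows how distance measures of this kind might be computed and standardised from digitised landmark coordinates; the landmark names, coordinates and percentage ranges used here are illustrative assumptions only.

```python
# Minimal sketch (not the FAC.E.M. programme): standardising facial distance
# measures from digitised landmark coordinates. Landmark names, coordinates
# and the percentage ranges are illustrative assumptions only.
import numpy as np

def distance(a, b):
    """Euclidean distance between two (x, y) landmark points."""
    return float(np.linalg.norm(np.asarray(a) - np.asarray(b)))

def standardise(value, low, high):
    """Express a normalised distance as a percentage of an assumed range."""
    return 100.0 * (value - low) / (high - low)

# Hypothetical digitised landmarks (pixel coordinates from a still image).
landmarks = {
    "outer_canthus_left":  (112.0, 140.0),
    "outer_canthus_right": (218.0, 141.0),
    "nose_top":            (165.0, 150.0),
    "nose_base":           (166.0, 210.0),
    "mouth_corner_left":   (130.0, 245.0),
    "mouth_corner_right":  (200.0, 246.0),
    "mid_top_lip":         (165.0, 238.0),
    "mid_lower_lip":       (165.0, 262.0),
}

# Reference lengths used to allow for face size: inter-canthal distance for
# horizontal measures, nose length for vertical measures.
intercanthal = distance(landmarks["outer_canthus_left"],
                        landmarks["outer_canthus_right"])
nose_length = distance(landmarks["nose_top"], landmarks["nose_base"])

# A horizontal measure: mouth width relative to inter-canthal distance.
mouth_width = distance(landmarks["mouth_corner_left"],
                       landmarks["mouth_corner_right"]) / intercanthal
# A vertical measure: mouth opening relative to nose length.
mouth_opening = distance(landmarks["mid_top_lip"],
                         landmarks["mid_lower_lip"]) / nose_length

# The (low, high) ranges are unique to each measure in the paper; the figures
# below are placeholders.
print("Mouth-Width Measure:   %.1f" % standardise(mouth_width, 0.3, 1.0))
print("Mouth-Opening Measure: %.1f" % standardise(mouth_opening, 0.0, 0.8))
```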
The facial measures were analysed using the numerical taxonomy programme, SNOB (Wallace and Boulton, 1968; Wallace, 1986; Pilowsky et al., 1969). This programme seeks to establish the classes that best fit a particular set of data. To achieve this, an information measure is used which calculates the 'effort' required to classify the data into classes representing the best fit for those data; the shortest message length serves as the 'measure of goodness' and estimates that effort. Unlike clustering procedures which depend on the user's judgement to decide on the number of classes present, SNOB provides its own classification based on the likeness of the attributes describing the members of classes. The significant attributes discriminating each class are also calculated, in terms of the difference between the mean class value for a particular distance measure and the population mean.
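SNOB itself is not reproduced here, but the flavour of its approach, choosing the number of classes by an information criterion rather than by user judgement, can be sketched with a Gaussian mixture model scored by BIC as a stand-in for message length; the data, candidate class range and criterion below are illustrative assumptions, not the SNOB algorithm.

```python
# Sketch of information-criterion class selection in the spirit of SNOB
# (minimum message length). A Gaussian mixture scored by BIC is used as a
# stand-in criterion; this is not the SNOB programme itself.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Placeholder data: 161 expressions x 12 standardised distance measures.
X = rng.normal(loc=50.0, scale=15.0, size=(161, 12))

best_k, best_score, best_model = None, np.inf, None
for k in range(1, 9):                      # candidate numbers of classes
    gm = GaussianMixture(n_components=k, covariance_type="diag",
                         random_state=0).fit(X)
    score = gm.bic(X)                      # shorter "message" = better fit
    if score < best_score:
        best_k, best_score, best_model = k, score, gm

labels = best_model.predict(X)             # class membership for each face
print("classes chosen:", best_k)
print("class sizes:", np.bincount(labels))
```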
Results

Optimal minimisation of the information measure was achieved when 160 expressions were grouped into five main classes (a sixth class contained only one member, i.e., one expression). The distribution of posed expressions across classes is presented in Fig. 3 (Figs. 4 to 8 display the actual line drawing representations of the expressions within each class). From Table 1, it can be seen that 16 (89%) of the 18 members of Class 1 were expressions of happiness. In Class 2, 8 (61%) of the 13 members were posing the surprise expression. In Class 3 (n = 13) all expressions were represented, but none by more than 3 faces. Class 4 (n = 59) was characterised by predominantly negative emotions, with 15 (25%) fear, 10 (17%) surprise, 9 (15%) neutral, 8 (14%) disgust, 8 (14%) anger, 7 (12%) sadness and only 2 (4%) happiness expressions. Class 5 (n = 57) was distinguished from all the other classes by a total absence of the happiness expression, with 97% of the class consisting of the negative emotions of sadness (23%), neutral (23%), anger (21%), disgust (19%), fear (11%), and surprise (4%) expressions.

Fig. 3. Frequency of emotions within classes.

TABLE 1
Frequency of emotions within classes

Class         H    Su    F    D    A    Sa    N
1 (n = 18)   16     2    -    -    -     -    -
2 (n = 13)    2     8    1    1    1     -    -
3 (n = 13)    3     1    1    2    3     2    1
4 (n = 59)    2    10   15    8    8     7    9
5 (n = 57)    -     2    6   11   12    13   13

Note: H = happiness; Su = surprise; F = fear; D = disgust; A = anger; Sa = sadness; N = neutral.

Table 2 presents the means of the significant measures characterising each class. With regard to Class 1, End-Lip Measure, Mouth-Width Measure, Mouth-Opening Measure, Mid-Top-Lip Measure and Top-Lip Thickness Measure were found to be significant contributors to the class membership. The facial expression for members of this class is characterised by elevated corners of the mouth, an elevated and 'thicker' top lip and a mouth open and wide. This finding is congruent with mouth measures found in previous work using the distance measures to distinguish smiles from other expressions (Pilowsky et al., 1986; Katsikitis et al., 1990).

Fig. 4. Line drawing representations of the expressions in Class 1.
In the case of Class 2 (surprise) the significant attributes were the Mouth-Opening Measure and the Mid-Lower-Lip Measure. The combination of these measures represents a characteristic feature of surprise, i.e. an open mouth as the jaw drops (Ekman and Friesen, 1975). This expression frequently blends with happiness (Ekman and Friesen, 1975), as is the case in Classes 1 and 2.

Class 3 is characterised by the relatively short distances of End-Lip Measure, Mid-Top-Lip Measure, Mid-Lower-Lip Measure, Eye-Opening Measure and Mid-Eyebrow Measure. The facial expression represented by these facial measures may be described as follows: raised upper and lower lips, raised outer corners of the mouth, narrow palpebral fissures and lowered eyebrows. Ekman and Friesen (1975) describe a very similar profile in the description of the blend of the two emotions of anger and disgust, which are also two of the prominent emotions found in Class 3. Happiness, which is also represented in this class, is found commonly in blends with anger and contempt. However, it is difficult to label this class of 13 faces due to the very limited representation of each expression.
Fig. 5. Line drawing representations of the expressions in Class 2.

Fig. 6. Line drawing representations of the expressions in Class 3.
Fig. 7. Line drawing representations of the expressions in Class 4.
Fig. 8. Line drawing representations of the expressions in Class 5.
Class 4 consists of the simultaneous occurrence of fear with sadness, anger or disgust, as reported by Ekman and Friesen (1975). The fear/surprise association is the dominant blend in this class, and End-Lip Measure, Mouth-Opening Measure, Mid-Top-Lip Measure and Mid-Lower-Lip Measure were the significant facial measures. The facial display of this group consists of the outer corners of the mouth being drawn inwards, a slightly open mouth and the lowering of the top and bottom lips. This finding accords with the evidence for the involvement of the mouth in the display of both fear and surprise expressions. The role of the upper facial area, including the movement of the eyes and eyebrows, was not evident here.

Class 5 is characterised by the total absence of any faces presenting the happiness expression. It consists almost entirely (with the exception of 2 surprise faces) of the 'negative' emotions: fear, disgust, anger, sadness and 'neutral'. It is characterised by significantly greater distance measures on End-Lip Measure and Mid-Top-Lip Measure, with shorter distances for Mouth-Width Measure and Mid-Lower-Lip Measure than the happiness class (Class 1). It is, in essence, a 'long' face.

TABLE 2
Means (and standard deviations) of facial measures within classes

Facial Measure                        Popn mean     Class 1          Class 2          Class 3          Class 4          Class 5
End-Lip Measure                       45.6 (11.6)   27.3 *** (5.2)   54.6 (10.6)      30.9 *** (9.1)   55.1 *** (6.1)   42.8 *** (3.8)
Mouth-Width Measure                   33.8 (15.2)   60.2 *** (9.7)   33.7 (13.9)      36.7 (8.9)       29.8 (13.9)      28.9 * (10.3)
Mouth-Opening Measure                 36.2 (24.0)   58.7 *** (12.0)  90.9 *** (28.1)  32.0 (20.4)      30.6 *** (13.6)  24.0 *** (9.0)
Mid-Top-Lip Measure                   45.0 (15.3)   31.5 ** (10.4)   50.1 (15.6)      24.9 *** (6.7)   57.6 *** (10.8)  39.4 *** (9.3)
Mid-Lower-Lip Measure                 31.5 (9.3)    33.1 (7.5)       51.4 *** (7.2)   21.1 ** (4.6)    34.9 *** (9.3)   25.2 *** (3.1)
Top-Lip Thickness Measure             24.3 (17.4)   45.6 *** (16.9)  24.9 (18.1)      23.6 (19.4)      20.3 (15.0)      21.3 (14.7)
Lower-Lip Thickness Measure           44.1 (15.1)   47.2 (15.0)      58.6 (18.3)      45.5 (14.5)      44.9 (15.0)      40.1 (12.8)
Eye-Opening Measure                   46.4 (10.0)   43.6 (10.2)      55.2 (7.7)       35.1 ** (7.7)    51.2 (9.7)       42.8 ** (6.5)
Top-Eyelid/Iris Intersect Measure     76.0 (11.1)   80.9 (12.9)      75.4 (13.6)      76.5 (8.1)       76.7 (9.0)       75.2 (7.2)
Lower-Eyelid/Iris Intersect Measure   49.4 (11.6)   55.2 (12.3)      46.6 (9.9)       49.0 (7.6)       50.5 (11.4)      47.4 (12.2)
Inner-Eyebrow Separation Measure      83.9 (19.1)   79.1 (17.6)      95.6 (16.7)      71.2 (20.7)      80.9 (18.2)      88.3 (18.5)
Mid-Eyebrow Measure                   28.2 (18.5)   30.4 (14.8)      46.7 (17.5)      1.4 *** (8.6)    32.9 (16.6)      24.8 (15.9)

Note: * sig 0.05, ** sig 0.01, *** sig < 0.001 denote significance levels for difference from the population mean.
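The paper does not state the exact test behind the asterisks in Table 2; the sketch below assumes a one-sample t-test of a class mean against the population mean, computed from the summary statistics, purely to illustrate how such flags could be derived.

```python
# Hedged sketch: flagging a class mean that differs from the population mean.
# Assumes a one-sample t-test computed from summary statistics; the actual
# test used alongside SNOB is not specified in the paper.
import math
from scipy import stats

def significance_flag(class_mean, class_sd, n, popn_mean):
    """Return a t statistic, two-sided p value and a star flag."""
    t = (class_mean - popn_mean) / (class_sd / math.sqrt(n))
    p = 2.0 * stats.t.sf(abs(t), df=n - 1)
    stars = "***" if p < 0.001 else "**" if p < 0.01 else "*" if p < 0.05 else ""
    return t, p, stars

# Example figures taken from Table 2: Class 1 End-Lip Measure, mean 27.3,
# sd 5.2, n = 18, against the population mean of 45.6.
t, p, stars = significance_flag(27.3, 5.2, 18, 45.6)
print("t = %.2f, p = %.4g %s" % (t, p, stars))
```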
Discussion

The present study involved the classification of emotional expressions based on the distance measures between facial landmarks. Two main classes showing dominant membership of one expression, viz. the smile (Class 1) and the surprise expression (Class 2) respectively, were differentiated. The third class was not as clearly characterised by one dominant expression but included faces representing all of the emotions. Class 4 was characterised by predominantly negative expressions, while the striking feature of Class 5 was the complete absence of the happiness expression.

This study provides support for the salience of happiness and the smile in the delineation of the emotion categories. It is well documented that smiling is the most easily recognised of all the fundamental expressions (Drag and Shaw, 1967; Ekman, 1978; Thompson and Meltzer, 1964; Wagner et al., 1986), even in a deteriorated state of reproduction (Wallbott, 1991). Indeed its absence may be as potent a signal as its presence, even if a specifically sad expression is not perceived. There is a sense in which smiles function as regulators of interpersonal relationships. They certainly play a striking role in early childhood, as evidenced so clearly in parents' search for the first smile, which seems to be responded to as a developmental milestone akin to the infant's first steps. The early emergence of the smile in the infant and the evidence for the universality of smiling suggest a special importance for this facial display. Our findings are also in keeping with those of Ekman et al. (1969), who reported more consistent findings across literate and preliterate cultures when the subjects were asked to recognise the happy emotion as compared with other so-called 'fundamental' emotions.

In considering these findings it may be useful to use a developmental analogy. Thus we can conceptualise the computer with its classification
programme as a newborn infant with its potential for recognising facial expressions (or configurations). This analogy suggests that the infant is likely to classify faces into three main groups, viz. those which are smiling, those which are not, and those which show a surprised expression. It may be reasonable to speculate that these internal templates are 'hardwired', while the ability to recognise or discriminate other expressions such as fear, disgust and anger requires socialisation. This would be consonant with the findings of Field et al. (1982), who found that neonates could discriminate between happy, sad and surprised expressions.

There may be a number of reasons why the other five expressions considered universal were not as clearly delineated as the happiness or surprise expressions. For example, some investigators have suggested that the analysis of spontaneous expressions may produce a different series of configurational classes than when posed expressions are used, as was the case in this study (Hunt, 1941). Others argue that knowledge of situational cues may also alter these findings (Knudsen and Muzekari, 1983; Russell and Fehr, 1987). However, the search for such explanations assumes that the existence of the 'fundamental' facial expressions has been demonstrated beyond doubt. The literature does not support such a conclusion. For example, although Ekman et al. (1969) have claimed on the basis of transcultural studies that the 'fundamental' expressions are recognised pan-culturally, an examination of their data suggests that only smiling has a strong claim to being regarded as a pan-cultural expression. Thus the data presented in their paper show that the rates of recognition varied from 82% to 99% for happiness, 23% to 88% for fear, 19% to 82% for anger, 33% to 91% for surprise, 29% to 86% for disgust and 26% to 82% for sadness. Furthermore, the non-industrialised cultures had lower recognition scores than the American, Brazilian and Japanese samples.

This study has deliberately adopted a purely numerical approach in which a computer has been used to apply a numerical taxonomic classification programme to data consisting only of relationships between facial landmarks presented as standardised distance measures. The result
was not six classes into which each of the posed 'fundamental' emotions was allocated. Instead the procedure yielded five classes, with one being readily described as a homogeneous grouping, viz. of actors presenting the facial expression of happiness, and a second being distinctive in the sense that such expressions were completely absent. The possibility remains that this lack of congruency with the emotions generally accepted as 'fundamental' reflects their derivation from work with adults capable of giving a verbal response to a stimulus. Our findings may better describe a preverbal stage of facial emotion classification. It would seem important to delineate and study such protoclassifications in order to improve our understanding of the facial emotions and their neurophysiological correlates. In this regard it is interesting that Bowers et al. (1991) have advanced the view that the right hemisphere may contain "a lexicon/representation of facial emotions or at least the hardware for activating these representations" (p. 2603). Our findings would suggest that this lexicon may contain only two representations, i.e., a smiling face or a neutral expression, with the possibility that surprise is a third configuration. From a research point of view, it would seem justified to focus on smiling as the most significant facial expression in most social interactions.

It should be acknowledged that this study has been predicated on the assumption that the so-called 'fundamental' or universal expressions may not be as clearly established as is often assumed. Russell and Bullock (1986) propose that the discrete categories represent a taxonomy of emotion which is 'fuzzy' or unclear, resulting in the judgement of similar facial expressions as equally likely to appear in one category or another. In a series of experiments, Russell and Bullock found that judges had difficulty restricting some facial expressions to only one category. Similarly, proponents of the 'prototype' approach to the classification of the emotions advocate that classes are composed of prototypes, some of which describe a class of objects better than others (Shaver et al., 1987; Russell, 1991). This means that 'gold standards' do not exist at present, and cannot be used as a basis for establishing sensitivity and specificity. Indeed, in this study, we have specifically
avoided any assumptions about classes of expressions, so as to establish whether a completely unbiased observer, i.e. a computer with a classification programme, would detect the expected classes on the basis of configurational properties inherent in certain facial expressions. However, it is possible that the failure to detect the expected range of emotions is simply due to the fact that the numerical data presented to the computer were inadequate or insufficient. It is certainly likely that adults are able to detect facial complexities, not reflected by our measures, when decoding an expression. It is equally possible that infants are less able to detect such complexities and are dependent on the more clear-cut information of the sort tapped by our measures.
References

Bowers, D., Blonder, L.X., Feinberg, T. and Heilman, K.M. (1991) Differential impact of right and left hemisphere lesions on facial emotion and object imagery. Brain 114, 2593-2609.
Darwin, C. (1872) The Expression of the Emotions in Man and Animals. John Murray, London.
Diamond, H. (1856) On the application of photography to the physiognomic and mental phenomena of insanity. Read before the Royal Society, May 22nd.
Drag, R.M. and Shaw, M.E. (1967) Factors influencing the communication of emotional intent by facial expressions. Psychon. Sci. 8, 137-138.
Ekman, P. (1978) Facial expression. In: A.W. Siegman and S. Feldstein (Eds.), Nonverbal Behavior and Communication. Wiley and Sons, New York, pp. 97-116.
Ekman, P. and Friesen, W.V. (1975) Unmasking the Face. Prentice-Hall Inc, New Jersey.
Ekman, P. and Friesen, W.V. (1978) Facial Action Coding System (FACS): a technique for the measurement of facial action. Consulting Psychologists Press, Palo Alto.
Ekman, P., Sorenson, E.R. and Friesen, W.V. (1969) Pan-cultural elements in facial displays of emotion. Science 164, 86-88.
Field, T.M., Woodson, R., Greenberg, R. and Cohen, D. (1982) Discrimination and imitation of facial expressions by neonates. Science 218, 179-181.
Gilman, S.L. (1976) The Face of Madness. Hugh W. Diamond and the Origin of Psychiatric Photography. Brunner Mazel, New York.
Hunt, W.A. (1941) Recent developments in the field of emotion. Psychol. Bull. 38, 249-276.
Katsikitis, M. and Pilowsky, I. (1988) A study of facial expression in Parkinson's disease using a novel microcomputer-based method. J. Neurol. Neurosurg. Psychiat. 51, 362-366.
Katsikitis, M. and Pilowsky, I. (1991) A controlled quantitative study of facial expression in Parkinson's disease and depression. J. Nerv. Ment. Dis. 179, 683-688.
Katsikitis, M., Pilowsky, I. and Innes, J.M. (1990) The quantification of smiling using a microcomputer-based approach. J. Nonverb. Behav. 14, 3-17.
Knudsen, H.R. and Muzekari, L.H. (1983) The effects of verbal statements of context on facial expressions of emotion. J. Nonverb. Behav. 7, 202-212.
Pilowsky, I., Levine, S. and Boulton, D.M. (1969) The classification of depression by numerical taxonomy. Br. J. Psychiatry 115, 937-945.
Pilowsky, I., Thornton, M. and Stokes, B. (1985) A microcomputer-based approach to the quantification of facial expressions. Aust. Phys. Eng. Sci. Med. 8, 70-75.
Pilowsky, I., Thornton, M. and Stokes, B. (1986) Towards the quantification of facial expressions with the use of a mathematical model of the face. In: H.D. Ellis, M.A. Jeeves, F. Newcombe and A. Young (Eds.), Aspects of Face Processing. Martinus Nijhoff Publishers, Dordrecht, pp. 340-348.
Russell, J.A. (1991) In defense of a prototype approach to emotion concepts. J. Pers. Soc. Psychol. 60, 37-47.
Russell, J.A. and Bullock, M. (1986) Fuzzy concepts and the perception of emotion in facial expressions. Soc. Cog. 4, 309-341.
Russell, J.A. and Fehr, B. (1987) Relativity in the perception of emotion in facial expressions. J. Exp. Psychol., Gen. 116, 223-237.
Shaver, P., Schwartz, J., Kirson, D. and O'Connor, C. (1987) Emotion knowledge: Further exploration of a prototype approach. J. Pers. Soc. Psychol. 52, 1061-1086.
Thompson, D.F. and Meltzer, L. (1964) Communication of emotional intent by facial expression. J. Abnorm. Soc. Psychol. 68, 129-135.
Thornton, M. and Pilowsky, I. (1982) Facial expressions can be modelled mathematically. Br. J. Psychiat. 140, 61-63.
Wagner, H.L., Macdonald, C.J. and Manstead, A.S.R. (1986) Communication of individual emotions by spontaneous facial expressions. J. Pers. Soc. Psychol. 50, 737-743.
Wallace, C.S. (1986) An improved program for classification. Paper presented at the Australian Computer Society Conference, Canberra, ACT, Australia.
Wallace, C.S. and Boulton, D.M. (1968) An information measure for classification. Comp. J. 11, 185-194.
Wallbott, H.G. (1991) The robustness of communication of emotion via facial expression: Emotion recognition from photographs with deteriorated pictorial quality. Eur. J. Soc. Psychol. 21, 89-98.