Acta Psychologica 89 (1995) 283-295

Book reviews

Founding fathers erring in the dark

S.C. Masin (Ed.), Foundations of Perceptual Theory. North-Holland, Amsterdam, 1993.

Every once in a while, main actors within a certain field of science join to reflect on the main theories within that field. Whether this act should be applauded depends on the field, the theories, and the actors. Perception theory, entertaining a close link with modern epistemology, has always invited self-reflective activity. Foundational issues are pressing now that the science of perception is being revolutionized by the ascent of new nonlinear systems approaches. The premonitions of a paradigm shift may upset some of the protagonists of the established approach and fill others with enthusiasm. In either case self-reflective activity is fostered. At best this leads to a programmatic statement, at worst to a nostalgic view of how clear everything once seemed to be. Conference proceedings of this kind are therefore expected to be a mixture of facing the challenge and dodging the issue. The many ways of doing the latter featured in this book are to some extent compensated for by the structure of the symposium. The organizers have selected one or two discussants, whose contributions are published along with the articles, together with the author's response. This form of debate, which has come more into fashion these days, in some cases proved helpful in provoking a clear statement from an otherwise defensive author.

Although most excitement in the present case has been caused by (neo)Gibsonians and connectionists, neither is represented in the volume. Although it is clearly the establishment speaking here, a wide variety of voices can be heard. The main watershed in the book is between those whose primary concern is with information processing and those whose interests are in description; the latter is taken broadly, to include those involved in scaling and fundamental measurement. For those concerned with processing, the agenda consists of trying to overcome the limitations of the crude mechanicism that has haunted psychology for many decades.

Although the computational approach started out from the Turing machine, the strongest computational device possible, research programs trying to constrain its operation in the brain fell back on decomposition and direct localization, both primary strategies for mechanistic explanation (Bechtel and Richardson, 1993). The perceptual system was considered decomposable (Simon, 1973), and its faculties were studied independently. Direct localization is the strategy of attributing control over the different component processes to distinct faculties of the brain. These strategies may be insufficient for more complex problems, because of distributed control and behavior emergent from microscale interactions in the system. Mainstream perceptuologists and cognitive neuroscientists often stick to these strategies in the face of complexity, and as a result we are facing a growing stream of increasingly specialized research of ever less significance for the overall issues of perception. This constitutes a crisis in perceptual research, and similarly so in other fields of psychology.

In the face of the limitations of direct localization, we may reject decomposition of the perceptual process altogether, as Gibson (1979) has done (but see Gibson, 1966). Cutting deals with the shortcomings of that approach; Proffitt takes on the more principled issue of how to proceed in the face of the complexity of perception. He argues that instead of simple decomposition we should do justice to the hierarchical nature of the perceptual system, at whose computational level we have to deal with constraints from below (biology) and from above (intention and awareness). The latter is an issue for the descriptive contributions. For these, the common thread through the book is the relation between awareness and the reality of the perceiving organism: its world, its biology and its information processing. This issue is explicitly dealt with in the contributions by Pastore, Bozzi, Masin and Vicario. Townsend and Thomas do so indirectly, by studying the axioms by which the contents of awareness (impressions and judgements) are ordered into scales and spaces. For those dealing with scaling, the issue is how to make clear that scales are not mere descriptions, but impose constraints on process models.

My opinions on this volume are somewhat mixed; the overall tone of the book is nostalgic and the quality of the contributions is rather uneven. Nevertheless, there are some important highlights; I will deal with these in the remainder of this review.

It is often a bad sign if pendulums are swinging at the beginning of a text. A pendulum swings in Uttal's contribution, but it takes long enough before disappointment finally sets in. His contribution begins hopefully enough with a critique of localization approaches in the modern neurosciences. Uttal argues that there are important limitations on the utility of modern localization techniques. He doubts whether they are suited to explaining brain functions such as perception. These techniques are either too microscopic, focusing on the chemical processes of synaptic transmission, or too macroscopic, studying the relationship between large chunks of the brain and behavior. The proper level must be understood in terms of the organization of the neural network. Known research techniques provide no information at this level. (Uttal seems to ignore the recent studies of synchronization of oscillatory brain activity in the visual cortex of cat and monkey, both within small cell assemblies and between cortical areas, by Eckhorn, Singer and their colleagues (Eckhorn et al., 1990; Gray et al., 1990).)
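
The synchronization findings mentioned in passing above can be given a compact, self-contained illustration. The sketch below is my own and is not taken from the book or from the cited studies; it is a Kuramoto-style toy model (the parameter values are arbitrary) in which oscillators with slightly different natural frequencies phase-lock once their mutual coupling is strong enough, the kind of assembly-level behavior that sits between the synaptic and the macroscopic levels of description.

import math
import random

def coherence(coupling, n=50, dt=0.01, steps=2000, seed=1):
    """Simulate n coupled phase oscillators; return the order parameter r in [0, 1]."""
    rng = random.Random(seed)
    freqs = [1.0 + 0.2 * rng.gauss(0.0, 1.0) for _ in range(n)]   # natural frequencies
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n)]  # random initial phases
    for _ in range(steps):
        updated = []
        for i in range(n):
            # each oscillator is pulled towards the phases of all the others
            pull = sum(math.sin(phases[j] - phases[i]) for j in range(n)) / n
            updated.append(phases[i] + dt * (freqs[i] + coupling * pull))
        phases = updated
    re = sum(math.cos(p) for p in phases) / n
    im = sum(math.sin(p) for p in phases) / n
    return math.hypot(re, im)   # r near 1 means the assembly oscillates as a whole

print("weak coupling:   r =", round(coherence(0.1), 2))  # remains largely incoherent
print("strong coupling: r =", round(coherence(2.0), 2))  # near-complete phase locking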

Uttal concludes that we may therefore have to settle for descriptive approximation. So far, so good; new descriptions could set up a new coordinate system in which to transform our understanding of the domain and set the stage for a new theory. There is a wealth of plausible options for such an approach. Uttal lists a number of promising mathematical techniques that have become available: chaos theory (Townsend, 1990; Skarda and Freeman, 1987), quantum mechanics (Bennett et al., 1989) and wave theory (Link, 1992). Some of these approaches may indeed promise a multilevel account of the perceptual system along dynamic principles, maintaining the link with biological phenomena and at the same time taking perceptual awareness as real. Biology enters the hierarchy from below, as constraints at the microlevel; states of perceptual awareness constitute the molar level of description. Influence from above, such as the contents of awareness, intentions and perceptual strategies, could thus be accommodated in such a research program as macroscopic order parameters of the system, which in turn influence the system through reentrant connections. Most importantly, this can be done without any reference to a mysterious ghost in the machine.

But at this point Uttal makes a strange retreat. Instead of welcoming these new possibilities, he concludes from his pessimistic evaluation of current neuroscience that the pendulum swings back to behaviorism. Not that, for Uttal, mediating concepts between stimulation and overt behavior should be eliminated; but measurement of overt behavior is the only accepted criterion for evaluating each individual intervening concept (pp. 31-32). It is unclear whether he would be prepared to save the notion of perceptual awareness in some form. He might accept a definition that comes close to behaviorism, such as one in terms of behavioral dispositions. Such a concept of awareness is not necessarily private; it could be specified in terms of a Gibsonian affordance, or any other respectable theory of function. A disposition, however, may or may not lead to manifest behavior. I take it, therefore, that Uttal would choose to forgo such a concept. Uttal makes explicit that he rejects introspective methods. He may thereby have failed to distinguish between introspection and perceptual awareness. In doing so, he operates in the tradition of classical logical empiricism, which recognizes only sensory awareness and cognitive reflection. But one would expect to find hardly any support for this position today.

For a science thus deprived of evidence from above and from below, it is clear that Uttal can claim only a modest role. Only in some instances can one find constraints from behavior that eliminate some elements of the description in favour of more behaviorally consistent ones. Uttal's restrictive attitude about what evidence to accept leads him, like so many before him, onto the path of speculation. Later on, when one of the discussants, Daniel Robinson, points out to him that, given his emphasis on the description of mental events, he is in fact a cryptophenomenologist, he readily accepts this epithet.

Tsotsos arrives at a position similar to Uttal's, though not through a diagnosis of the shortcomings of neurological investigation, but through reflecting on the inability of computational models to solve "hard" problems in reasonable amounts of time. Hard problems require exponential time and/or storage capacity to be solved by brute force. Such computationally hard problems are pervasive in perception: visual search for an unknown target is just one example. Tsotsos notes a parallel with evolution, which does not provide solutions that are optimal in the worst case, but ones that work just well enough to survive under normal circumstances. Evolution could be said to have fostered the development of similar heuristics in perception, which makes this approach biologically plausible. We are therefore left with a collection of special-case solutions to perceptual problems. We seem, thus, to be on our way to the promised land of direct localization. For the myriad perceptual problems, evolution has developed myriad tricks; given this wealth, perceptuologists interested in decomposition and localization will be certain ultimately to find whatever they are looking for. Even to Tsotsos, such a bag of tricks seems more like Pandora's box. He therefore wonders whether some descriptive theory could tie up this bag again. But a pendulum similar to the previous author's seems to be at work in Tsotsos' mind; the forward swing, reached by the impetus of his ideas, is followed by a swing backwards, ending in behaviorism. My guess is that he might instead have preferred Simon's (1979) computational realizability constraints. Simon's satisficing principle (process cost optimization) may even survive a paradigm shift to the dynamic systems approach, when conceived as approaching the nearest local minimum.

The harvest of these debates is that localization strategies alone simply will not do in dealing with issues of perception. It was noticed that problems of localization can be viewed, more generally, as problems of mechanistic explanation in perception theory. This point is taken up by Proffitt. He argues that we should treat perceptual systems, like other biological systems, as control hierarchies (Pattee, 1973). This implies that the functioning of the system at a certain level (e.g. the firing of a neuron) should be treated as subject to initial conditions (from below), such as the energies that innervate the system, and to boundary conditions (from above). This distinction delimits the role of mechanistic explanation, according to Proffitt. Whereas initial conditions can be treated ahistorically and mechanically, boundary conditions are specified by the function of the system. In other words, physical energy enters the system from below; history from above. Perceptual science is criticized for being ahistorical: it focuses on components. But, in analogy with a piano, we may specify what it could do by studying its components, yet not what its proper function is: playing music. Gibson's ecological approach to perception is, by contrast, interested in the boundary conditions. But this view is criticized for being nonhierarchical, i.e. uninformed by constraints of scale, neither from the intentional above nor from the biological below. This criticism strikes me as unjustified, at least for the neo-Gibsonian approach. Whatever else one may think of it, it did explicitly introduce cross-scale constraints into its theories (e.g. Kugler and Shaw, 1990).
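
The division of labour that Proffitt borrows from Pattee can be made concrete in a few lines. The sketch below is my own and is not taken from the book: a low-level mechanism (a leaky accumulator) is driven from below by input energies, while a separate functional level imposes its boundary conditions from above. The task names and parameter values are hypothetical; the point is only that the same mechanism does different work under different functional constraints.

def low_level_response(inputs, gain, threshold):
    """An ahistorical mechanism: a leaky accumulator driven by input energy."""
    state, responses = 0.0, 0
    for energy in inputs:
        state = 0.9 * state + gain * energy   # initial/driving conditions from below
        if state > threshold:                 # boundary condition imposed from above
            responses += 1
            state = 0.0
    return responses

def functional_level(task):
    """A historical/functional level: it chooses boundary conditions to fit the task."""
    # hypothetical settings; they are not derivable from the mechanism itself,
    # but from what the system is supposed to be for
    return {"detect_faint_signals": (2.0, 0.5), "ignore_noise": (0.5, 2.0)}[task]

signal = [0.1, 0.3, 0.2, 0.8, 0.1, 0.9, 0.2]
for task in ("detect_faint_signals", "ignore_noise"):
    gain, threshold = functional_level(task)
    print(task, "->", low_level_response(signal, gain, threshold), "responses")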

According to Proffitt, ahistorical and historical approaches to the perceptual system are complementary. Hatfield, however, points out that a mechanistic explanation already tacitly presupposes a functional decomposition. Thus the priority is with constraints from above; these should be made explicit, preferably in terms of evolution theory, before one can embark on the study of the mechanism. The problem of perception theory is that this priority has been reversed and crude mechanistic explanations have become dominant. Proffitt counters that instead of priority of either view there may be convergence. That there are reasons to doubt this can be taken from Cutting's contribution.

Cutting's treatise on Gibsonian ecological realism centers on the duality of biological and cultural ecologies, and on the notions of invariant and one-to-one mapping. All the arguments listed in this paper have been presented by the same author elsewhere, so it may be felt needless to reiterate these issues. Needless, perhaps, if only there were a tendency on the part of the ecologists to take these issues seriously. It is crucial for Gibsonian ecological realism that there is no fundamental distinction between biological and cultural ecologies. According to Cutting, however, this position is a form of wishful thinking. From this, however, even Proffitt might seem to suffer: if the cultural and biological notions do not already merge within the Gibsonian framework, it is unclear how they should once we first restrict Gibsonianism to cultural events and then try to merge it with biological constraints conceived in a manner that does not even claim affinity with cultural events. Thus what started promisingly in Proffitt's account, hierarchical theory, seems to end in sheer eclecticism and duality.

One of the merits of the authors mentioned so far is that they have seriously taken up the challenge of discussing the foundations of perceptual theory. The other contributions are predominantly self-reviews. As such, some of these are of outstanding quality and would have done very well in a handbook of perception, but one sometimes wonders what they are doing in a volume like this. Gogel reviews his own widely appreciated work, showing that a logic of perception is not needed to explain veridical and illusory perception of space and motion. In his highly elegant account he relies on a phenomenal geometry. He argues that the perceptual system "computes" a variety of spatiotemporal percepts by making use of only three parameters of the perceived spatial layout: perceived direction, perceived distance and perceived head motion. These parameters are estimated by making use of cues in the environment. Only in the last five pages is the relevance of this work for the foundations of perceptual theory touched upon. This section is disappointing, insofar as it reiterates the earlier conclusion that a logic of perception is obsolete. Moreover, whether mental algebra or inductive logic is needed to explain perceptual processes is an internal debate among representationalists. Given that, in the current debate, representationalism itself has come under fire, a more principled defence would be required. Any account based on cues (like Gogel's) has to face the challenge put up by the Gibsonians: not space, depth and cues but layout, surfaces and the optical array. According to Gibsonian orthodoxy, these provide immediate specification of the relevant information, rather than cues being used to parametrize some sort of mental algebra.
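
To give a flavour of what a phenomenal geometry with only three parameters amounts to, the toy sketch below combines perceived direction change, perceived distance and perceived head displacement into a percept of object motion, using a small-angle approximation. It is my own illustration and not Gogel's actual equations; the numerical values are hypothetical.

def perceived_object_motion(direction_change_rad, perceived_distance, head_displacement):
    """Perceived lateral displacement of an object, in the same units as distance.

    Small-angle geometry: lateral position is roughly perceived distance times
    perceived direction plus the position of the head, so perceived changes in
    direction and in head position jointly determine the perceived displacement.
    All three arguments are perceived values, however they were derived from cues.
    """
    return perceived_distance * direction_change_rad + head_displacement

# A stationary object whose distance (2 m) is perceived correctly: a 0.1 m head
# movement to the right shifts its direction by about -0.05 rad, and the computed
# object motion is zero, i.e. the percept is veridical.
print(perceived_object_motion(-0.05, 2.0, 0.1))   # 0.0: no perceived motion

# If the distance of the same object is misperceived as 4 m, the identical inputs
# now yield an illusory leftward displacement, the kind of error such a framework
# is designed to capture.
print(perceived_object_motion(-0.05, 4.0, 0.1))   # -0.1: illusory motion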

A debate centered on these issues would have brought up the questions of what constitutes perceptual information, and of the role of awareness, experience, intention and learning. Instead, a phenomenal world is simply posited, for which, as noticed by Burigana, internal consistency has substituted for evidence. The really important issue in this context, the status of the phenomenal world in relation to the physical one, is not raised.

Townsend and Thomas provide a tour d'horizon of the extensive and impressive work on scaling. They propose an approach in which foundational measurement theory, which deals with assumptions about the properties of scales, is applied to multidimensional scaling techniques. Such an approach would yield important insights into the properties of the similarity spaces used in making perceptual classifications. For the authors, the real challenge, however, is how the results obtained from these techniques could be wedded to process models. Whereas sophisticated scaling techniques often use non-Euclidean metrics, or less-than-metric spaces, process models use real-valued functions in usually Euclidean metric spaces. They judge this a failure of the latter to reflect on the nature of the information represented, which may in reality be more "fuzzy" than assumed by the model. Measurement theory could be of assistance here. Townsend and Thomas observe that there is a gap between the scaling and processing approaches. But unfortunately the interesting question of how it could be bridged is only briefly touched upon (p. 338). An option suggested is to "fuzzify" the behavior of a model by adding noise. The authors do not mention the possibility of using chaos and nonlinearity as an intrinsic source of variability.

Ashby and Lee launch an attack on the fundamental nature of the notion of similarity. In their view, the more fundamental notion is categorization. In the face of the intrinsic variability in the information process, all identification responses also fall under the scope of categorization processes. This basic axiom constrains perceptual theorizing, because it compromises any theory positing different mechanisms for categorization and identification. Theorems of optimal categorization lead the authors to predict sensitivity to covariance structure within categories. It turns out to be optimal to have sets of negatively correlated detectors, and a way to achieve this is through lateral inhibition between detectors. Whereas the authors consider these arguments in favour of the plausibility of their categorization models, they expect rivalling similarity models to lead to unfavourable consequences: "One way, and perhaps the only way, for similarity models to exhibit sensitivity to covariance structure is to assume that the subject compares the percept of the presented stimulus to the entire distribution of percepts that have been associated with each stimulus alternative ... a simple argument of parsimony must cause us to seriously question (...) the validity (of this assumption)" (pp. 337-338). We might agree with Ennis, however, who notes the possibility, in favour of similarity models, that a random sample of this distribution is taken instead. Unfortunately, Ashby and Lee pay no attention to this remark in their rejoinder. So one keeps wondering whether to find their arguments convincing.
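
The claim above that lateral inhibition yields negatively correlated detectors is easy to check numerically. The sketch below is my own and is not Ashby and Lee's actual model: two detectors receive the same signal plus independent noise, and a single step of mutual inhibition (with an arbitrary strength parameter) is enough to drive the correlation of their trial-by-trial outputs strongly negative.

import random

def correlation(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return cov / (vx * vy) ** 0.5

def detector_correlation(inhibition, trials=20000, seed=7):
    rng = random.Random(seed)
    out1, out2 = [], []
    for _ in range(trials):
        a = 1.0 + rng.gauss(0.0, 0.3)    # detector 1: shared signal plus private noise
        b = 1.0 + rng.gauss(0.0, 0.3)    # detector 2: shared signal plus private noise
        out1.append(a - inhibition * b)  # each channel is suppressed by the other
        out2.append(b - inhibition * a)
    return correlation(out1, out2)

print("no inhibition:   r =", round(detector_correlation(0.0), 2))  # close to 0
print("with inhibition: r =", round(detector_correlation(0.8), 2))  # close to -1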

Neither Townsend and Thomas nor Ashby and Lee, who focus on linear systems, show much concern about the role of nonlinearity in perception. On the whole their emphasis is on detection in sensory processes, for which linear models are likely to provide a relatively good approximation. Near-equilibrium systems are well approximated by linear ones, but far-from-equilibrium systems are not. These are characterized by states of intermittency and chaos. Skarda and Freeman (1987) have shown that the olfactory bulb, in preparation for a stimulus, enters a chaotic regime. This is characteristic of a nonlinear system for which the transition from chaos to order is hierarchically controlled. This transition plays an essential role in the pattern recognition process. In general, if hierarchical control is assumed in a nonlinear dynamical system, it is unlikely that the long-term behavior of the system is stable. Otherwise, all activity would come to rest. The system may be driven from an equilibrium by a change in environmental circumstances, but also through an intrinsic instability, such as a shift in intention. This may even occur when neither stimulus nor intention changes, as studies of perceptual multistability (e.g. Peterson and Hochberg, 1983) demonstrate.

To draw up a final balance for this book: on the positive side, it offers scholarly reviews by some of the major actors in the field of perception. The outstanding importance of the contributors arouses one's curiosity about their views on the foundational issues of perception theory. Unfortunately, only some of them have taken up this challenge. Where one might have expected programmatic statements in the face of a paradigm shift, the attitude of the majority of the contributions is defensive, and sometimes even slightly stale. The question could be raised why a volume like this should be published under the title Foundations of Perceptual Theory. What I missed is an integrative statement to wrap it all up. Given that even what is good about the book is not very new, I felt inclined to paraphrase Occam: books should not be multiplied beyond necessity.

Cees van Leeuwen
Faculteit Psychologie
Universiteit van Amsterdam
Roetersstraat 15
1018 WB Amsterdam
The Netherlands
E-mail: [email protected]

References

Bechtel, W. and R.C. Richardson, 1993. Discovering complexity. Princeton, NJ: Princeton University Press.
Bennett, B.M., D.D. Hoffman and C. Prakash, 1989. Observer mechanics. San Diego, CA: Academic Press.
Eckhorn, R., H.J. Reitboeck, M. Arndt and P. Dicke, 1990. Feature linking via synchronization among distributed assemblies: Simulations of results from cat visual cortex. Neural Computation 2, 293-307.

Gibson, J.J., 1966. The senses considered as perceptual systems. Boston, MA: Houghton Mifflin.
Gibson, J.J., 1979. The ecological approach to visual perception. Boston, MA: Houghton Mifflin.
Gray, Ch.M., A.K. Engel and W. Singer, 1990. Stimulus-dependent neuronal oscillations in cat visual cortex: Receptive field properties and feature dependence. European Journal of Neuroscience 2, 607-619.
Kugler, P.N. and R.E. Shaw, 1990. 'Symmetry and symmetry-breaking in thermodynamic and epistemic engines: A coupling of first and second laws'. In: H. Haken and M. Stadler (Eds.), Synergetics of cognition (pp. 296-331). Berlin: Springer-Verlag.
Link, S.W., 1992. The wave theory of difference and similarity. Hillsdale, NJ: Erlbaum.
Pattee, H.H., 1973. 'The physical basis and origin of hierarchical control'. In: H.H. Pattee (Ed.), Hierarchy theory: The challenge of complex systems (pp. 71-108). New York: Braziller.
Peterson, M.A. and J. Hochberg, 1983. Opposed-set measurement procedure: A quantitative analysis of the role of local cues and intention in form perception. Journal of Experimental Psychology: Human Perception and Performance 9, 183-193.
Simon, H.A., 1973. 'The organization of complex systems'. In: H.H. Pattee (Ed.), Hierarchy theory: The challenge of complex systems (pp. 1-27). New York: Braziller.
Simon, H.A., 1979. Models of thought. New Haven, CT: Yale University Press.
Skarda, C.A. and W.J. Freeman, 1987. How brains make chaos in order to make sense of the world. Behavioral and Brain Sciences 10, 161-195.
Townsend, J.T., 1990. 'Chaos theory: A brief tutorial and discussion'. In: A.F. Healy, S.M. Kosslyn and R.M. Shiffrin (Eds.), From learning processes to connectionist theory: Essays in honor of William K. Estes (Vol. 1, pp. 65-96). Hillsdale, NJ: Erlbaum.

J.W. Payne, J.R. Bettman and E.J. Johnson, The Adaptive Decision Maker. Cambridge University Press, Cambridge, 1993.

As a reaction against inferring decision processes from decision outcomes, process-tracing research aims at registering behavioural responses during the decision process itself, in order to gain more insight into the cognitive mechanisms underlying decision-making behaviour. Payne, Bettman and Johnson (henceforth referred to as PBJ) have made a major contribution to this line of research, which is well illustrated by the book under consideration.

Traditional decision-making research mainly compared decision-making behaviour to normative models, such as Bayes' theorem. This has generally led to the conclusion that people are rather poor decision makers. However, as argued by Anderson (1990), even though decision behaviour can be considered irrational with regard to some normative (statistical) outcome, it can be optimal in terms of achieving the decision maker's goals (such as saving maximum time). Furthermore, such normative models generally neglect computational costs. The main contribution of PBJ to decision-making research is that they have included the effort required to implement a decision strategy in their analyses of decision behaviour. More specifically, they assume that decision makers aim at accurate decisions while investing as little cognitive effort as possible.

In the introductory chapters PBJ set out to describe decision making as a highly contingent form of information processing, as indicated by many process-tracing studies. In order to make a choice between several alternatives, people can select from a set of decision strategies such as, for example, a weighted additive rule or