J. theor. Biol. (2001) 211, 377-391 doi:10.1006/jtbi.2001.2355, available online at http://www.idealibrary.com

The Sense of Consciousness

ARNOLD S. TANNENBAUM*

Department of Psychology and Institute for Social Research, The University of Michigan, Ann Arbor, MI 48109, U.S.A.

(Received on 1 February 2000, Accepted in revised form on 11 May 2001)

* E-mail: [email protected]

I propose that consciousness might be understood as the property of a system that functions as a sense in the biological meaning of that term. The theory assumes that, as a complex system, the sense of consciousness is not a fixed structure but implies structure with variations and that it evolved, as many new functions do, through the integration of simpler systems. The recognized exteroceptive and enteroceptive senses provide information about the organism's environment and about the organism itself that are important to adaptation. The sense of consciousness provides information about the brain and thus about the organism and its environment. It senses other senses and processes in the brain, selecting and relating components into a form that "makes sense", where making sense is defined as being useful to the organism in its adaptation to the environment. The theory argues that this highly adaptive organizing function evolved with the growing complexity of the brain and that it might have helped resolve discrepancies created at earlier stages. Neural energies in the brain that are the input to the sense of consciousness, along with the processing subsystem of which they are a part, constitute the base of consciousness. Consciousness itself is an emergent effect of an organizing process achieved through the sense of consciousness. The sense of consciousness thus serves an organizing function although it is not the only means of organization in the brain. Its uniqueness lies in the character of the organization it creates with consciousness as a property of that organization. The paper relates the theory to several general conceptions (interactionism, epiphenomenalism and identity theory) and illustrates a number of testable hypotheses. Viewing consciousness as a property of a sense provides a degree of conceptual integration. Much of what we know about the evolution and role of the conventionally recognized senses should help us understand the evolution and role of the sense of consciousness, and of consciousness itself.

© 2001 Academic Press

1. Introduction

Cognitive scientists view consciousness as serving an information processing function although most information processing occurs unconsciously. Nonetheless, in this view, both the conscious and unconscious systems serve a common function and, to that extent, their roles overlap. This functional overlap suggests also a structural
parallelism; the two systems coexist "side by side" in the brain and they are probably constructed in much the same way except, of course, one does something that the other does not (see, e.g. Crick & Koch, 1998).

The parallelism and functional similarity of the conscious and unconscious subsystems illustrate a general feature of complex biological systems. From its inception, as it divides from one cell into two, a zygote's growing complexity is characterized by parallel functional overlap. The brain
itself, with its substantial parallel and reciprocal connections, is notable for such complexity although this form of complexity is by no means confined to the brain. Subsystems that serve the same or similar functions have adaptive advantages. They provide redundancy through which one subsystem can supplement the other or serve as a backup in case of malfunction. One subsystem might also serve as a control or check on the other as, for example, one eye or ear confirming the other. Parallel subsystems can enhance the organism's flexibility by providing alternative paths to an outcome and, perhaps most importantly from the point of view of mental development, functionally overlapping subsystems can be part of an emergent, synergistic synthesis when the output of one differs from or conflicts with that of another. The emergence of depth perception (stereopsis) through the synthesis of two somewhat discrepant images from the eyes illustrates this synergistic process.

The five classically established senses (vision, audition, olfaction, touch and taste) are a small and in some ways misleading sample of the human senses. Numerous senses pervade the organism. Some, the kinesthetic, lead to sensations that are referred to as "deep sensibility" (Rose & Mountcastle, 1959). Some senses with receptors in the cardiovascular system, the viscera, muscles and tendons function outside of consciousness and were understandably overlooked in traditional treatments of the subject. Indeed, the classical senses themselves often function outside of consciousness, contrary to traditional conceptions. Furthermore, some senses are an integration of other senses, thus adding new senses on top of the old. The stereoptic sense, for example, is an integration of the monocular senses just as the monocular senses are an integration of the more elementary sensory units of rods and cones and their connected afferent neurons. In addition, the number and variety of sensory systems is accompanied by a variety of receptor mechanisms not initially envisioned. Sensing need not depend on prominent organs or on complex neural formations or appendages; bare nerve endings serve as common receptors (Malinovsky, 1987).

Senses monitor vital organs as part of the means of maintaining the organs' proper operations
and there is no reason to think that the vital and complex activities of the brain are excluded from control processes that entail sensing. In fact, the brain monitors and controls many aspects of its own operations (e.g. Churchland, 1988). The brain senses itself, but sensing need not be conscious. Where does the consciousness come from?

I approach this question in the next section by proposing a biological sense of consciousness. I consider how this sense might have evolved and how it might be situated in the brain as a complex "structure with variations". Section 3 considers the emergence of consciousness as a property of this structure and Section 4 suggests some of the neural mechanisms that might contribute to this emergence and to the interaction between consciousness and objectively observable aspects of the brain. Section 5 illustrates features of the sense of consciousness that are common to sensory systems and that help to identify the sense of consciousness as a sense among others. The work of ethologists suggests that the sense of consciousness could have evolved along with other senses in non-human species. The commonalities among sensing systems, biological and nonbiological, provide a context for understanding impressive differences between the senses. The uniqueness of consciousness is considered in this context. Section 6 illustrates a number of testable hypotheses suggested by the theory and Section 7 considers some of its philosophic implications.

2. The Sense of Consciousness

James (1890) referred to consciousness as an "organ" that evolved like other organs. Consciousness, in his view, is a "selecting agency" providing order to a complex and otherwise unpredictable brain by selecting only some of the very many possibilities offered by the brain (pp. 138-139; 288-290). I think it can be helpful, as a modification of James' notion, to understand consciousness as an emergent property of a system that functions as a sense although not through a fixed, invariant structure.

2.1. EVOLUTION OF THE SENSE OF CONSCIOUSNESS

I assume that a new sensory function might be created in either of two ways. One is through the
mutational addition of a new cell or tissue sensitive to a form of stimulation to which the organism was heretofore insensitive. The first cells that responded to light very early in evolutionary history, and that led ultimately to the evolution of complex eyes, illustrate this form of development. The second form of development is through the synthesis of inputs from a number of receptors that are not themselves new types of receptors. The mutation in this case is not a new receptor but rather a new integrating structure. The sense of visual depth illustrates this form of development. It first appeared when an organism developed the capacity to synthesize into a unique informational pattern visual inputs from like vision receptors that were somewhat discrepant in their outputs. Stereopsis was not a simple addition or averaging of inputs from the eyes (any more than monocular vision is a simple addition of inputs from the receptor rods and cones) but was qualitatively quite distinct. It was a new and emergent sense. Nor did its occurrence incapacitate the monocular senses, the ordinary effects of which might become apparent when one eye is closed or moved out of alignment with the other. The stereoptic sense, rather, sensed and took as input the product of the more elementary and still-functioning monocular senses. It incorporated the monocular senses. According to Lorenz (1973), many complex functions evolved through such an integration of existing simpler systems that "continue to function as essential components in the new reality" (p. 43). All complex senses probably emerged in this way as did, I suggest, the sense of consciousness.

The primitive, prototypic nervous system, a simple neural chain, proved in its day to be a highly effective survival tool. Successful mutations early in evolutionary history added parallel and interacting neural subsystems, all without consciousness. A contemporary outcome of this evolutionary process, the human nervous system, is composed of a multiplicity of such subsystems that function without consciousness as they did in our early phylogenetic ancestors, except that the sensing by one subsystem of another within the brain is a sensing, not of raw data like light waves striking a retina, but of processed information including synthetic products of simpler sensory data as, for example, the product illustrated
by stereopsis. In addition, I assume that the informational product of the sensing must differ from the sensed pattern in at least a couple of ways. First, it must fall, if ever so slightly, temporally behind the energy flow that is its source in the brain since its processing is delayed by its downstream distance from the source. It is therefore "out of step" with, and like an echo of, its source in the brain. Second, being a product of a sensing it will not replicate the source precisely; it will necessarily differ from the source in ways dictated by the character of the sensing mechanism. We thus have a discrepancy between the information in the central source and the information in a parallel central information processing subsystem. The systems are therefore in some degree discrepant; one information processing system is "telling" the organism one thing and the other system is "telling" it something else, creating a state of ambiguity. The organism, after all, although without consciousness, relies on this information in its adjustment to the environment.

The survival of such conflicts or ambiguities, I assume, lies partly in their genetic association with other characteristics that contribute to survival such that the net effect of the genetic package of effects is advantageous. In the present instance the conflict is a consequence of the complexity implicit in parallel functional overlap that might introduce compensating benefits (supplementation, backup, flexibility, etc.). In fact, the effects of the conflict might even prove advantageous since a contradiction between components within one framework or structure need not be a contradiction within another and, in general, complex systems might be capable of resolving some contradictions that are irreconcilable within simpler structures. Thus, given time and the trials and successes of the evolutionary process, some organisms might have acquired the capacity to cope with, and even to exploit, the contradictory implications associated with complexity.

In the course of time and of mutations, structures might have evolved that were capable of integrating and thus unifying heretofore contradictory components. Early in the evolutionary history of vision, for example, a cell or set of cells sensitive to light informed an organism about the presence of light-reflecting or generating objects in its environment. A second cell or set might
have then evolved that served the same function, but from a different angle, resulting in two discrepant information flows. Ultimately the organism evolved a complexity that enabled it to synthesize the discrepant informational inputs, a synthesis that not only allayed the ambiguity of the conflicting information but created an additional and entirely new sense, the sense of visual depth. The new and different sense did not grow out of a new and different sensor. Rather it emerged as a synergistic synthesis when the output of one old-fashioned sensor was integrated functionally with that of another old-fashioned sensor. It emerged as a product of parallel functional overlap. And it was probably not accompanied at an early stage by consciousness.
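
To make concrete the claim that the synthesis of two discrepant monocular inputs yields information that neither input contains alone, the following sketch uses textbook pinhole-stereo geometry (Z = fB/d). The baseline, focal length and retinal positions are illustrative assumptions of mine, not values given in this paper.

    def depth_from_disparity(disparity_m, baseline_m=0.065, focal_length_m=0.017):
        """Textbook pinhole-stereo relation Z = f * B / d, with roughly
        human-scale constants (about a 6.5 cm interocular baseline and a
        17 mm effective focal length); all values are illustrative."""
        return focal_length_m * baseline_m / disparity_m

    # The same object registered at slightly different positions on the two "retinas":
    left_pos, right_pos = 0.00110, 0.00095   # metres
    print(depth_from_disparity(left_pos - right_pos))   # about 7.4 m

Each monocular measurement alone fixes only a direction; the small discrepancy between the two fixes a depth, which is the sense in which the stereoptic synthesis is informative rather than merely additive.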

The sense of consciousness might have emerged in this way through the evolution of neural structures that synthesized discrepant energetic inputs from the brain, except that the synthesis in this case was more complex than that of some of the earlier developed senses. Given the complexity of the modern human brain with its massively parallel and reciprocal connections, the sensing within the brain is reiterated and its synthesized effects compounded. Edelman (1989), for example, describes cortical maps located on sheets in the brain, with points corresponding to receptor points on the organism (e.g. the skin or retina). Large numbers of such maps signal mutually and recursively through a dense set of neural fibers, a process that Edelman refers to as reentry. Parallel connections also imply simultaneous processes in addition to the recursive, viz., parallel neural systems simultaneously sensing a source in the brain, each from a particular vantage point just as each of two eyes senses an object from a different angle. In the brain, however, the analogy is of a multiplicity of eyes sensing from a variety of angles. The evolving neural mechanisms that achieved this dynamic, multifaceted mapping are, I suggest, the foundation of the sense of consciousness.

The evolving sense, however, confronted in the brain a complicated environment that required organization to be useful. Evolution might have provided several mechanisms to create a more orderly system. Some degree of organization might have been achieved simply through a canceling of "errors" whereby the large number of inputs, each of which diverged from a central tendency, achieved some reliability when summed. A second mechanism, I assume, was more synergistic; it depended on a complexity such that patterns, complicated and incoherent within a simple information processing system, were accommodated coherently within the more complex and "intelligent" framework where discrepant informational inputs, such as those from two eyes, could be synthesized in ways that enhanced their adaptive value. A third mechanism, I surmise, contributed to the capacity of the evolving sense of consciousness to bring order to the brain. By chance, and in steps, the evolving sense acquired the capacity to scan, as did some other senses, except that the scanning was executed through a pervasive network of neurons. Neurons within the network were activated selectively through a switching mechanism rather than through gross physical movement, a process that might be achieved partly through the thalamus as an organ of attention. Crick (1984), for example, suggests a "searchlight hypothesis" that implies a scanning and organizing function served by the thalamus. Reentry, described by Edelman (1989), in which new selectional properties emerge as the process moves recursively forward, also suggests a tracking and organizing function. Electromagnetic oscillatory activity synchronized between neural groups might also play a role, including (a) a temporary binding of spatially disparate neural groups into an information processing unit, (b) a gating that enables the rerouting of information, and (c) a matching of information optimizing transfer between networks (Lopes da Silva, 1996). These mechanisms might be understood as possible components of the sense of consciousness that contribute to its scanning and organizing action.
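
A minimal numerical sketch of the first of these mechanisms, the canceling of errors: averaging many noisy inputs that share a common source reduces the error of the combined estimate roughly as one over the square root of the number of inputs. The Gaussian noise model and the particular numbers are illustrative assumptions, not part of the theory.

    import random
    import statistics

    def mean_error(n_inputs, true_value=1.0, noise_sd=0.5, trials=2000):
        """Average absolute error of the mean of n noisy readings of the
        same underlying value (toy model of error canceling by summation)."""
        errors = []
        for _ in range(trials):
            readings = [random.gauss(true_value, noise_sd) for _ in range(n_inputs)]
            errors.append(abs(statistics.mean(readings) - true_value))
        return statistics.mean(errors)

    for n in (1, 4, 16, 64):
        print(n, round(mean_error(n), 3))   # error shrinks roughly as 1/sqrt(n)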

Thus the sense of consciousness might have acquired the capacity to scan and organize its environment into useful patterns, just as some other senses acquired the capacity to scan and to organize their environments.

Scanning by the senses is directed by conditions both within the organism and within the environment. A predatory animal directed by internal conditions of hunger scans the environment for the scent, sound or sight of prey; a rutting animal scans for the scent, sound or sight of a mate. Once detected, the senses focus and "lock in" as the animal concentrates effort to track its quarry. I assume that scanning by the sense of consciousness is directed similarly by conditions within the organism and its environment except that the brain itself, as an environment in its own right, calls attention to conditions within it. These internal conditions, along with others in the organism and in the larger environment, direct the tracking process as the sense of consciousness pursues its quarry. The sense of consciousness in the predator, for example, directed by a state of hunger, scans for and "locks into" information about the scent, sound or sight of prey, which in turn tracks to information about a site where the predator encountered prey in the past, a site that, in the case of the human animal, might be represented in the cortex of the temporal lobes and its deeper structures where information about past events is retained (e.g. Ingvar, 1994). Such internal sensing is an alternative to the sensing in the animal's larger environment and, in directing the animal to the site, it avoids the time, effort and danger of an extended prowl. The functional overlap of the sense of consciousness with other senses illustrates the supplementation and partial backup that overlap might provide. It illustrates also the control function as the internal and external senses check one another while the animal proceeds carefully to the site.

The exteroceptive and enteroceptive senses provide information about the organism's environment and about the organism itself that are important to adaptation. The sense of consciousness provides information about the brain and thus about the organism and its environment. It senses other senses and processes in the brain and provides information that is more highly organized than that provided directly by the other senses, selecting and relating components into a form that "makes sense", where making sense is defined as being useful to the organism in its adaptation to the environment. This highly adaptive organizing function, I assume, evolved with the growing complexity of the brain and it might have resolved discrepancies created at earlier stages. At one early stage, for example, the brain might have acquired the capacity to represent
the shape of an environmental object. At a subsequent stage it might have acquired the capacity, located separately in the brain, to represent color. Each of these representations provided information that might have been useful to an organism, but eventually the brain acquired the capacity to connect the separate components, creating a "corrected", more veridical, and more "sensible", unitary representation of a colored object in the environment. Some of these connections, according to Crick (1984), might be expressed by cell assemblies that were partly or wholly established genetically or through prolonged learning, but others are temporary constructions achieved through the thalamic attentional mechanism. Consciousness, I suggest, is an emergent, synergistic result built on such feats of organization.

Studies of brain functioning suggest locations that might play host to the sense of consciousness and that could provide the components of its changing structure. The sense of consciousness might have its roots in the brain stem and thalamus. Newman (1995) notes the many reciprocal connections and the highly iterative information flow that is likely between thalamus and virtually every region of the cortex, and he presents a variety of research findings showing the brain stem and thalamus to be closely linked with processes in the cortex that mediate attention and consciousness. The location and structure of the sense of consciousness, however, change as it incorporates and integrates with the neural systems being sensed. Furthermore, the sense interfaces with a brain that conforms to a complicated architecture. Informational elements in the brain that represent contiguous or continuous features in the organism's environment, for example, need not be connected seamlessly in the brain. Nor are the processes in the brain that yield a unified conscious experience themselves necessarily unified (e.g. Sperry, 1984). Given the structure of the brain, the path tracked by the sense of consciousness, and thus the sense of consciousness itself, might be conceived as a set of dispersed patches or groups of neurons that are synergistically organized as the sense of consciousness moves among sets of locations, incorporating the sensed systems as it proceeds. A possible mode of organization among neuronal groups in the brain might be achieved, as Tononi
& Edelman (1998) suggest, through relatively rapid and intense recursive interactions between the groups, resulting in a functional cluster that they refer to as a "dynamic core". The spatial binding of neuronal groups through electromagnetic wave activity that I discuss below might also play a role. Thus the sense of consciousness forms itself into a uniquely organized albeit topographically irregular and changing structure. Consciousness is a property of this dynamic organization. As a property of an emergent system, consciousness has qualities and implies rules different from (although in no way contravening) those that apply to the system's components. The unique and seemingly ethereal quality of consciousness is one such property. The sense of consciousness thus serves an organizing function but it is certainly not the only means of organization in the brain. Its uniqueness lies in the character of the organization it creates, which has, as an emergent property, the peculiar physiological/psychological quality we call consciousness.

3. Consciousness as an Emergent of a Complex System

The conception of consciousness as an emergent property has been discussed by a number of authors (see e.g. Arhem & Liljenstrom, 1997; Blomberg, 1994; Churchland, 1988; Hesslow, 1994; Roland, 1994; Tononi & Edelman, 1998; Walter, 1998). In his attempt to explain mental phenomena, James (1890) anticipated a tenet of contemporary complexity theory, sensitive dependence on initial conditions, when he described the brain's natural state of "unstable equilibrium" and "hair-trigger organization" in which a very slight impression might have major, unforeseeable consequences (p. 139). James' notion is consistent in principle with the argument from complexity theory that the emergence of mental states is explained by the nonlinear dynamics of complex systems in the brain operating close to "instability points" (Mainzer, 1996). Each such point implies a location among interacting elements where a minute disturbance can result in a radical or abrupt transition from one state or condition to another, a phase transition.
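
Sensitive dependence on initial conditions can be illustrated with a standard toy chaotic system, the logistic map; this is a generic textbook illustration of the idea, not a model of the brain proposed in this paper.

    def logistic(x, r=3.99):
        """One iteration of the logistic map, a standard toy chaotic system."""
        return r * x * (1.0 - x)

    a, b = 0.500000, 0.500001    # two nearly identical initial conditions
    for _ in range(40):
        a, b = logistic(a), logistic(b)
    print(abs(a - b))   # the initial difference of 1e-6 has typically grown to order 0.1-1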

The emergence of consciousness might be viewed as analogous to a crystallization that originates with such nonlinear transitions occurring among a huge number of chaotically interacting elements. Crystallization might be an ongoing process like that of a snowflake changing form, adding branches and losing others while moving through an atmosphere of varying temperature and humidity. In the brain, of course, the process is more complex than this analogy suggests.

I assume that the sense of consciousness, as a complex system, is not a fixed structure but is a "structure with variations", to use the expression of Goldenfeld & Kadanoff (1999). It is open-ended and undergoes constant change. It grows or develops from within as well as from the effect of external influences. It is a driven system with an internal, self-organizing dynamic like that of many complex systems and its components are not bound with any permanency; they are sometimes in and sometimes not in the organization (e.g. Bak & Chen, 1991; Kauffman, 1991; Schmid-Schonbein, 1996). Components thus might come and go and the organization itself might cease to exist or become latent and then reestablish itself (perhaps resembling in these ways some social organizations more than the organization of most compounds and molecules). In addition to its ephemerality, the system is characterized by the never-repeating actions that typify complex systems and it is not likely to exhibit the degree of regularity and predictability that might be exhibited by some other systems. Nor is it centrally controlled; yet, as a complex adaptive system, it maintains coherence in the context of change (e.g. Holland, 1995). Furthermore, the organization is unusually rich in information, representing as it does a vast reduction in uncertainty posed by the many possibilities in the brain (Tononi & Edelman, 1998). The organization is nonetheless physical in every sense although the above qualities, along with its diffused structure and spatio-temporal complexity, render it more elusive and resistant to analysis than are most of the systems studied by scientists.
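
The phrase "reduction in uncertainty" can be given a simple textbook information-theoretic reading (a simplification of mine, not Tononi & Edelman's own measure): selecting one state out of N equally likely alternatives reduces uncertainty by

    H = \log_2 N \ \text{bits}, \qquad \text{e.g.}\ N = 10^{9} \ \Rightarrow\ H = \log_2 10^{9} \approx 29.9 \ \text{bits},

so a repertoire of a billion discriminable global states would correspond to roughly 30 bits per conscious discrimination.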

3.1. SPATIO-TEMPORAL COMPLEXITY IN THE ORGANIZATION OF CONSCIOUSNESS

All systems, of course, exist in time-space but the temporal dimension in the organization of
consciousness, I assume, has special significance. If consciousness is to be understood as the property of a sense, its manifestation should follow in time the neurological events that are sensed, particularly since the organizing function of the sense of consciousness must be time consuming for reasons beyond the simple fact that neurons require time to transmit impulses. First, the initial phase of the sensing process, scanning, takes time. Neural events that are to be sensed (for example, excitation in the somatosensory cortex from a stubbed toe) are not "caught" by the scanner immediately upon arrival in the brain any more than prey is sensed by a predator immediately upon entering the latter's perceptual range. Furthermore, the initial sensing is merely the start of a "tracking" and organizing process that eventuates in consciousness at a later stage. Dennett (1991), for example, illustrates the developmental process leading to visual experience. "Visual stimuli evoke trains of events in the cortex that gradually yield discriminations of greater and greater specificity… e.g. first [time 1], mere onset of stimulus, then [time 2] location, then [time 3] shape, later [time 4] color (in a different pathway), later still [time 5] (apparent) motion, and eventually [time 6] object recognition" (p. 134).

Full-blown consciousness, which emerges at time 6, is the property of an organization of components that exist at different locations and times and that are connected through time and space. Thus, the organization of consciousness has a dynamic, temporal dimension even in the representation of a static visual image since the connections that define the organization exist both simultaneously during given points in time and temporally between points. Consciousness, as a property of that organization, cannot exist without the latter; it must await these developing connections. Consciousness emerges and continues to develop as new and different components are connected to the old, and the consciousness that emerges as new is a property of an organization that is in some measure old and gone. The physiological stream of which consciousness is a property thus runs parallel to and partly behind the stream that nurtures it.
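
The staged sequence quoted from Dennett can be pictured as a simple cumulative pipeline. The stage names below follow the quoted passage; the millisecond latencies are invented purely for illustration and carry no empirical claim.

    # Toy pipeline after Dennett's staged account of visual processing.
    stages = [("onset", 40), ("location", 30), ("shape", 40),
              ("color", 30), ("apparent motion", 40), ("object recognition", 60)]

    elapsed_ms = 0
    for name, latency_ms in stages:
        elapsed_ms += latency_ms
        print(f"{name:18s} available at about {elapsed_ms} ms after cortical arrival")
    # On the present account the full conscious percept is a property of the
    # whole organization, so it cannot emerge before the last component is connected.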

The relationship of consciousness to that stream is not likely to be entirely unilateral, however. Consciousness might insinuate itself into the "mainstream", affecting the behavior of the organism, and the flow of consciousness itself.

4. Consciousness as a Cause of Behavior

Emergent properties are system properties, which, in the case of consciousness, are causally determined by the properties of the system's neuronal components, their arrangement and interactions (Stephan, 1998; Walter, 1998). But emergent properties have an integrity of their own, and the fact that they are caused need not mean that they do not themselves have effects. In considering consciousness as a cause, Sperry (1980) suggests the analogy of a wheel rolling on a downward course. The wheel carries its atoms and molecules along in a way determined by the properties of the wheel as a system. The wheel's shape, whether round or elliptical, for example, is a system property that will affect the trajectories of the wheel's elementary components. So too, the "properties of the brain process, as a coherent organizational entity, determine the timing and spacing of the firing patterns within its neuronal infrastructure. The control works both ways…" (p. 201).

But as a complex system, the sense of consciousness has features with implications for causality in addition to those apparent in the above analogy. In the case of the wheel, shape is an essentially fixed property. The sense of consciousness, as a complex system, implies structure with variations; the shape of the structure here changes. Consciousness, as a property of a highly complex structure, is in a constant state of development as the sense of consciousness adds components and sloughs off others, a process that implies effects both within the structure and between it and its environment. As the structure adds new constituents to the old, the new totality is inescapably affected by the old. The sense of consciousness thus affects its own shape, and hence the course of its own development. It has, one might say, a will of its own. Furthermore, the development entails interaction with the structure's immediate environment in the brain as it acquires components from and engages in energy exchanges with that environment. Thus the sense of consciousness, with consciousness as a causal property, has effects beyond itself, and ultimately, beyond the organism.

This conception of the causal efficacy of consciousness relates to several issues that have persisted as sources of controversy. Lindahl (1997) illustrates these in a brief description of three theories: 'The interactionist theory asserts that conscious mental phenomena and neural phenomena are distinct, and that neural events may bring about and influence conscious mental events, and vice versa… [T]he epiphenomenalistic theory [also] asserts that conscious mental phenomena and neural phenomena are distinct, and that neural events may bring about and influence conscious mental events, but the theory denies that conscious mental events may bring about and influence neural events. The identity theory asserts that conscious mental phenomena are neural phenomena. Consequently, the theory rules out any of the causal relations characterizing the interactionistic and epiphenomenalistic theories' (pp. 613-614).

4.1. THE SENSE OF CONSCIOUSNESS AND INTERACTIONISM

I have argued, contrary to the epiphenomenalistic view, that consciousness has effects. The present theory, rather, is consistent in major respects with an interactionist conception assuming, as Lindahl & Arhem (1994) note, that the distinction between conscious mental phenomena and neural phenomena need not imply the distinction between something that is immaterial and something that is material. Popper, for example, does assume the immaterial-material distinction, proposing an electromagnetic force field generated by brain physiology as a means of interaction (Popper et al., 1993). The present theory assumes that the emergent system, with consciousness as a property, is integral with the neural system from which it emerges and that in coexisting with the neural system it does not violate the rules of neurons in any way. The emergent system, rather, co-opts while it depends on the underlying system. In so doing it shares energies and materials of that system but utilizes them differently, i.e. through a different functional arrangement. The parallel binding of neurons through electromagnetic wave patterns
might make this arrangement possible by providing alternate, non-axial connections between distant groups (Lopes da Silva, 1996). Because of the parallel binding we have two overlapping systems running through neurons, an axial and a non-axial one, each tied to and coordinated with the other through common physical components. Jointly they might be seen to constitute a hypercomplex, emergent system. Viewed in this way, the emergent system intersects and overlaps with the neural system from which it emerges, sharing some matter and energies in common. It is this commonality that is a basis for the interaction between the emergent, subjective system and the objectively physical brain. Electromagnetic waves are part of the organizing mechanism that, in synchronizing and binding the activity of distant neural groups, results in a phase transition from chaos to the unique dynamic order that is experienced as consciousness. The interaction between this subjective system and the conventionally defined brain, as portrayed here, is an interaction between physical systems. Consciousness, while sensed physically (i.e. biologically), is subjective because the sensing occurs exclusively by and within a single organism.

4.2. THE SENSE OF CONSCIOUSNESS AND IDENTITY THEORY

The present conception resembles aspects of identity theory although I do not think it is correct to say that conscious mental phenomena are neural phenomena if this is meant to imply that the identity is to be found in neurons or patterns of neurons or in specific forces generated by neurons. Neurons and measurable energies are certainly implicated but they are not the entities or manifestations that, in the present conception, define the identity. As an emergent property of a complex system, consciousness exists only as a property of the system as a whole; it does not and cannot exist in parts or components. If one were able to approach the system with instruments, as we do in the laboratory, one might encounter neurons or energies but not consciousness as such even though the neurons and energies are indispensable to consciousness. Furthermore, the changing physical contour of the system is not likely to follow precisely the
contours of neurons. Neuronal boundaries establish the space and provide the conditions through which complex and changing activities occur. Some of these activities are defining elements of the system but in constituting the system they do not come together in ways that conform to the physical boundaries of neurons as entities. The functional significance of a neuronal spike, for example, can be seen from two perspectives. One is its ultimate effect on the release of neurotransmitters as it completes its passage through a neuron. The spike can also be seen to define a changing energetic pattern within a neuron as it runs along the neuronal axis. It is this changing pattern (or aspects of it), I suggest, that might be linked through the action of the electromagnetic oscillatory activity. These changing intraneural patterns, as components of the emergent system, are complex in their own right. The system thus depends upon constant, hypercomplex change defined by innumerable elements entering, remaining and leaving the system at different times, rates and locations repeatedly but never in precisely the same way.

These complex patterns and changing patterns within and among billions of neurons and many more billions of dendrites preclude objective observation of the system not only because of their spatial complexity but also, most importantly, because of their temporal complexity. Capturing the system in motion (which is necessary for the manifestation of consciousness as a system property) requires a unique sensing mechanism, like the sense of consciousness, that moves synchronically with the system. Otherwise the system, as a system, is lost. I suggest that the biological sense of consciousness is unique in its ability to achieve this dynamic sensing because it senses the system as it organizes it. But, as an emergent system, its rules are different from those of its elements (Anderson, 1972). Consequently the concepts that describe and explain the behavior of neurons are not entirely adequate at this emergent level; the concepts here must be appropriate to the subjective, mentalistic manifestations of consciousness. I thus take the present conception to be consistent, in principle, with Sperry's monist view encapsulated in the title of his 1980 paper, "Mind-brain interaction: mentalism, yes; dualism, no".

5. The Sense of Consciousness and Other Senses

Viewing consciousness as a property of a sense provides a degree of conceptual integration; consciousness belongs within the framework of the biological senses. Understanding the functions of consciousness, as Allen & Bekoff (1997) suggest, requires an understanding of the functions of sensory systems. Much of what we know about the evolution and role of the senses should help us understand the evolution and role of the sense of consciousness and of consciousness itself, just as some of what we will learn about the sense of consciousness might help further our understanding of the other senses. Each sense, as it emerged during the course of evolutionary history, added to the capacity of the organism to "know" its internal and/or external environment(s). The sense of consciousness added to the capacity to know the brain and, through that vital capacity, to know the organism along with its other environments.

5.1. EVOLUTIONARY CONTINUITY AND SENSORY COMMONALITIES

Darwin argued that mental functions evolved along with other biological functions. Consciousness might first have appeared, according to Eccles (1992), about 200 million years ago through the primitive cerebral cortices of evolving mammals and it was probably limited to representations of the environment based on sensory inputs; consciousness represented immediate sensory impressions. The evolution of consciousness was probably marked by progressive stages in the integration of sensory systems resulting in intermodal representations (Sjolander, 1997). Stein & Meredith (1993) suggest that such merging of the senses might be facilitated by commonalities among modalities that imply a degree of interchangeability. Stimulus features such as intensity, form, number and duration, for example, are common to different modalities and therefore a basis of comparability. Different senses, they point out, can also serve a common function such as escaping from danger or finding food, which suggests a feature of comparability in addition to those that they mention, viz., the relevance or instrumentality of a stimulus to adaptive behavior. The process of selection, as in
the scanning and tracking by the sense of consciousness, requires comparison among stimuli in different modalities. Selection might therefore be determined by features of stimuli in the brain such as their intensity, form, number and duration, as well as by their relevance to adaptive behavior, all of which imply something about the "visibility" and "attractiveness" of the stimuli.

The common features of the various senses provide the context for impressive differences. Each sense has qualitatively unique effects and each can be described in terms of a set of unique parameters. These parameters imply biases; they favor certain stimuli or conditions over others, possibly resulting in non-veridical or distorted representations of stimulus objects. Sensory functions have evolved to serve adaptation, not necessarily to provide accurate representations. Dennett (1991), for example, cites the unusual sensitivity of the visual system of many species, fish as well as humans, to patterns with a vertical (but not a horizontal) axis of symmetry. This perceptual bias probably developed because perceiving such a pattern in primitive environments was often associated with being looked at head-on by a predator, where an exaggerated emphasis on vertical axis symmetry facilitated the recognition of danger and expedited a speedy getaway. Dennett refers to this result as a trade-off of truth and accuracy for speed and economy in escaping danger. The trade-off implies a kind of deception, although not a deception by one organism of another. The organism, rather, deceives itself, apparently to its own advantage.

This bias toward vertical axis symmetry is a relatively fixed and constant feature of the visual sense. I assume that the sense of consciousness, with its complex organizing capability, has biases that are expressed more dynamically in the kinds of stimuli that it is primed to select or to avoid. For example, it might have a built-in, exaggerated sensitivity to stimuli in the brain that leads to an awareness of socially acceptable explanations for one's behavior rather than explanations that might be more valid but less acceptable. It might also have a strong bias against sensing stimuli that would bring into awareness the deception represented by the above rationalization. Like the sense of vision, the sense of consciousness might engage in a trade-off of truth and accuracy for more urgent
adaptive values. Being convinced of one's own good intentions helps in convincing others that one's intentions are benign even when they are not. The psychodynamic mechanisms of self-deception (repression, rationalization, reaction formation, projection and others) can be seen as the result of this evolutionary trade-off (e.g. Nesse & Lloyd, 1992).

5.2. THE UNIQUE PROPERTY OF THE SENSE OF CONSCIOUSNESS

The sense of consciousness, as presently conceived, is an unusually complex and comprehensive sense. It senses a broad array of other senses and processes in the brain, including processes generated apart from immediate connection to the organism's external environment, processes that might yield conscious memories and abstract thoughts as well as imaginary or hallucinatory experiences. The sense of consciousness is nonetheless comparable to other senses although its structural complexity and unique effects make the comparability difficult to conceive. While each sense has qualitatively unique effects, consciousness has a perplexing quality that seems to separate it from objects of the physical world. This seemingly ethereal and unique quality might not appear to be so unusual, however, in the context of what we know about sensing.

Sensing is like any physical encounter. We never encounter a thing, only an aspect of a thing, and the aspect we encounter depends on what we encounter it with and how. Furthermore, every sensing encounter entails a transformation of energy; the output of the sensing process necessarily differs in form from the input. Electricity, for example, might be mechanically sensed in a variety of ways, each involving a different set of transformations and each providing information about a different aspect of "the thing" sensed. A voltmeter, encountering the circuit "in parallel", might sense one aspect, its electromotive force, by transforming electrical energy into magnetic flux, then into the kinetic energy of the meter's deflecting needle and finally into the potential energy of the deflected needle. An oscilloscope, encountering the circuit "in series", might sense another aspect, its alternating current, through a transformation of electrical energy into a sine wave
pattern of light energy emitted from a screen. A human might sense still another aspect biologically by touching moistened fingers to a pair of electrodes. Here again the aspect sensed and the consequent set of transductions and transformations are special to the sensing process. Although the outcome of each encounter is unique, we take the systematic correlations among them as evidence that each is indicative of an aspect of "the thing" we call electricity.

The neurological system that is sensed by the sense of consciousness in the brain is "a thing" encountered. The encounter, like any encounter, is with only an aspect of the thing and the outcome, as with any encounter, is peculiar to the means of encounter. We probably encounter aspects of the system technologically through electroencephalographs, positron emission tomography, magnetic resonance imaging, electrode probes and chemical interventions that are indicative of electrical, chemical or other metabolic activities in the brain. Although somewhat ambiguous, correlations between the results of these technological encounters and of our biological encounters encourage the hypothesis that each is indicative of "the thing" we experience as consciousness. Consciousness, an effect of our biological encounter, appears exotic in the context of the products of our technological encounters but it is not ethereal; its particular organization of energy, I suggest, comprises the substance of our phenomenological world. Although consciousness cannot be created by non-biological means, this limitation is not exceptional; very few, even less exotic, biological manifestations can be created by non-biological means.

Furthermore, the biological sense of consciousness along with its immediate environment, and consciousness itself, is part of the organism's information processing system and is therefore "in the loop", affecting and being affected by the larger system of which it is a part. Overlapping functionally, the sense of consciousness and other senses serve to back up, supplement and control (i.e. check and verify) one another. The sense of consciousness serves a further adaptive function, through scanning and tracking, of organizing its environment in the brain. Consciousness itself, which is a property of this organization, provides its unique version of pieces of the energetic
"maps" that are sensed in the brain. Consciousness therefore runs parallel to and is in some degree redundant with those maps. It might thus, in its way, supplement the maps being sensed. In doing so it does not simply add more of the same, but more about the same, enhancing the adaptive power of the information and creating an amplified effect that might be experienced as "paying attention" or, perhaps, as the imperative, "pay attention" (for example, to some of the other senses). Indeed, the supplementation by awareness might be among the determinants that direct information processing and thus behavior itself.

6. Some Implications for Research

A primary deduction from the theory is that consciousness, as a property of a sense, should follow in time the neuronal events that are sensed in the brain. This hypothesis is consistent with some interpretations of neurosurgical research by Libet (1978, 1981; Libet et al., 1979). Libet found, for example, that the conscious effects of an electrical stimulus to the cortex that might be felt on the back of a hand did not occur unless the stimulation was maintained for a minimum of about 0.5 s. He also found a "backward masking": consciousness of a pulse stimulus to the skin of a hand that normally yields a felt sensation was suppressed by a cortical stimulus when the latter was initiated as much as 200 ms after the skin pulse was administered. Libet interpreted these findings along with the results of additional interventions as indicating a delay between the time a stimulus to the skin reaches the cortex and consciousness of the stimulus. His conclusions concerning the delay and its theoretical implications, however, are controversial (e.g. Glynn, 1990; Hesslow, 1994). Further studies are necessary before reasonable conclusions can be drawn. The present theory predicts for these studies confirmation of a temporal gap and suggests other, more specific hypotheses that might be explored through methodologies like that illustrated by Libet.

The following hypotheses are presented to illustrate avenues of exploration without specifying details that should be taken into account as part of a research program. Experimental controls not specified here, for example, would be
essential to any test. The stimuli are to be understood always as being delivered to a primary sensory area of the cortex where, as the theory proposes, they might be sensed. I have distinguished two aspects of the sensory process, scanning and organizing. First, consider scanning.

(a) Scanning. The probability that a stimulus will be detected through scanning is likely to be a function of the "relevance" and "visibility" of the stimulus, as suggested in the previous section. Relevance, I assume, is a function of the modality of the stimulus and its instrumentality to adaptive behavior, given the state of the organism such as its level of hunger or sexual drive. In the case of hunger, for example, stimuli associated with olfaction or taste are likely to be more relevant than stimuli associated with cutaneous sensations. Visibility, on the other hand, might depend on characteristics of a stimulus such as its intensity. Considering relevance first, the theory predicts that (1) the delay of consciousness will be shorter for stimuli to olfactory areas in hungry subjects than in sated subjects, (2) in the case of hungry subjects, the delay will be less for stimuli associated with olfaction than for stimuli associated with cutaneous sensations and (3) the above difference in delay for hungry subjects will be attenuated in the case of sated subjects. Hypotheses regarding visibility might be explored on the assumption that the visibility of a stimulus is a function of its intensity. The time entailed in scanning should therefore be inversely related to the intensity of a stimulus. Thus, the theory predicts that (4) the time between a cortical stimulus and consciousness of the stimulus will decrease as stimulus intensity increases from its liminal level. More specific variations of this hypothesis might explore the interactive effects of the relevance and intensity of a stimulus. For example, (5) to achieve consciousness, the intensity of a stimulus must be greater for less relevant stimuli (e.g. olfactory in sated subjects) than for more relevant stimuli (olfactory in hungry subjects).
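
One illustrative way to operationalize hypotheses (1)-(5) for simulation or planning purposes is sketched below. The functional form, the constants and the notion of a detection threshold are assumptions of mine for illustration only; the theory itself specifies only the direction of the predicted effects.

    def predicted_delay_ms(intensity, relevance, base=500.0, floor=150.0, threshold=0.25):
        """Toy model: the scanning delay shrinks as stimulus 'visibility'
        (intensity) and 'relevance' (e.g. an olfactory stimulus in a hungry
        subject) increase; a stimulus whose combined salience falls below a
        threshold never reaches consciousness (cf. hypothesis 5).  All
        constants and units are illustrative, not empirical."""
        salience = intensity * relevance
        if salience < threshold:
            return None                     # stays unconscious
        return floor + (base - floor) / (1.0 + salience)

    # Olfactory stimulus of fixed intensity: hungry (high relevance) vs. sated subject.
    print(predicted_delay_ms(2.0, 1.5))   # shorter delay (cf. hypothesis 1)
    print(predicted_delay_ms(2.0, 0.3))   # longer delay
    print(predicted_delay_ms(0.1, 0.3))   # None: weak, irrelevant stimulus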

The effects of multiple stimuli might also be studied on the assumption that stimuli compete for attention as a function of their relevance and visibility. The theory predicts that (6) the introduction of one stimulus will increase the time for a second, simultaneously administered stimulus to reach consciousness. More specifically, the effect on the second will be a function of the visibility and relevance of the first. Thus, (7) the delay in consciousness for the second will increase with (a) the intensity of the first and (b) subjects' level of hunger, if the first stimulus is associated with olfactory (but not cutaneous) sensations. Therefore, (8) a strong stimulus might be consciously experienced before a weak one even if the weaker had been administered earlier. Related hypotheses are suggested by the phenomenon of "backward masking" as in Libet's research. Thus the theory predicts that (9) an appropriately timed stimulus to the somatosensory cortex will retroactively mask consciousness of an earlier stimulus to the cortex.

(b) Organizing. Hypotheses might also be formulated on the basis of the organizing function of the sense of consciousness. The time involved in organizing might, for example, be affected by the form of a stimulus as illustrated by its complexity. A complex stimulus, I assume, implies more information and requires more time to be processed than a simple stimulus; the time between cortical stimulus and consciousness should therefore increase with complexity. As an experimental variable, complexity poses definitional and operational problems that cannot be entirely solved within the limits of contemporary methods. Future technologies, however, might overcome some of these limits. I therefore suggest the following only to illustrate tentatively a direction in which empirical exploration might go. It might be reasonable to assume, for example, that peripheral senses differ from one another in complexity and that these differences correspond to differences in the complexity of the brain activity necessary to process information from these senses. Differences should therefore be expected in the delays between consciousness and cortically delivered stimuli associated with different sensory modalities. The theory would predict, for example, (10) a greater delay for stimuli that create a visual image (induced cortically as, e.g., in the research of Penfield & Perot, 1963) than for stimuli to areas that elicit tactile sensations. Alternatively, it might be reasonable to assume that sensations differ from one another in complexity within as well as between modalities. The fine discriminations necessary to fully test hypotheses
about intramodal differences are probably not possible now although some contemporary methods might provide clues regarding the proposed hypothesis and might, with refinement, lead to more adequate tests. Libet (1973), for example, describes two kinds of sensory reaction to stimuli delivered to the post-central gyrus: "paresthesia-like" and "natural-like". The former includes numbness or tingling; the latter includes sensations like a touch or the feeling of a ball rolling over the surface of the skin. Exploratory research employing this distinction might be justified on the assumption that the natural-like sensations are more complex than the paresthesia-like ones. Given this assumption, (11) the stimulus-consciousness delay will be greater for natural-like than for paresthesia-like sensations.

A further avenue of exploration suggests itself regarding the organizing (as well as the scanning) function of the sense of consciousness. The establishment of a conscious sensation has a counterpart in its disestablishment. The present conception implies an asymmetry in the time and/or energy entailed in these respective phases. Establishment entails a time-consuming process of construction (scanning, tracking, organizing) that creates a uniquely organized dynamic system out of the disorder of chaos. Disestablishment implies a reversion to chaos, which might occur by an interruption or disruption of the process at any point along the way, and it might occur in different ways. One, in the case of the experimental procedures described above, would be a simple interruption of the stimulus that initiates and maintains a conscious sensation. "Masking" (as in hypothesis 9) suggests another kind of disestablishment. The following illustrates a test involving the first, since it is more easily managed.

Imagine, as an idealized experimental intervention, a cycle composed of a stimulus train that is switched on for 0.5 s and then off for 0.5 s repeatedly to form an evenly paced square wave. A number of outcomes for consciousness are possible in principle. The stimulus might appear consciously as an evenly paced square wave of on and off sensations. This would occur if there were no delay either in the establishment or disestablishment of consciousness, or if identical delays were to occur during both phases. On the other hand, the on periods might
appear to be longer than the off periods. This would occur if consciousness were to become established with relative suddenness compared to a time-consuming "winding down" phase. The present theory, however, assumes that the time necessary for the establishment of consciousness after the initiation of an appropriate cortical stimulus will be greater than the time necessary for its disestablishment after an interruption of the stimulus, and the theory therefore predicts that (12) the on periods will appear to be shorter than the off periods.

Testing a hypothesis in the above way might not be possible because of the effects one stimulus cycle might have on the neurons during subsequent cycles. An alternative test, however, might begin with a simple train stimulus to the cortex that creates a sensation felt on one hand, which is interrupted simultaneously by the delivery of a second cortical stimulus creating a comparable sensation felt on the other hand. The theory predicts that (13) subjects will experience a gap between these simultaneous events; the second stimulus will be felt some time after consciousness of the first stimulus ends.

I do not offer specific hypotheses here involving the procedure of "masking" although I think it might have particular utility in exploring the establishment and disestablishment of consciousness. I note only that masking has a counterpart, "unmasking": interrupting a masking stimulus, thereby bringing into consciousness a stimulus that was experienced only unconsciously. Experimental unmasking might illustrate, at a very simple level, a transition from unconscious to conscious experience. In more complex manifestations, this transition occurs regularly and, in special circumstances, is assumed by some theories of psychotherapy to imply the "insight" that is essential to the therapy.
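
The prediction in hypothesis (12) can be made concrete with a toy simulation of the square-wave procedure described above. The particular establishment and disestablishment latencies below are invented for illustration; only their asymmetry matters for the predicted effect.

    # Toy simulation of the square-wave procedure (hypothesis 12).
    ON_MS, OFF_MS = 500, 500                  # stimulus on/off durations
    establish_ms, disestablish_ms = 300, 100  # assumed latencies (establishment slower)

    felt_durations = []
    for cycle_start in range(0, 4000, ON_MS + OFF_MS):
        felt_start = cycle_start + establish_ms           # consciousness begins late
        felt_end = cycle_start + ON_MS + disestablish_ms  # and fades soon after offset
        felt_durations.append(felt_end - felt_start)

    print(felt_durations[0])   # 300 ms of felt "on" per 500 ms of stimulation,
                               # so the on periods appear shorter than the off periods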
Set one is the infinite set of concepts that refer to objects and events that might be objectively observed and measured. Set two is the infinite set that refers to subjective experiences. These experiences, in the present theory, are a physical manifestation, but they cannot be observed objectively; the descriptive concepts here are in the language of mentalism. Set three is an infinite set of concepts that was not established in the evolution of the human capacity to think. Theories intended to explain the ontology of consciousness, I believe, require concepts in this set in addition to those in sets one and two. Consciousness, therefore, will inevitably have a quality that is elusive and perplexing. But consciousness is not alone among physical manifestations in this regard. The quantum rule that a particle can be at two locations at the same time, for example, is also perplexing, and for the same reason: concepts that might provide a realistic description of this seemingly impossible happening are not available in set one or two.

The subjectivity of consciousness poses an additional, operational problem for research. The conscious experience of one person cannot be duplicated or observed by others, although energetic aspects of that experience might be. This operational limitation, along with its conceptual difficulties, should not, however, disqualify consciousness as a subject of scientific investigation. For one thing, studies demonstrate a high degree of reliability in the measurement (e.g. through verbal reports) of many subjective phenomena, and even behavioristically inclined theorists might argue that very simple manifestations of consciousness can be studied scientifically (Uttal, 2000). Behavioral studies and the methods of neuroscience add objective data to the subjective, providing the basis for a conceptual link between objectively and subjectively observable events. Cognitive ethology contributes a further source of data and a conceptual tie into the framework of biological evolution. Conceivably, future technologies will provide indicators of aspects of consciousness that are comparable between humans and other species and that will enable the study of consciousness in these species along with humans (Griffin, 1992).

I have argued that consciousness is a cause of behavior. This is not to say, however, that consciously experienced mental states such as beliefs, reasons, attitudes, perceptions and the like are entirely apt as characterizations of the causes. Mental states, as we have seen, might be deceptive and misleading as to what is causing what. They are causal, but not necessarily in the way they might seem to be. Furthermore, consciousness does not affect behavior in isolation from other causes; it is part of a complex causal synthesis that includes non-conscious processes. I assume that the effect of this synthesis is not an effect that can be calculated as a sum of its parts, even if we were to know all of the parts. The full and precise effects of consciousness, therefore, might not be apparent. Nonetheless, the present theory argues that consciousness does have effects relevant to adaptation. If so, consciousness is more than just a fascinating biological manifestation; it is an important one. Consciousness affects the behavior of its beholder and thus the consciousness and behavior of others, and through that the behavior of institutions, governments, technologies, other species, the ecology of the planet, and eventually of other planets, ad infinitum. I think it "makes sense" to understand this sensory phenomenon as best we can, if only imperfectly.

I would like to thank a number of friends and colleagues who read earlier drafts and offered helpful comments: Klaus Bartoelke, Harry Douthit, Martin Gold, John Holland, Robert Kahn, Marcial Losada, Randolph Nesse, Carol Slater, Alfred Sussman and William Uttal. I owe a debt of gratitude to Professor Daniel Katz who, before his death at the age of ninety-four, provided much intellectual stimulation along with the encouragement and support that I came to expect from him over the years.

REFERENCES

ALLEN, C. & BEKOFF, M. (1997). Species of Mind: The Philosophy and Biology of Cognitive Ethology. Cambridge, MA: The MIT Press.
ANDERSON, P. W. (1972). More is different: broken symmetry and the nature of the hierarchical structure of science. Science 177, 393-396.
ARHEM, P. & LILJENSTROM, H. (1997). On the coevolution of cognition and consciousness. J. theor. Biol. 187, 601-612.
BAK, P. & CHEN, K. (1991). Self-organized criticality. Sci. Am. 264, 46-53.
BLOMBERG, C. (1994). The physicist's road to theoretical biology and the mind-matter problem. J. theor. Biol. 171, 41-52.
CHURCHLAND, P. M. (1988). Matter and Consciousness. Cambridge, MA: MIT Press.
CRICK, F. (1984). Function of the thalamic reticular complex: the searchlight hypothesis. Proc. Natl Acad. Sci. U.S.A. 81, 4586-4590.
CRICK, F. & KOCH, C. (1998). Consciousness and neuroscience. Cerebral Cortex 8, 97-107.
DENNETT, D. C. (1991). Consciousness Explained. Boston: Little, Brown and Co.
ECCLES, J. C. (1959). Neuron physiology - introduction. In: Handbook of Physiology, Section 1: Neurophysiology, Vol. I (Field, J. & Magoun, H. W., eds), pp. 59-74. Washington, DC: American Physiological Society.
ECCLES, J. C. (1992). Evolution of consciousness. Proc. Natl Acad. Sci. U.S.A. 89, 7320-7324.
EDELMAN, G. M. (1989). The Remembered Present: A Biological Theory of Consciousness. New York: Basic Books.
GLYNN, I. M. (1990). Consciousness and time. Nature 348, 477-479.
GOLDENFELD, N. & KADANOFF, L. (1999). Simple lessons from complexity. Science 284, 87-89.
GRIFFIN, D. R. (1992). Animal Minds. Chicago, IL: University of Chicago Press.
HESSLOW, G. (1994). Will neuroscience explain consciousness? J. theor. Biol. 171, 29-39.
HOLLAND, J. H. (1995). Hidden Order: How Adaptation Builds Complexity. Reading, MA: Addison-Wesley Publishing Co.
INGVAR, D. H. (1994). The will of the brain: cerebral correlates of willful acts. J. theor. Biol. 171, 7-12.
JAMES, W. (1890). The Principles of Psychology. New York: Henry Holt and Co. (Reprinted 1950; Dover Publications, Inc.)
KAUFFMAN, S. A. (1991). Antichaos and adaptation. Sci. Am. 265, 78-84.
KINSBOURNE, M. (1988). Integrated field theory of consciousness. In: Consciousness in Contemporary Science (Marcel, A. J. & Bisiach, E., eds), pp. 239-256. Oxford: Clarendon Press.
LIBET, B. (1973). Electrical stimulation of cortex in human subjects, and conscious sensory aspects. In: Handbook of Sensory Physiology.
prehensive Human Physiology (Greger, R. & Windhorst, U., eds), Vol. 1, pp. 509-531. Berlin: Springer-Verlag.
LORENZ, K. (1973). Behind the Mirror: A Search for a Natural History of Human Knowledge. New York: Harcourt Brace Jovanovich.
MAINZER, K. (1996). Thinking in Complexity: The Complex Dynamics of Matter, Mind, and Mankind. Berlin; New York: Springer.
MALINOVSKY, L. (1987). Classification of sensory nerve formations (endings). In: Mechanoreceptors: Development, Structure and Function (Hnik, P., Soukup, T., Vejsada, R. & Zelena, J., eds), pp. 287-288. New York: Plenum Press.
NESSE, R. M. & LLOYD, A. T. (1992). The evolution of psychodynamic mechanisms. In: The Adapted Mind: Evolutionary Psychology and the Generation of Culture (Barkow, J. H., Cosmides, L. & Tooby, J., eds), pp. 601-624. Oxford: Oxford University Press.
NEWMAN, J. (1995). Thalamic contributions to attention and consciousness. Conscious. Cogn. 4, 172-193.
ORPWOOD, R. D. (1994). A possible neural mechanism underlying consciousness based on the pattern processing capabilities of pyramidal neurons in the cerebral cortex. J. theor. Biol. 169, 403-418.
PENFIELD, W. & PEROT, P. (1963). The brain's record of auditory and visual experience. Brain 86, 595-611.
POPPER, K. R. & ECCLES, J. C. (1977). The Self and its Brain. New York: Springer International.
POPPER, K. R., LINDAHL, B. I. B. & ARHEM, P. (1993). A discussion of the mind-brain problem. Theor. Med. 14, 167-180.
ROLAND, P. E. (1994). Obstacles on the road towards a neuroscientific theory of mind. J. theor. Biol. 171, 19-28.
ROSE, J. E. & MOUNTCASTLE, V. B. (1959). Touch and kinesthesis. In: Handbook of Physiology, Section 1: Neurophysiology, Vol. I (Field, J., Magoun, H. W. & Hall, V. E., eds), pp. 387-430. Washington, DC: American Physiological Society.
SCHMID-SCHONBEIN, H. (1996). Physiological synergetics: a holistic concept concerning phase jumps in the behavior of driven nonlinear systems. In: Comprehensive Human Physiology, Vol. 1 (Greger, R. & Windhorst, U., eds), pp. 43-67. Berlin: Springer-Verlag.
SJOLANDER, S. (1997). On the evolution of reality - some biological prerequisites and evolutionary stages. J. theor. Biol. 187, 595-600.
SPERRY, R. W. (1980). Mind-brain interaction: mentalism, yes; dualism, no. Neuroscience 5, 193-206.
SPERRY, R. W. (1984). Consciousness, personal identity and the divided brain. Neuropsychologia 22, 661-673.
STEIN, B. E. & MEREDITH, M. A. (1993). The Merging of the Senses. Cambridge, MA: The MIT Press.
STEPHAN, A. (1998). Varieties of emergence in artificial and natural systems. Z. Naturforsch. 53c, 639-656.
TONONI, G. & EDELMAN, G. M. (1998). Consciousness and complexity. Science 282, 1846-1851.
UTTAL, W. R. (2000). The War between Mentalism and Behaviorism: On the Accessibility of Mental Processes. Mahwah, NJ: Lawrence Erlbaum Associates.
WALTER, H. (1998). Emergence and the cognitive neuroscience approach to psychiatry. Z. Naturforsch. 53c, 723-737.