Reduction, integration and emergence in biochemical networks


Biology of the Cell 96 (2004) 719–725 www.elsevier.com/locate/biocell

Review

Jacques Ricard
Institut Jacques-Monod, CNRS, Universités Paris 6 et Paris 7, 2, Place Jussieu, 75251 Paris cedex 5, France

Received 13 July 2004; accepted 29 July 2004. Available online 07 October 2004.

Abstract

Most studies of molecular cell biology are based upon a process of decomposition of complex biological systems into their components, followed by the study of these components. The aim of the present paper is to discuss, on a physical basis, the internal logic of this process of reduction. The analysis is performed on simple biological systems, namely protein and metabolic networks. A multi-sited protein that binds two ligands x and y can be considered the simplest possible biochemical network. The organization of this network can be described through a comparison of three systems, i.e. XY, X and Y. X and Y are component sub-systems that collect states xi and yj, respectively, i.e. protein states that have bound either i molecules of x (whether or not these states have also bound y), or j molecules of y (whether or not these states have bound x). XY is a system made up of the specific association of X and Y that collects states xiyj. One can define mean self-informations per node of the network, <H(X,Y)>, <H(X)> and <H(Y)>. Reduction of the system XY into its components is possible if, and only if, <H(X,Y)> is equal to the sum of <H(X)> and <H(Y)>. If <H(X,Y)> is smaller than the sum of <H(X)> and <H(Y)>, the system is integrated, for it has less self-information than the set of its components X and Y. It can also occur that <H(X,Y)> is larger than the sum of <H(X)> and <H(Y)>. Hence, the system XY displays negative integration and emergence of self-information relative to its components X and Y. Such a system is defined as complex. Positive or negative integration of the system implies that it cannot be reduced to its components. The degree of integration can be measured by a function I(X:Y), called the mutual information of integration. In the case of enzyme networks, emergence of self-information is associated with emergence of catalytic activity. Moreover, if the enzyme reaction is part of a metabolic sequence, its mutual information of integration can be increased by an effect of context of this sequence.

© 2004 Elsevier SAS. All rights reserved.
Keywords: Information; Organization; Networks; Complexity; Emergence; Reduction

1. Introduction

Classical science, that is, the scientific activity that sprang up during the seventeenth century and has spread over western countries since then, is based upon several principles expressed by Descartes in two important books entitled “Discours de la Méthode” (Descartes, 1637) and “Règles pour la Direction de l’Esprit” (Descartes, 1628). Among these principles, one is particularly relevant and can be termed a principle of reduction. As expressed by Descartes, this principle is the following: “Le second (principe), de diviser chacune des difficultés que j’examinerais en autant de parcelles qu’il se pourrait et qu’il serait requis pour les mieux résoudre” (Descartes, 1637) — that is, to divide each of the difficulties under examination into as many parts as possible, and as might be required to resolve them better.

In recent years, this principle of reduction has often been applied to the life sciences. Faced with complex systems, biologists tend to decompose these systems into their components before studying them, with the hope that this study will bring important insights into the organization and functioning of the initial complex system. This approach is indeed straightforward reductionism. One can wonder, however, whether the study of component sub-systems is sufficient to give valuable information about the functional organization of the overall system itself, for some novel, unpredictable properties may emerge as a consequence of the interaction of the components of the system. There is probably no definitive and general answer to this question. Nevertheless, it is possible to show with extremely simple biological systems that sound reduction is not always possible, and it is tempting to consider that this conclusion can be generalized to more complex and realistic biological systems. The aim of the present paper is to show that simple biochemical networks display properties that cannot be understood from a purely analytical and reductionist approach (Gallagher and Appenzeller, 1999; Hatwell et al., 1999; Jeong et al., 2000; Strogatz, 2001). This analysis will be performed with the simplest protein network, i.e. a multi-sited protein that binds, under thermodynamic equilibrium conditions, two ligands. The results obtained will be generalized to a sequence of enzyme reactions.

E-mail address: [email protected] (J. Ricard).
0248-4900/$ - see front matter © 2004 Elsevier SAS. All rights reserved. doi:10.1016/j.biolcel.2004.07.003

2. Mathematical description of the organization of a simple protein network

As mentioned above, the simplest biological network one can conceive is made up of a protein that possesses multiple sites able to bind two ligands x and y under equilibrium conditions. Hence, the protein can exist under different connected states that have bound different numbers of molecules of x and y. The states of the protein constitute the nodes of a network whose probabilities of occurrence are functions of the concentrations of the two ligands x and y. For reasons that will appear later, we shall consider two different types of networks. The first one postulates that the protein bears two classes of n identical sites each. The sites of a same class are assumed to be specific for one of the ligands and therefore do not bind the other one. The second type of network is based on the view that the protein has one class of n identical sites that can bind either ligand, with different intrinsic binding constants K′x and K′y. Hence, the two ligands compete for the same sites. These two models are shown in Fig. 1A,B. The same reasoning can be applied to a lattice of proteins that change their conformation in succession; this process is now well documented and is called conformational spread (Bray and Duke, 2004). For either model, the nodes are denoted by Nj,k (with j, k ∈ Z+) and their probabilities of occurrence by p(Nj,k). Hence, one has

Σj Σk p(Nj,k) = 1    (1)

If j = k = 0, the corresponding protein states have bound neither x nor y. If j ≥ 1, k = 0, they have bound x but not y, and if j = 0, k ≥ 1 they have bound y but not x. Lastly, if j ≥ 1, k ≥ 1 the corresponding protein states have bound both x and y. The organization of a network relies upon a probability space, XN, defined as

XN = { p(Nj,k); j, k ∈ Z+, j, k ≤ n }    (2)

for the first network, and as

XN = { p(Nj,k); j, k ∈ Z+, j + k ≤ n }    (3)

for the second one. Hence, one can define the organization of such a network through the number of nodes that bear both x and y relative to those that bear x (whether or not they also bear y) and those that bear y (whether or not they also bear x). One can define the probability that the protein has bound i molecules of x as

p(xi) = Σk p(Ni,k)    (i ∈ N; k ∈ Z+)    (4)

Fig. 1. Two simple protein networks under thermodynamic equilibrium conditions. (A) For simplicity, the protein is assumed to bear 2 × 5 sites. Each protein state, in thermodynamic equilibrium with the other states, is a node of the network and bears two classes of five sites each. The sites of the first class bind ligand x (lines) and the sites of the second class bind ligand y (columns). Each node Ni,j means that the corresponding state has bound i molecules of x and j molecules of y. The microscopic binding constant K′x is unchanged along all the lines of the model and the microscopic binding constant K′y does not vary along all the columns of the model. Each binding constant is multiplied by a statistical factor whose value declines along the corresponding co-ordinate and reflects the availability of the binding sites. The sites do not interact, so that there is no co-operativity between them. (B) The protein is assumed to bear five sites that can accommodate either ligand x or y. Hence, there exists competition between the two ligands. The same site binds the ligands x and y with microscopic binding constants K′x and K′y. As for the previous model, the sites do not interact.

and the probability that the protein has bound j molecules of y as

p(yj) = Σi p(Ni,j)    (j ∈ N; i ∈ Z+)    (5)

One can also define the joint probability, p(xi, yj), that the protein has bound i molecules of x and j molecules of y as

p(xi, yj) = p(Ni,j)    (i, j ∈ N)    (6)
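The bookkeeping of Eqs. (4)–(6) is easy to make concrete on the smallest possible grid. A minimal sketch in Python (the node probabilities are invented, with one site per class so that i, j ∈ {0, 1}):

```python
# Invented node probabilities p(N_ij) for a one-site-per-class protein:
# index i counts bound molecules of x, index j counts bound molecules of y.
pN = {(0, 0): 0.4, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.1}

# Eq. (4): p(x1) sums over all y-occupancies k.
p_x1 = pN[1, 0] + pN[1, 1]
# Eq. (5): p(y1) sums over all x-occupancies i.
p_y1 = pN[0, 1] + pN[1, 1]
# Eq. (6): the joint probability is the node probability itself.
p_x1y1 = pN[1, 1]

# Inclusion-exclusion recovers the normalization of Eq. (1);
# this identity is Eq. (8) below.
total = pN[0, 0] + p_x1 + p_y1 - p_x1y1
```

Note that p(x1) counts the state N1,1 as well, since it has bound x whether or not it has also bound y; the inclusion–exclusion correction then removes the double counting.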

The variables xi and yj (with i, j ∈ N) are the states of the systems X and Y, respectively. Both of them disregard the organization of xi relative to yj, and conversely. Similarly, the states xiyj define the system XY, which indeed displays organization of xi relative to yj, and conversely. Hence, the system


XY can be viewed as being made up of the association of two component sub-systems X and Y (Fig. 2A,B). The probabilities p(xi), p(yj) and p(xi,yj) allow one to define three sub-spaces

XX = { p(xi); i ∈ N }
XY = { p(yj); j ∈ N }
XXY = { p(xi, yj); i, j ∈ N }    (7)

Fig. 2. Systems X, Y and XY as tools for defining organization of simple protein networks. (A) Exactly as for the model of Fig. 1A, the protein is assumed to bear 2 × 5 sites. Systems X and Y collect the values x1, x2, ..., x5 and y1, y2, ..., y5, respectively (see main text). They are devoid of any organization. System XY is organized and gathers the values x1y1, ..., x5y5. (B) The protein is assumed to bear five sites only and there is competition of the two ligands for the same sites. The definition of the systems X, Y and XY is the same as for the model of Figure A above.

If p(N0,0) is the probability of occurrence of node N0,0, one has

p(N0,0) + Σi p(xi) + Σj p(yj) − Σi Σj p(xi, yj) = 1    (8)

By analogy with Shannon communication theory (Shannon, 1948; Shannon and Weaver, 1949; Kullback, 1959; Cover and Thomas, 1991; Yokey, 1992; Adami, 1998; Callager, 1964), one can define the mean self-information (per node of the network) of X, Y and XY by the following functions

<H(X)> = − Σi p(xi) log2 p(xi)
<H(Y)> = − Σj p(yj) log2 p(yj)
<H(X,Y)> = − Σi Σj p(xi, yj) log2 p(xi, yj)    (9)

These expressions can, at first sight, be considered identical to Shannon entropies. In fact they are not. In the expressions of Shannon entropies, the sums of p(xi), p(yj) and p(xi,yj) are all equal to one, whereas in the expressions of Eq. (9) the sums of the same probabilities are all smaller than one. Whereas <H(X)> and <H(Y)> do not offer any information as to the organization of xi with respect to yj, <H(X,Y)> does. Hence, organization of the network relies upon the nature of the relation that exists between <H(X)>, <H(Y)> and <H(X,Y)>.

3. Reduction, integration and emergence in simple protein networks

For a simple protein network, the concepts of reduction, integration and emergence describe the various types of organization the network can possess. If

<H(X,Y)> = <H(X)> + <H(Y)>    (10)

this implies that the mean self-information (per node of the network) of the XY system is equal to the sum of the mean self-informations (per node of the network) of X and Y. Hence, the properties of the system XY can be reduced to the properties of the component sub-systems X and Y. Or, alternatively, if we know the probabilities p(xi) and p(yj), one can predict the values of p(xi,yj). Put in other words, this means that reduction of XY to X and Y is possible if the discrete variables xi and yj are independent. If, alternatively,

<H(X,Y)> < <H(X)> + <H(Y)>    (11)

the mean self-information (per node of the network) of XY is smaller than the sum of the self-informations (per node of the network) of X and Y. Hence, reduction is mathematically impossible. Put in other words, this implies that when we associate the component sub-systems X and Y, the resulting system XY undergoes a decrease of information relative to its components, which measures its degree of integration. Last, if

<H(X,Y)> > <H(X)> + <H(Y)>    (12)

the self-information of the system XY becomes larger than that of its components X and Y. This means there is emergence of new information in the system and therefore its degree of integration is negative. Here again, reduction of the properties of the system XY to those of its components X and Y is impossible. In agreement with common sense, the system can be considered complex (Ricard, 1999, 2003). In any of the three cases considered above, the degree of integration of the system can be expressed by the so-called mutual information of integration defined as

I(X:Y) = <H(X)> + <H(Y)> − <H(X,Y)>    (13)

Indeed I(X:Y) is the counterpart, for biochemical networks, of the classical mutual information of communication theory. I(X:Y) is equal to zero if reduction is possible, and


different from zero otherwise. If I(X:Y) is positive, the system XY is integrated and has less mean information (per node of the network) than the set of its components X and Y. Last, if I(X:Y) is negative, the system is complex and has more information than its components. In agreement with common sense, complexity is defined by the existence of emergence. This situation can appear unexpected, for it violates the so-called sub-additivity principle (Shannon and Weaver, 1949) of classical communication theory, which implies that mutual information in a communication process cannot be negative. This is in fact no surprise, for the self-information used in this paper and Shannon entropies are different, which implies that mutual Shannon information and the mutual information of integration are different as well.

In order to illustrate these ideas of reduction, integration and emergence, let us consider the model networks of Figs. 1 and 2. The total “concentration”, or “density”, of protein, NT, is defined, for model 1A, as

NT = N0,0 (1 + K′x x)^n (1 + K′y y)^n    (13)

As already mentioned, K′x and K′y are the intrinsic binding constants of x and y to their specific sites. The probabilities p(xi) and p(yj) are

p(xi) = C(n,i) (K′x x)^i / (1 + K′x x)^n    and    p(yj) = C(n,j) (K′y y)^j / (1 + K′y y)^n    (14)

where C(n,i) denotes the binomial coefficient, and the joint probability p(xi,yj) assumes the form

p(xi, yj) = C(n,i) C(n,j) (K′x x)^i (K′y y)^j / [ (1 + K′x x)^n (1 + K′y y)^n ]    (15)
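The closed forms of Eqs. (14) and (15) can be cross-checked against the grid of Fig. 1A by direct summation over the nodes. A minimal sketch (the binding constants and concentrations are invented illustration values):

```python
import math

n, Kx, Ky, x, y = 5, 2.0, 3.0, 0.4, 0.25   # invented illustration values

# Node probabilities p(N_ij) of the two-class model, built from Eq. (15).
Z = (1 + Kx * x) ** n * (1 + Ky * y) ** n
pN = {(i, j): math.comb(n, i) * math.comb(n, j)
      * (Kx * x) ** i * (Ky * y) ** j / Z
      for i in range(n + 1) for j in range(n + 1)}

# Eq. (4): the marginal obtained by summation over the grid ...
i = 2
p_xi_sum = sum(pN[i, k] for k in range(n + 1))
# ... equals the closed binomial form of Eq. (14).
p_xi_closed = math.comb(n, i) * (Kx * x) ** i / (1 + Kx * x) ** n
```

The agreement is exact because summing Eq. (15) over j simply reproduces the binding polynomial (1 + K′y y)^n, which cancels against the denominator.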

Comparison of Eqs. (14) and (15) shows that

p(xi) p(yj) = p(xi, yj)    (16)

which implies that xi and yj are uncorrelated, or independent, and that the relation of Eq. (10) applies. Hence, in the case of this system, reduction of the properties of XY to those of X and Y is possible. Put in other words, this means that if we know the values of the probabilities p(xi) and p(yj), we can calculate the probabilities p(xi,yj).

In the case of the other model of protein network (Fig. 1B and Fig. 2B) the situation is completely different. The total “concentration” or “density” of nodes is

NT = N0,0 (1 + K′x x + K′y y)^n    (17)

The probabilities p(xi) and p(yj) are then

p(xi) = C(n,i) (K′x x)^i (1 + K′y y)^(n−i) / (1 + K′x x + K′y y)^n    (18)

and

p(yj) = C(n,j) (K′y y)^j (1 + K′x x)^(n−j) / (1 + K′x x + K′y y)^n    (19)

Moreover, the joint probability p(xi,yj) assumes the form

p(xi, yj) = C(n,j) C(n−j,i) (K′x x)^i (K′y y)^j / (1 + K′x x + K′y y)^n    (20)

Hence, this probability is different from the product p(xi)p(yj). In fact, x and y are not independent anymore, as they compete for the same sites. Hence, reduction is not possible and Eq. (10) does not apply. Depending on the values of x and y, <H(X,Y)> can be either smaller or larger than the sum of <H(X)> and <H(Y)>. If x and y are small, one can demonstrate that the mutual information of integration is negative, which implies that the self-information of the system XY is larger than the sum of the self-informations of X and Y. It is the converse that occurs if the values of x and y are large: the system XY is then integrated and its mutual information of integration is positive. Hence, it is striking to note that simple competition between two ligands is sufficient to generate either integration, or emergence, in a protein network.

The fact that reduction of the system XY to the sub-systems X and Y is, from a mathematical viewpoint, usually impossible or, put in other words, the fact that the properties of XY cannot be deduced, in most cases, from those of X and Y, has some general implications. Hence, in spite of the fact that the system studied is the simplest possible network under thermodynamic equilibrium conditions, it cannot usually be reduced. One can expect that this conclusion will be reinforced for more complex systems under nonequilibrium conditions.
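The contrast between the two models can be checked numerically: the joint probability of Eq. (15) factorizes into the marginals of Eq. (14) for the model of Fig. 1A, whereas the joint probability of Eq. (20) does not factorize into Eqs. (18) and (19). A sketch (binding constants and concentrations invented):

```python
import math

def model_A(n, Kx, Ky, x, y):
    # Fig. 1A: two independent classes of n sites each.
    Z = (1 + Kx * x) ** n * (1 + Ky * y) ** n
    return {(i, j): math.comb(n, i) * math.comb(n, j)
            * (Kx * x) ** i * (Ky * y) ** j / Z
            for i in range(n + 1) for j in range(n + 1)}

def model_B(n, Kx, Ky, x, y):
    # Fig. 1B: n sites for which the two ligands compete.
    Z = (1 + Kx * x + Ky * y) ** n
    return {(i, j): math.comb(n, i) * math.comb(n - i, j)
            * (Kx * x) ** i * (Ky * y) ** j / Z
            for i in range(n + 1) for j in range(n + 1 - i)}

def marginal_x(pN, n, i):
    # Eq. (4): sum over all y-occupancies k.
    return sum(pN.get((i, k), 0.0) for k in range(n + 1))

def marginal_y(pN, n, j):
    # Eq. (5): sum over all x-occupancies i.
    return sum(pN.get((i, j), 0.0) for i in range(n + 1))

n, Kx, Ky, x, y = 5, 2.0, 3.0, 1.0, 1.0   # invented illustration values
pA = model_A(n, Kx, Ky, x, y)
pB = model_B(n, Kx, Ky, x, y)

# Model A: p(x1) p(y1) = p(x1, y1), Eq. (16) — reduction is possible.
factorizes_A = math.isclose(marginal_x(pA, n, 1) * marginal_y(pA, n, 1), pA[1, 1])
# Model B: competition correlates the ligands and the product rule fails.
factorizes_B = math.isclose(marginal_x(pB, n, 1) * marginal_y(pB, n, 1), pB[1, 1])
```

The failure of the product rule for model B is purely combinatorial: a site occupied by x is no longer available to y, so the occupancies are negatively correlated whatever the values of the binding constants.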

4. Physical significance of the concepts of self-information and mutual information of integration

If one assumes that the probabilities of occurrence p(xi), p(yj) and p(xi,yj) are distributed according to Boltzmann statistics (Hill, 1987; Albert and Barabasi, 2002; Barabasi, 2002), one has

p(xi) = exp{ −(Ei − E0) / kBT }
p(yj) = exp{ −(Ej − E0) / kBT }
p(xi, yj) = exp{ −(Ei,j − E0) / kBT }    (21)


where Ei, Ej, Ei,j and E0 are the free energy levels of the states xi, yj, xiyj and N0,0, respectively. As usual, kB and T are the Boltzmann constant and the absolute temperature, respectively. Setting

e(xi) = Ei − E0
e(yj) = Ej − E0
e(xi, yj) = Ei,j − E0    (22)

Fig. 3. Random binding of two substrates Ai and Bi to an enzyme Ei. The system is assumed to be in pseudo-equilibrium. The two substrates bind to the enzyme with binding constants KA, K′A, KB and K′B. The products of the enzyme reaction are Ai+1 and Bi+1.

it follows that

<H(X)> = −<log2 p(x)> = 1.4426 <e(x)> / kBT
<H(Y)> = −<log2 p(y)> = 1.4426 <e(y)> / kBT
<H(X,Y)> = −<log2 p(x,y)> = 1.4426 <e(x,y)> / kBT    (23)

Here, <e(x)>, <e(y)> and <e(x,y)> are the mean reduced energy levels of the systems X, Y and XY, respectively. From the relationships of Eq. (23) it follows that the mean self-informations of the systems X, Y and XY are proportional to the corresponding mean reduced energy levels of these systems. The same reasoning allows one to show that

I(X:Y) = 1.4426 ( <e(x)> + <e(y)> − <e(x,y)> ) / kBT    (24)

which implies that emergence will occur if

<e(x,y)> > <e(x)> + <e(y)>    (25)

This situation is expected to take place if part of the kinetic energy involved in the collision of x and y with the protein is used to increase the energy level of the protein states that have bound both x and y.

5. Emergence in enzyme systems

The above ideas can be applied to enzyme systems. Let us consider an enzyme reaction (Ricard, 1999) in a metabolic sequence. The enzyme Ei is assumed to bind randomly two substrates Ai and Bi, which are then converted into the products Ai+1 and Bi+1. Moreover, it is assumed, as in many enzyme processes, that the rate constants of substrate binding and release are large relative to the rate constant of catalysis. Under these conditions, the system is in quasi-equilibrium (Ricard, 1973). This is precisely why equilibrium binding constants appear in Eq. (26) below and in Fig. 3. The products (or one of the products) formed during the reaction are (or is) taken up as substrate(s) by the next enzyme Ei+1 of the metabolic sequence. One can in fact raise two questions. First, what are the thermodynamic conditions that allow emergence, or strong enhancement, of catalysis out of the trivial binding properties of the substrates to the enzyme? Second, is there an “effect of context” of the metabolic sequence on the properties of the reaction catalyzed by the enzyme Ei or, put in other words, are the properties of enzyme Ei the same whether or not it is included in the metabolic sequence?

If, in a sequence of enzyme reactions, there exists a time hierarchy of the events that take place in this sequence, such that substrate binding and release are fast events relative to the rate of catalysis and to the transport of the reaction products to the next enzyme (Fig. 4), one can picture each enzyme reaction as a node of the network. Each node being itself a network, the overall metabolic process can be considered a set of connected networks. Each node i, which is itself a network in quasi-equilibrium, possesses its own mutual information of integration (Fig. 4).

Fig. 4. Two enzyme reactions in a metabolic sequence. The enzymes are Ei and Ei+1. It is assumed that the processes of substrate binding and release are fast relative to the rate of transport of product Ai+1 from enzyme Ei to enzyme Ei+1. For simplicity, the catalytic step, which is responsible for the conversion of Ai into Ai+1, is not shown in the Figure. Chemicals Bi and Bi+1 are assumed to be taken up from the medium. Each of the enzyme reactions can be considered a node, Yi or Yi+1, of the metabolic network (lower part of the Figure), so that the two enzyme processes appear as two nodes of the network.
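Under the quasi-equilibrium assumption, the probabilities of occurrence of the species EiAi, EiBi and EiAiBi follow directly from the binding constants of Fig. 3, and the mutual information of integration of the isolated enzyme can be evaluated from them and compared with the closed form of Eq. (26) below. A minimal sketch (all numerical values are invented, chosen so that KA > K′A and KB > K′B while the thermodynamic box KA K′B = KB K′A closes):

```python
import math

# Equilibrium binding constants (invented); the "p" suffix stands for the
# primed constants of Fig. 3, and detailed balance requires KA*KBp == KB*KAp.
KA, KAp = 10.0, 2.0          # KA > K'A
KB = 5.0
KBp = KB * KAp / KA          # thermodynamic box => K'B = 1.0 < KB
A, B = 1.0, 1.0              # substrate concentrations (invented)

Z = 1 + KA * A + KB * B + KA * KBp * A * B   # binding polynomial
pEA  = KA * A / Z            # enzyme with A bound only
pEB  = KB * B / Z            # enzyme with B bound only
pEAB = KA * KBp * A * B / Z  # ternary complex

pA = pEA + pEAB              # A bound, whether or not B is
pB = pEB + pEAB              # B bound, whether or not A is

# Mutual information of integration of the isolated enzyme,
# computed directly from the probabilities of occurrence ...
I_Ei = math.log2(pEAB / (pA * pB))
# ... and from the closed-form expression of Eq. (26).
I_closed = math.log2((KBp / KB)
                     * (1 + KA * A + KB * B + KA * KBp * A * B)
                     / (1 + KAp * A + KBp * B + KAp * KBp * A * B))

# Effect of context, Eq. (27): within a metabolic network in which this
# node occurs with probability p(Yi) < 1 (invented value), the mutual
# information of integration is increased by -log2 p(Yi).
p_Yi = 0.25
I_net = I_Ei - math.log2(p_Yi)
```

With these constants I(Ai:Bi) is negative, i.e. self-information emerges, and the context term −log2 p(Yi) raises the mutual information of integration of the same enzyme embedded in the sequence.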


Fig. 5. Thermodynamic conditions that generate emergence of self-information in an enzyme reaction. The conditions shown in the Figure are consistent with KA > K′A and KB > K′B. Then the height of the energy barrier between the states EiAiBi and Ei[Ai...Bi]≠ is small (see main text).

The expression of the mutual information of integration of the enzyme Ei considered in isolation is found to be (Ricard, 2003)

I(Ai : Bi) = log2 [ (K′B / KB) (1 + KA Ai + KB Bi + KA K′B Ai Bi) / (1 + K′A Ai + K′B Bi + K′A K′B Ai Bi) ]    (26)

KA, K′A, KB and K′B are the equilibrium binding constants of the substrates Ai and Bi to the enzyme (see Fig. 3). One can show that I(Ai:Bi) < 0 if KA > K′A and KB > K′B. The thermodynamic implication of these inequalities is that the energy level of the ternary complex EiAiBi is higher than those of the binary complexes EiAi and EiBi (Fig. 5). Catalysis takes place at a high rate if the height of the free energy barrier between EiAiBi and the corresponding transition state is as small as possible. The expression of Eq. (26) involves the energy level of the ternary complex EiAiBi but tells us nothing about the energy level of the transition state, which is largely defined by the chemical nature of the reaction. Under these conditions catalysis emerges out of the trivial binding properties of Ai and Bi to the enzyme Ei. It is striking to note that this is precisely the condition required for having I(Ai:Bi)Ei < 0, i.e. the condition for emergence of self-information that generates emergence of catalysis. Now, the expression of I(Ai:Bi)Ei is obtained from the probabilities of occurrence of the complexes EiAi, EiBi and EiAiBi. These probabilities of occurrence are computed from the network involving Ei only, i.e. from the ith node of the overall network. If we wish to calculate the mutual information of integration of enzyme Ei within the network, we have to take into account the probabilities of occurrence of the same complexes EiAi, EiBi and EiAiBi computed over all the possible enzyme states of the entire network. The corresponding probabilities will indeed be much smaller than in the previous case. One can demonstrate that the mutual information of integration of the enzyme Ei within the network, I(Ai:Bi)N, is related to the mutual information of integration of the free enzyme, I(Ai:Bi)Ei, through the relationship

I(Ai : Bi)N = I(Ai : Bi)Ei − log2 p(Yi)    (27)

where p(Yi) is the probability of occurrence of the ith node (i.e. the ith enzyme reaction) in the overall network. This relationship means that the local information of a node (i.e. of a definite enzyme reaction in the metabolic sequence) is larger if the probability of occurrence of this node is smaller. We find here a classical situation in information theory, where the information content of an event is larger when its probability of occurrence is smaller. Hence, −log2 p(Yi) represents the effect of context on the mutual information of integration of enzyme Ei. As p(Yi) depends upon the topology of the network, −log2 p(Yi) expresses how this topology affects the mutual information of integration of enzyme Ei. As p(Yi) < 1, the effect of context tends to increase the mutual information of integration of enzyme Ei within a metabolic sequence.

6. Discussion

A usual approach to many problems of molecular cell biology consists in decomposing a biological system into its components and studying these components in the hope of understanding the functional organization of the initial system. This hope is, however, rarely fulfilled. Application of some of the ideas of Shannon communication theory to simple protein networks under equilibrium conditions allows one to describe these networks as an association, according to definite rules, of two systems X and Y. One can define functions, reminiscent of but different from Shannon entropies, that express the self-information content of these systems as well as of their association XY. Reduction of the system XY to its components X and Y is possible if, and only if, the self-information of XY is equal to the sum of the self-informations of X and Y. Quite often, the self-information of XY is smaller than the sum of the self-informations of X and Y, and the difference between them expresses the degree of integration of the system XY (Tononi et al., 1994; Ricard, 2003).

The measure of this degree of integration is effected through a function called the mutual information of integration of the system. In the case of an integrated system, this function adopts positive values, but it can also take negative ones. This means that, upon associating X and Y, the self-information of XY has increased with respect to the sum of the self-informations of X and Y. Hence, owing to the interactions between X and Y, there is emergence of information in the XY system. The increase of information is in fact associated with an increase of the energy level of several nodes of the network. Emergence of information and energy occurs in the system if the mean energy level of XY is larger than the sum of the mean energy levels of X and Y. This theory tends to consider biochemical networks as integrated wholes. This is precisely what experimental studies of large multiprotein complexes have shown (Bray and Duke, 2004).


These ideas can be applied to enzyme reactions. If an enzyme binds two substrates randomly, it can also possess a mutual information of integration that can be positive, negative or nil. If the mutual information of integration is negative, this means that the height of the energy barrier that separates the ground state from the transition state of the enzyme–substrates complex is small, thus allowing catalysis of the chemical reaction between the two substrates to take place. Hence, emergence of self-information is associated with emergence of catalysis. Moreover, if the enzyme reaction is inserted in a metabolic sequence, its intrinsic mutual information of integration is enhanced through an effect of context. Most of the properties discussed above occur in simple networks under thermodynamic equilibrium conditions. Owing to the highly nonlinear character of the model systems studied, one can expect these systems to display complex dynamics, such as periodic or aperiodic oscillations, were they to operate under nonequilibrium conditions. An even more general idea that can be drawn from the above results is that biochemical networks contain information. But contrary to the information contained in the genome, the information of biochemical and metabolic networks is adaptive, i.e. it can vary in response to external and internal signals.

References

Adami, C., 1998. Introduction to Artificial Life. Springer-Verlag Telos, New York.
Albert, R., Barabasi, A.L., 2002. Statistical mechanics of complex networks. Rev. Modern Physics 74, 47–97.


Barabasi, A.L., 2002. Linked: The New Science of Networks. Perseus Publishing Co, New York.
Bray, D., Duke, T., 2004. Conformational spread: the propagation of allosteric states in large multiprotein complexes. Annu. Rev. Biophys. Biomol. Struct. 33, 53–73.
Callager, R.G., 1964. Information theory. In: Margenau, H., Murphy, H. (Eds.), The Mathematics of Physics and Chemistry, vol. 2. Van Nostrand, New York, pp. 190–248.
Cover, T.M., Thomas, J.A., 1991. Elements of Information Theory. Wiley, New York.
Descartes, R., 1628. Règles pour la Direction de l’Esprit. Troisième édition. Traduction et notes par J. Sirven. Vrin, Paris (1959, réédition).
Descartes, R., 1637. Discours de la Méthode. Flammarion, Paris (1992, réédition).
Gallagher, R., Appenzeller, T., 1999. Beyond reductionism. Science 284, 79.
Hatwell, L.H., Hopfield, J.J., Leibler, S., Murray, A.W., 1999. From molecular to modular cell biology. Nature 402 (Suppl), C47–C52.
Hill, T.L., 1987. Statistical Mechanics. Principles and Applications. Dover Publications Inc, New York (second printing).
Jeong, H., Tombor, B., Albert, R., Oltavi, Z.N., Barabasi, A.L., 2000. The large-scale organization of metabolic networks. Nature 407, 651–654.
Kullback, S., 1959. Information Theory and Statistics. Wiley, New York.
Ricard, J., 1973. Cinétique et Mécanismes d’action des Enzymes. Doin, Paris.
Ricard, J., 1999. Biological Complexity and the Dynamics of Life Processes. Elsevier, Amsterdam.
Ricard, J., 2003. What do we mean by biological complexity? C.R. Biologies 326, 133–140.
Shannon, C.E., 1948. A mathematical theory of communication. Bell System Tech. J. 27, 379–423, 623–656.
Shannon, C.E., Weaver, W., 1949. The Mathematical Theory of Communication. University of Illinois Press, Urbana, IL.
Strogatz, S.H., 2001. Exploring complex networks. Nature 410, 268–276.
Tononi, G., Sporns, O., Edelman, G.M., 1994. A measure of brain complexity: relating functional segregation and integration in the nervous system. Proc. Natl. Acad. Sci. USA 91, 5033–5037.
Yokey, H.P., 1992. Information Theory and Molecular Biology. Cambridge University Press, Cambridge.