BioSystems 36 (1995) 7-17
Computational properties of self-reproducing growing automata

Rok Sosič*,a, Robert R. Johnsonb

aSchool of Computing and Information Technology, Griffith University, Nathan, QLD 4111, Australia
bDepartment of Computer Science, University of Utah, Salt Lake City, UT 84112, USA

Received 15 August 1994; revision received; accepted 20 December 1994

* Corresponding author. E-mail: [email protected].
Living organisms perform much better than computers at solving complex, irregular computational tasks, such as search and adaptation. Key features of living organisms, identified in the paper as a basis for their success in solving complex problems, are: self-reproduction of cells, a flexible framework, and modification. These key features of living organisms are abstracted into a computational model, called growing automata. Growing automata are suited for extremely large computational problems, such as search problems. Growing automata are representatives of soft machines. Soft machines can change their physical structure, as opposed to hard machines, which have a fixed structure. An example of a soft machine is a living organism; an example of a hard machine is an electronic computer. The computational properties of soft and hard machines are analyzed and compared. An analysis of growing automata demonstrates their advantages, as well as their limitations, as compared to hard machines.

Keywords: Computational models; Kinematic automata; Soft machines; Hard machines; Growing automata; Self-reproduction
1. Introduction
Von Neumann (1966) studied computational structures, the organization of systems of computing elements, as a basis for the design of effective computers. He set foundations for kinematic automata, cellular automata, and self-reproducing automata. The model of kinematic automata is concerned with kinematic and geometric properties of computational elements. The development of
kinematic automata has been largely overshadowed by cellular automata, where the position and the functionality of computing elements are fixed. An overview of results in kinematic automata theory is presented by Laing (1979). A survey of results in early cellular automata theory has been compiled by Burks (1970). Automata whose computing elements change over time, growing automata, were introduced by Burks (1961). His discussion focuses on growing automata with a fixed framework of computational elements. In a fixed framework, elements do not change their positions, but they can be modified. As a result, the connectivity pattern
between computational elements is fixed. Burks mentions growing automata with a flexible framework, which permits the insertion of new elements between old elements, but the flexible framework is not investigated in any detail.

This paper adds the capabilities of a flexible framework and of self-reproduction to the model of growing automata. Each growing automaton consists of many computational elements. The elements are organized in a growing automaton in a way that provides three fundamental properties: self-reproduction, a flexible framework, and modification. Self-reproduction means that computational elements are capable of producing new elements. A flexible framework allows the insertion of new elements between old elements. With a flexible framework, positions of elements are not fixed, but can change during execution. Modification enables the change of elements, which provides a reuse of resources. Since positions of elements are important, an essential part of the description of a growing automaton is its shape in space. This shape develops by the creation, deletion and motion of computational elements.

One consequence of self-reproduction is the possibility of a limited exponential growth in the number of computational elements. Another well-established formalism capable of exponential growth is the model of L-systems, introduced by Lindenmayer (1968). An extensive bibliography is published by Langton (1989). L-systems are based on the parallel transformation of strings, while growing automata are based on the parallel transformation of computations. Extensions of L-systems to two and three dimensions are called map L-systems and cellworks. L-systems provide a good formalism for describing the topology of a growing structure, but the shape of an L-system in a space is added by an independent description and is not directly specified by the transformation rules. In contrast, the shape is an inherent property of a growing automaton.

This paper introduces self-reproducing growing automata and discusses their computational properties. For very large problems, growing automata might be a more effective computational model than conventional models. Section 2 provides basic definitions and examples.
Section 3 discusses self-reproduction, flexible framework, and modification as sources of computational efficiency. A hypothetical comparison between living organisms and electronic computers, which outlines some trade-offs with growing automata, is presented in Section 4. Section 5 concludes the paper.
2. Introduction to growing automata

2.1. Definition

Growing automata belong to the kinematic automata model as defined by von Neumann (1966). A growing automaton is an automaton that can change its own structure. It consists of elementary computational elements. A computational element includes a structure to carry out the computation and a program that guides the behavior of this element. Elements have the ability to communicate with other elements and to interact with the environment. There is no global synchronization between computational elements. Each computational element of a growing automaton could be, in principle, a Turing machine (Turing, 1937); therefore the expressive power of growing automata is at least equal to that of electronic computers. A specific feature of growing automata is that an element of a growing automaton can move itself, or create, delete, change, and move other elements in its neighborhood. As a new element is created, the program, possibly modified, is copied from the parent to the child. Growing automata could be represented by computational elements floating in a fluid and moving around freely. The fluid is filled with building blocks. These building blocks provide the constituents of computational elements and form the basic level of implementation for growing automata. This is one of the representations suggested for kinematic automata by von Neumann (1966). An example, presented next, illustrates the concept of growing automata.

2.2. Breadth first search with growing automata
Fig. 1. Generation of new computational elements in breadth first search.
Breadth first search is a computational technique of traversing a search tree by which all branches of a node are visited simultaneously. Although breadth first search seems well suited for parallel computations, the distribution of processes on a parallel electronic computer could be a problem. Since the number of processes can increase exponentially, but the topology of computers is fixed, processors may not be able to distribute processes as fast as they are created. This increase in the number of search processes often leads to throttling, in which processors are overloaded with search processes and no progress is made in the search. In a growing automaton, the problem could be solved by generating new computational elements as they are required by the search. This generation of new elements is shown in Fig. 1. At the first step, there is only one computational element. This element represents the root node of the search tree. During the second step, this root node produces new computational elements which carry out the search process at the second level of the tree. In this example, there are three computational elements at the second level. During the third step, the computational elements at the second level produce six new computational elements which perform the search at the third level. This process of creating new computational elements continues until an answer is found. The number of steps necessary to find the answer is proportional to the depth of the search tree.
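As an informal illustration of this process, the following Python sketch simulates level-by-level expansion for an explicitly given search tree; the example tree, the goal node and all names are assumptions made for the sketch, not part of the model.

```python
# A minimal sketch (all names and the example tree are illustrative):
# each active element holds one node of the search tree and, at every
# step, reproduces by creating one child element per branch, so one
# whole level of the tree is expanded per step.

# Hypothetical search tree: node -> list of child nodes.
TREE = {
    "root": ["a", "b", "c"],
    "a": ["a1", "a2"], "b": ["b1"], "c": ["c1", "c2", "c3"],
    "a1": [], "a2": [], "b1": [], "c1": [], "c2": [], "c3": [],
}
GOAL = "c2"  # assumed goal node

def grow_and_search(tree, root, goal):
    active = [root]          # first step: a single computational element
    step = 1
    while active:
        if goal in active:   # some element has reached the goal
            return step
        # every element reproduces: one new element per branch of its node
        active = [child for node in active for child in tree[node]]
        step += 1
    return None

print(grow_and_search(TREE, "root", GOAL))  # number of steps ~ depth of the tree
```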
On computers that are not capable of generating new computational elements, the cost of breadth first search depends exponentially on the depth of the search tree, since the search tree must be traversed by a fixed number of processors. By using self-reproducing computational elements, growing automata can reduce the time complexity of a problem. In the breadth first search example, growing automata reduce an exponential time complexity to a linear complexity by traversing all branches at one level in a single step. Due to their ability to produce new computational elements, growing automata can grow exponentially. This exponential growth has ultimate limits in the speed of light. Physical space increases only at a polynomial rate of n^3, which is not sufficient to accommodate an exponentially increasing growing automaton. The exponential expansion in the number of elements is thus achieved by exponentially increasing the speed of elements and the displacement between them. Since the motion of elements is limited by the speed of light, this speed represents the limit on the exponential growth. This example demonstrates that a growing automaton cannot be separated from its physical shape and the space it occupies. The next section gives a discussion of some possible spaces for growing automata.

2.3. The shape of growing automata

Most conventional computational models assume abstract, mathematical spaces, without any consideration of the positions of computational elements. An exception is cellular automata, which divide the space into regular partitions. In growing automata, space and the shape of an automaton are inseparable from the automaton itself. Growing automata cannot be studied separately from the space they occupy. The shape of a growing automaton is constrained by the topology of the underlying space. The actual shape emerges during the execution. The most common spaces for growing automata are subsets of the three-dimensional Euclidean space, which is for most purposes our physical space. Some examples of space topologies are one-, two-, and three-dimensional Euclidean spaces, circles, or disks with holes. A line is one of the simplest spaces for growing automata.
An infinite number of topologically different spaces can be derived from the line. Some examples include a half line, a circle, a star, and their combinations. Each element on the line can influence its left or right neighbor. The restriction to two immediate neighbors is a very limiting property of the line space, but nevertheless some useful computations can be made. Examples of computations that can be implemented on a line are divide and conquer algorithms which do not require a large amount of communication between individual computational elements. The connectivity on a line is sufficient when no communication over distance is needed. Although an arbitrary communication path can be simulated on the line by providing each element with message sending and routing capabilities, it is often more effective to embed the automaton in a higher dimensional space. A plane provides more space for connectivity than a line. For example, a tree can be embedded in a plane but not in a line. A family of topologically different spaces can be derived from a plane. Examples are a sphere, a torus, a Möbius strip, and their combinations. Some graphs cannot be embedded in a plane without crossing edges (Harary, 1969). This is a limitation of planar spaces. Three-dimensional Euclidean (3-D) space is the lowest dimensional space that can embed an arbitrary graph. The connectivity between nodes can be arbitrary, but communication distances are still limited by the 3-D space. Three-dimensional space is, for all practical purposes, also the space of our reality. It is the largest space that we can hope to provide for genuine implementations of growing automata.

2.4. Self-reproduction, flexible framework, and modification

This section provides an informal discussion of the basic properties of growing automata: self-reproduction, a flexible framework, and modification. More formal models are presented in the next section. Self-reproduction. Self-reproduction is the ability of computational elements to produce new elements. This ability is crucial in providing exponential growth in the number of computational elements.
The relevance of exponential growth is demonstrated in the breadth first search example above. As compared to more traditional models, exponential growth enables the reduction of the time complexity of some algorithms, such as search. In general, self-reproduction can be applied whenever a fast distribution of information to a large number of computational elements is necessary. This fast distribution can be interpreted as an efficient use of time. Flexible framework. An automaton has a flexible framework if the positions of computational elements are not fixed. As new elements are created, old elements are moved around to accommodate these new elements in the structure. The definition of growing automata does not specify how computational elements communicate. As elements are spread apart, they can cease to communicate or they can keep their connections. An example of the first is the breadth first search example: after its creation, each element in the automaton is independent of other elements. An example of a flexible framework where a complex pattern of connections between elements is retained is the development of the human nervous system. A flexible framework is of vital importance, since it enables growing automata to adapt themselves efficiently to complex problems. It is shown in the next section that the growth in the size of the automaton is prohibitive without an underlying flexible framework. Modification. Modification enables transformations of elementary computational elements. A transformation of an element involves a partial or complete breakup of the element into basic building blocks and the construction of a new element. Modification provides a reuse of basic building blocks and of the space occupied by the automaton. For example, if an automaton requires large memory and a few processors in one time interval and small memory and many processors in another time interval, then modification enables the conversion of memory into processors. This saves the required number of building blocks and space. Self-reproduction, a flexible framework, and modification enable a growing automaton to be efficient in using the available resources, which include time, space and building blocks.
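These three properties can be phrased, very loosely, as operations on a collection of elements. The short Python sketch below is one such phrasing; the Element class, its method names and the list-based one-dimensional space are illustrative assumptions only, not part of the model.

```python
# Illustrative sketch only: a one-dimensional "space" is modelled as a
# Python list of elements; each element carries a program (here just a
# label).  The three properties appear as list operations: insertion of a
# child next to its parent (self-reproduction in a flexible framework,
# since old elements shift to make room) and relabelling in place
# (modification, reusing the same slot for a different task).

class Element:
    def __init__(self, program):
        self.program = program                    # behaviour carried by the element

    def reproduce(self, modified_program=None):
        # self-reproduction: the child gets a (possibly modified) copy of the program
        return Element(modified_program if modified_program else self.program)

space = [Element("search"), Element("search")]    # flexible framework: a list, not a grid

# self-reproduction: a parent inserts its child directly beside itself
space.insert(1, space[0].reproduce())

# modification: an existing element is rebuilt for a different task
space[-1].program = "memory"

print([e.program for e in space])                 # ['search', 'search', 'memory']
```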
3. Computational efficiency of hard and soft machines
In this section, hard and soft machines are introduced. It is shown that self-reproduction, a flexible framework, and modification are important for the efficiency of soft machines. The discussion concentrates on computational aspects of machines in using basic resources, including space and time.

3.1. Hard and soft machines

Hard machines are defined as machines that have a fixed structure. Soft machines are machines that can change their structure. Most artificial devices are hard machines. Examples of hard machines include electronic computers. Examples of soft machines are living organisms. A living organism changes its structure in time. Growing automata abstract the computational capabilities of soft machines.

3.2. Self-reproduction and exponential growth

In growing automata, elements can produce new elements. This enables, within certain bounds, exponential growth in computational power. The potential of exponential growth and self-reproduction is discussed by Cliff et al. (1980) and Moravec (1988). They show that, given enough time, phenomena with the ability of reproduction can expand much faster than phenomena without this ability. Two computational aspects of self-reproduction are important: the number of active elements and the speed of communication. Using self-reproduction, soft machines can exponentially increase the number of active elements while the speed of communication between elements remains constant. An exponential increase in computational power and a constant speed of communication are mutually exclusive in hard machines. The breadth first search example is used to illustrate the concept of active elements. At the beginning, the root element contains all the information that is needed to perform the search. This information, slightly modified at each level, must be sent to elements on lower levels of the tree.
Elements that have already received the information are called active elements. The search process can be sped up by increasing the number of active elements as fast as possible. With self-reproduction in soft machines, the number of active elements can grow exponentially. This number grows only polynomially for hard machines. The argument is formalized in a comparison between growing automata, as representatives of soft machines, and cellular automata, as representatives of hard machines. Cellular automata are a model for massively parallel computations. They consist of cells on a uniform grid. Cells have the same functionality. Since the positions of cells in cellular automata are fixed, new cells cannot be created. At each step, a cell in a cellular automaton can communicate with its immediate neighbors (von Neumann, 1966). Assuming a three-dimensional square grid of cells and the Moore neighborhood (Toffoli and Margolus, 1987), which includes the neighbors on parallels and diagonals, the number of active elements in n steps, CA_n, is at most:

CA_n = (2n + 1)^3.

The value of CA_n increases polynomially with increasing n. Assuming that each element of a growing automaton can produce two new elements at each step, the number of active elements for a growing automaton at step n, GA_n, is calculated as:

GA_n = 2^n - 1.

The value of GA_n increases exponentially with increasing n. A comparison between CA_n and GA_n reveals that self-reproduction in GA_n offers an enormous increase in the number of active elements over CA_n, which does not have the ability to self-reproduce (see Table 1). This advantage becomes larger as the problem size increases.
Table 1
The number of active elements at step n in: cellular automata without self-reproduction, CA(n); and growing automata with self-reproduction, GA(n)

n        10        20        30        40        50        60
CA(n)    ≈4×10^2   ≈2×10^3   ≈4×10^3   ≈6×10^3   ≈10^4     ≈10^4
GA(n)    ≈10^3     ≈10^6     ≈10^9     ≈10^12    ≈10^15    ≈10^18
A direct comparison between CA_n and GA_n does not take into account that one step of growing automata is significantly more expensive than a step of cellular automata. A step in growing automata involves production of new computational elements, while a step in cellular automata involves only communication. These differences in step complexity are accounted for in the following comparison, which compares the speed of information transfer between living organisms and electronic computers. The comparison correlates cell reproduction with information transfer. In living organisms, the reproduction of small cells takes around 1000 s and transfers at least 10^7 bits of information encoded in DNA. Therefore, 1000 s can be taken as sufficient time for a growing automaton to perform one step, with 10^7 bits as the amount of information to be transferred. Assuming that cellular automata can communicate at the rate of 10^8 bits per second, it takes them 10^-1 s to communicate 10^7 bits. Therefore, cellular automata can perform ten steps in 1 s. Based on CA_n and GA_n, the number of active elements at time t can be calculated for cellular automata, CA(t), and for growing automata, GA(t), as follows:
CA(t) = (2 × (10 × t) + 1)^3,
GA(t) = 2^(t/1000) - 1.

Table 2 shows GA(t) and CA(t). The number of active elements in GA(t) exceeds the number of active elements in CA(t) between 10^4 s and 10^5 s.
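These two expressions can be evaluated directly. The sketch below scans for the first second at which GA(t) overtakes CA(t); the exact second it prints depends on rounding conventions and may differ slightly from the value quoted with Table 2, but it falls in the stated range between 10^4 s and 10^5 s.

```python
# Sketch: evaluate CA(t) and GA(t) as defined above and locate the
# crossover point, assuming one cellular-automaton step per 0.1 s and
# one growing-automaton step (a doubling) per 1000 s.

def ca(t):                      # active elements of the cellular automaton
    return (2 * (10 * t) + 1) ** 3

def ga(t):                      # active elements of the growing automaton
    return 2 ** (t / 1000.0) - 1

t = 1
while ga(t) < ca(t):
    t += 1
print(t, ca(t), ga(t))          # crossover lies between 10**4 s and 10**5 s
```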
After this crossover point, GA(t) rapidly surpasses CA(t). It follows that GA(t) might be more powerful for computations requiring more than 10^18 operations. Complex problems, such as search, typically require more than 10^18 operations. Growing automata might not be effective for problems significantly smaller than 10^18 operations, because the overhead of reproduction is too high. Besides exponential growth in active elements, an important consequence of self-reproduction is the constant speed of communication between elements. The breadth first search is used as an example again. To simplify the presentation, it is assumed that the breadth first search is executed on a line and that each element creates exactly two new elements. This simplification does not influence the conclusions. The first four computational steps are shown in Fig. 2. Each line of squares represents the automaton at a different step in time. Newly created elements are shown in white, the elements from the previous step are shown in black. Since the information is passed from old elements to new, new elements are created as neighbors of old elements.
Table 2
The number of active elements at time t in: cellular automata without self-reproduction, CA(t); and growing automata with self-reproduction, GA(t)

time     1 s      10 s     10^2 s    10^3 s    10^4 s     10^5 s
CA(t)    ≈10^4    ≈10^7    ≈10^10    ≈10^13    ≈10^16     ≈10^19
GA(t)    1        1        1         2         ≈2×10^3    ≈2×10^30

The crossover point is at 59 552 s: GA(59 552) = CA(59 552) = 1.69 × 10^18.
Fig. 2. Exponentially growing automaton on a line.
It follows that communication between old and new elements is local and over a fixed distance, regardless of the number of elements. This is possible because new elements are constructed as needed and because the positions of old elements are not fixed. An exponentially expanding size of the automaton is achieved without a corresponding increase in the speed of communication. If an automaton needs to expand in order to employ massively parallel computation, such as during search, self-reproduction of computational elements offers the fastest way to generate new elements. Since self-reproduction enables a quick employment of a large number of elements, it provides efficient use of time.
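A small sketch of this behaviour is given below; representing the line as a Python list and inserting the two children directly beside their parent are illustrative choices.

```python
# Sketch: an exponentially growing automaton on a line.  At every step each
# element creates two new elements, which are inserted immediately beside it,
# so the information transfer from a parent to its children always happens
# over a fixed, local distance, regardless of the total size of the automaton.

line = [0]                                   # one element, labelled by creation step
for step in range(1, 5):
    new_line = []
    for element in line:
        new_line.append(step)                # first child, just before the parent
        new_line.append(element)             # the old element keeps its relative place
        new_line.append(step)                # second child, just after the parent
    line = new_line

print(len(line))                             # 3**4 = 81 elements after four steps
print(line[:9])                              # children always adjacent to their parents
```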
3.3. Flexible framework

Burks (1961) mentions growing automata where new elements can be created between old elements, but this direction of research has not been pursued any further. Burks discusses growing automata with a fixed framework, where the positions of elements are fixed. Hard machines have a fixed framework. Automata with a flexible framework are defined as automata that can move computational elements apart and create new elements in the space in between. A flexible framework is supported in soft machines. The importance of a flexible framework increases with the complexity of problems. For simple problems, efficient machines can be designed with a fixed structure. But in complex problems, an example of which is adaptation or search, new requirements are imposed on machines. If computational efficiency is to be retained for complex problems, then the structure of the automaton must be updated to accommodate new requirements. Updates are achieved by changing existing computational elements or by adding new elements. It is shown below that when new elements must be dynamically added to the structure, automata with a flexible framework can evolve in less space than automata with a fixed framework. The space chosen for the comparison is the plane. The following assumptions are made: an automaton is represented as a circular disk (see Fig. 3(a)); each new computational element is added on the connection between two elements that are already connected; and the area of the disk is increased in proportion to the length of the new connections. Since the size of connections grows at least as fast as the size of computational elements, the size of elements can be ignored in the model without significantly influencing the results. Informally, since new elements in a fixed framework must be added on the border of the automaton, the increase in size depends on the total size of the automaton. In a flexible framework, elements can be added in the middle of the automaton and the increase in size is only constant. The result is much faster growth in the size of connections in a fixed framework.

Fig. 3. The addition of a new element to the automaton. (a) original computational element; (b) new element in a fixed framework; (c) new element in a flexible framework.

A more formal argument goes as follows. An automaton with a fixed framework can provide only a limited number of empty slots inside the disk. Once these are occupied, new elements must be added on the border of the disk (see Fig. 3(b)). The length of new connections is thus dependent on the size of the disk. Let FX(n) denote the area of an automaton with a fixed framework at step n. The area FX(n) of the disk can be computed recursively as:

FX(0) = a,
FX(n + 1) = FX(n) + b·r(n),     (1)
where FX(n) is the area at step n, r(n) is the disk radius at step n, the constant a is the area at step 0, and b relates the disk radius to the average length of new connections. Since the distance between an average point inside the disk and the disk border depends only on the size of the disk, b is a constant. With the substitutions r(n) = √(FX(n)/π) and c = b/√π, Eq. 1 is transformed to:

FX(n + 1) = FX(n) + c·√(FX(n)).
The solution to this recurrence equation for FX is:

FX(n) = 0.25c^2·n^2 + O(n).

The term O(n) denotes a part of the solution that depends linearly on n. It can be seen that the area of an automaton with a fixed framework increases with the square of the number of elements. A flexible framework permits an infinite expansion of the disk interior, because elements can be moved apart to make more space. Since the connection between two old elements already exists, the addition of a new element increases the length of this connection. The increase in the length of the existing connection is constant and independent of the disk size (see Fig. 3(c)). Let FL(n) denote the area of an automaton with a flexible framework at step n. The equations for the size of the automaton are:

FL(0) = a,
FL(n + 1) = FL(n) + d,     (2)

where FL(n) is the area at step n, a is the area at step 0, and d is the constant increase in size at each step. The solution to the recurrence equation for FL(n) is:

FL(n) = a + d·n.

The area of an automaton with a flexible framework increases only linearly with the number of elements. The conclusion is that an automaton with a flexible framework FL(n) grows linearly with the number of elements n, while an automaton with a fixed framework FX(n) grows quadratically in n.
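The two recurrences can be iterated directly. The sketch below does so for illustrative constants a, b (and hence c) and d, and shows the roughly quadratic growth of FX(n) against the linear growth of FL(n).

```python
import math

# Sketch: iterate the two recurrences with illustrative constants.
# Fixed framework:    FX(n+1) = FX(n) + c*sqrt(FX(n)),  with c = b/sqrt(pi)
# Flexible framework: FL(n+1) = FL(n) + d
a, b, d = 1.0, 1.0, 1.0
c = b / math.sqrt(math.pi)

steps = 10_000
fx = fl = a
for _ in range(steps):
    fx += c * math.sqrt(fx)   # new element on the border: cost grows with the radius
    fl += d                   # new element in the interior: constant cost

print(fx)   # roughly 0.25 * c**2 * steps**2, i.e. quadratic in the number of elements
print(fl)   # exactly a + d * steps, i.e. linear in the number of elements
```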
A human brain grows from a single cell or from a small set of cells. Until the brain stops growing, new elements, neurons, are created and added to already existing structures. Assume that an estimate for the number of neurons in a human brain, roughly 10^11 (Moravec, 1988), is taken as the number of elements, n, in a growing automaton. For 10^11 computational elements, the ratio in space utilization in the plane between a flexible and a fixed framework is 10^11. Calculating the size of an automaton that grows from a single element to 10^11 elements, the difference between a flexible framework and a fixed framework is the difference between a square with a side of 1 m and a square with a side of √(10^11) m ≈ 316 km. The comparison of the sizes shows that large scale evolving automata are possible only within a flexible framework. Existing computers have a fixed framework. In contrast to computers, living organisms have a flexible framework. Computers can, in principle, solve a fixed or well specified problem, such as matrix multiplication, with the same or greater efficiency as living organisms. But living organisms are solving problems that are not well specified. For these problems, automata with a flexible framework are more effective than automata with a fixed framework. The size of automata with a fixed framework increases too rapidly as their computing structure expands to accommodate new requirements.

3.4. Modification

Space and building blocks can be reused by modification, which is defined as the ability of an automaton to change its computational elements. Modification enables different tasks to be accomplished in the same space and using the same building blocks at different times. Since complex problems are usually irregular, large numbers of computational elements are not used during certain phases of problem solving. By reusing this space for other purposes, modification allows a better space utilization. Suppose that an automaton needs a large memory during one time period and many processors during another. If the structure is able to change and both the memory and the processors are approximately the same size, then the space can be
reused. Fig. 4(a) shows a computational structure without modification. When only memory is active, processors are not used at all, but they occupy precious resources, such as space and building blocks. Similarly, when processors are active, memory is idle. Fig. 4(b) shows a computational structure with modification. When the memory is not needed, its structure could be disassembled and processors could be built, using the same space and possibly the same building blocks. So an automaton which is unable to change its elements, in general, occupies more space and uses more building blocks than necessary.

Fig. 4. Modification better utilizes space. (a) no adaptation, space is wasted; (b) adaptation, space is fully utilized.

In more formal terms, assume that there are several different types of computational elements, and let f_i(t) denote the space requirements for the i-th type of elements at time t. Let SM(t) denote the space requirements of an automaton with modification at time t. Let SNM(t) denote the space requirements of an automaton without modification at time t. It is assumed that transformations from one type of element to a different type can be done in parallel with the computation, and that the transformations do not influence the total execution time. This assumption provides an upper bound for the effectiveness of modification. To achieve the optimal execution time with modification, enough space must be provided to cover the demands at any moment t:

SM(t) = Σ_i f_i(t).

The equations do not take into account how individual elements are connected; only their size is considered. Without modification, the maximum possible use of each element type must be guaranteed for optimal execution time:

SNM(t) = Σ_i max_t f_i(t).

Since SNM(t) is a constant for all t, t can be omitted: SNM(t) = SNM. It is obvious that for any value of t:

SM(t) ≤ SNM.
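As a toy numerical illustration of this inequality, the sketch below uses two hypothetical element types with invented, complementary space demands over ten time steps.

```python
# Sketch: two element types (e.g. "memory" and "processor") with assumed,
# complementary space demands over ten time steps.  SM(t) tracks only the
# demand of the moment; SNM reserves the peak demand of every type for all time.

demand = {
    "memory":    [9, 9, 8, 6, 4, 2, 1, 1, 1, 1],   # hypothetical profile
    "processor": [1, 1, 1, 2, 4, 6, 8, 9, 9, 9],   # hypothetical profile
}

snm = sum(max(profile) for profile in demand.values())                   # 9 + 9 = 18
sm = [sum(profile[t] for profile in demand.values()) for t in range(10)]

print(snm)        # space without modification: 18 at every moment
print(max(sm))    # peak space with modification: 10 in this example
```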
The equality is achieved only for values of t where all f_i(t) simultaneously reach their maximum, which is very improbable for large and complex problems. The value of SNM increases rapidly with the increasing number of element types and the complexity of the problem, while the function SM(t) grows at a much slower rate. For an irregular problem, an automaton that can modify its elements generally needs significantly less space than an automaton without modification.

4. Living organisms and electronic computers

The model of growing automata has some inherent limitations and trade-offs, which are significant for potential implementations. An important parameter is the time required to create new elements. In order to gain some perspective on this parameter, a hypothetical comparison is made between living organisms, as representatives of growing automata, and supercomputers of the early 1990s, as representatives of electronic computers. The comparison involves time and space trade-offs for problems of different sizes. It is assumed that the problem consists of a large collection of independent operations. Because such problems represent the optimal case for growing automata, they provide an upper bound on the effectiveness of growing automata. Search problems are examples of problems which can be carried out by large collections of independent automata.
It is estimated that an electronic computer performs 10^9 operations/s and fits into 0.09 m^3 of space (approximately 1 ft^3). Let n denote the number of operations necessary to solve a problem. The electronic computer can then solve the problem in n × 10^-9 seconds. A growing automaton can double the number of elements in each time step, so it needs log n steps to produce n elements. Because most of the time is spent in producing new elements, the computational time required to carry out the final computation can be neglected. The total number of steps for n operations is thus log n. To be competitive with the electronic computer, each step must be performed in not more than (n/log n) × 10^-9 s. The required number of elements is n, i.e. one element for each operation. In order to fit in the same space as the electronic computer, each element can take up at most 0.09/n m^3. Results for problem sizes of 10^9, 10^12, 10^15, and 10^18 operations are calculated in Table 3. The number of operations column shows the total number of operations, n, which are required to solve the problem. The number of steps column shows the number of steps, log n, which must be performed by the growing automaton to execute n operations. The total time column shows the time needed by the electronic computer to perform the task: n × 10^-9 s. The time per step column shows the time needed by the growing automaton to execute one step: (n/log n) × 10^-9 s. The size column shows the cube side of one element of the growing automaton, assuming that each element occupies a cube. The numbers in Table 3 provide an estimate of the performance required for growing automata to equal electronic computers in execution speed.
The most interesting column is the time per step, which gives the time available for the creation of new elements. It can be seen that this time increases rapidly with the size of the problem. It can be concluded that, with increasing problem size, time constraints considerably favor growing automata over electronic computers. It is interesting to compare the numbers in Table 3 to cells in living organisms. Data are taken from Dyson (1978). The size of cells ranges from only a few tenths of a µm (the smallest bacteria) to many cm (certain marine algae and various bird eggs). Most human cells have a diameter of around 10 µm. The rate of cell growth varies between approximately 20 minutes and 1 day for one cell division. The row that corresponds to these numbers in Table 3 is the row with around 10^15 operations. It follows that an implementation of growing automata with cells similar to cells in living organisms would be better than electronic computers at tasks that require at least 10^15 operations. These tasks are on the border of existing computing technology. The estimate is similar to the crossover point of 10^18 operations, calculated in the section on self-reproduction, but using different assumptions.
Table 3
Growing automata with the capability and the size of an electronic computer

# of ops.   # of steps   total time     time per step              size
n           log n        n × 10^-9 s    total time / # of steps    (0.09/n)^(1/3) m
10^9        30           1 s            0.033 s                    450 µm
10^12       40           1000 s         25 s                       45 µm
10^15       50           278 h          5.6 h                      4.5 µm
10^18       60           31.7 years     193 days                   0.45 µm
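The quantities in Table 3 follow directly from the stated assumptions and can be recomputed as in the sketch below (rounding may differ slightly from the printed values).

```python
import math

# Sketch: recompute the Table 3 columns from the stated assumptions: an
# electronic computer performing 1e9 operations/s in 0.09 m**3, and a
# growing automaton that doubles its element count once per step.
for n in (1e9, 1e12, 1e15, 1e18):
    steps = math.log2(n)                  # number of doubling steps, log n
    total_time = n * 1e-9                 # seconds needed by the electronic computer
    time_per_step = total_time / steps    # time available for one doubling step
    side = (0.09 / n) ** (1.0 / 3.0)      # cube side of one element, in metres
    print(f"{n:.0e}  {steps:.0f}  {total_time:.3g} s  "
          f"{time_per_step:.3g} s  {side:.2e} m")
```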
5. Conclusion

This paper analyzes computational aspects of self-reproducing growing automata with a flexible framework. It is shown that the model of growing automata provides a way to radically increase computational power for complex problems, such as search. An implementation of growing automata requires technology that supports changes in computational structure, especially exponential growth through self-reproduction, a flexible framework, and the modification of computational elements. Living organisms are proof of the existence of structures that: are capable of limited exponential growth; provide a flexible framework; and can partially modify their structure, as individuals and through evolution. Existing electronic computer technology does not allow us to implement genuine growing automata. Capabilities to realize growing automata might be provided by molecular computers. A discussion of design issues of molecular computers is provided by Conrad (1985, 1992). Growing automata are representatives of soft machines. One of the leading trends in technology, nanotechnology, is toward miniaturization (Drexler, 1986). Growing automata and soft machines provide a complementary approach to nanotechnology. Our analysis shows that, together with miniaturization, the ability to change computational structure is another important feature of computational efficiency. The study of soft machines and their computational aspects will lead to computers that are very different from those currently in use. Soft machines will enable us to solve complex problems that are beyond our reach today.

References

Burks, A.W., 1961, Computation, behavior, and structure in fixed and growing automata. Behav. Sci. 6, 5-22.
Burks, A.W. (ed.), 1970, Essays on Cellular Automata (University of Illinois Press, Urbana, IL).
Cliff, R., Freitas, R., Laing, R. and von Tiesenhausen, G., 1980, Replicating systems concepts: Self-replicating lunar factory and demonstration, in: Advanced Automation for Space Missions, R. Freitas and W.P. Gilbreath (eds.) (NASA/ASEE Conference, Santa Clara, CA, Publication 2255) pp. 189-335.
Conrad, M., 1985, On design principles for a molecular computer. Commun. ACM 28 (May), 464-480.
Conrad, M. (ed.), 1992, Special issue on molecular computing. IEEE Comput. 25 (November).
Drexler, K.E., 1986, Engines of Creation (Anchor Press/Doubleday).
Dyson, R.D., 1978, Cell Biology: A Molecular Approach (Allyn and Bacon).
Harary, F., 1969, Graph Theory (Addison-Wesley).
Laing, R., 1979, Machines as organisms: An exploration of the relevance of recent results. BioSystems 11, 201-215.
Langton, C.G. (ed.), 1989, Artificial Life: The Proceedings of an Interdisciplinary Workshop on the Synthesis and Simulation of Living Systems, Los Alamos, New Mexico, 1987 (Addison-Wesley).
Lindenmayer, A., 1968, Mathematical models for cellular interaction in development, Parts I and II. J. Theor. Biol. 18, 280-315.
Moravec, H., 1988, Mind Children (Harvard University Press).
Toffoli, T. and Margolus, N., 1987, Cellular Automata Machines (MIT Press).
Turing, A.M., 1937, On computable numbers, with an application to the Entscheidungsproblem. Proc. Lond. Math. Soc., Second Ser. 42, 230-265.
von Neumann, J., 1966, Theory of Self-Reproducing Automata (University of Illinois Press, Urbana, IL).