by Herbert A. Simon
tags: complexity, evolution, hierarchy, systems
Let me introduce the topic of evolution with a parable. There once were two watchmakers, named Hora and Tempus, who manufactured very fine watches. Both of them were highly regarded, and the phones in their workshops rang frequently — new customers were constantly calling them. However, Hora prospered, while Tempus became poorer and poorer and finally lost his shop. What was the reason?
The watches the men made consisted of about 1,000 parts each. Tempus had so constructed his that if he had one partly assembled and had to put it down — to answer the phone, say — it immediately fell to pieces and had to be reassembled from the elements. The better the customers liked his watches, the more they phoned him and the more difficult it became for him to find enough uninterrupted time to finish a watch.
The watches that Hora made were no less complex than those of Tempus. But he had designed them so that he could put together subassemblies of about ten elements each. Ten of these subassemblies, again, could be put together into a larger subassembly; and a system of ten of the latter subassemblies constituted the whole watch. Hence, when Hora had to put down a partly assembled watch in order to answer the phone, he lost only a small part of his work, and he assembled his watches in only a fraction of the man-hours it took Tempus.
It is rather easy to make a quantitative analysis of the relative difficulty of the tasks of Tempus and Hora. Suppose the probability that an interruption will occur while a part is being added to an incomplete assembly is p. Then the probability that Tempus can complete a watch he has started without interruption is (1−p)^1000 — a very small number unless p is .001 or less. Each interruption will cost, on the average, the time to assemble 1/p parts (the expected number assembled before interruption). On the other hand, Hora has to complete one hundred eleven subassemblies of ten parts each. The probability that he will not be interrupted while completing any one of these is (1−p)^10, and each interruption will cost only about the time required to assemble five parts.
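These quantities are easy to check numerically. A minimal sketch in Python, with p = 0.01 chosen only for illustration:

```python
p = 0.01  # probability of an interruption while adding any one part

# Probability of finishing an assembly with no interruption:
p_tempus = (1 - p) ** 1000  # Tempus: all 1,000 parts in one run
p_hora = (1 - p) ** 10      # Hora: one 10-part subassembly

# The expected number of parts added before an interruption is 1/p,
# so an interruption costs Tempus up to ~100 parts of lost work,
# while it costs Hora only ~5 (about half a 10-part group).
expected_parts_before_interruption = 1 / p

print(f"Tempus finishes uninterrupted: {p_tempus:.2e}")  # ~4.3e-05
print(f"Hora finishes a subassembly:   {p_hora:.3f}")    # ~0.904
```

The contrast is stark: with one interruption per hundred parts, Tempus almost never finishes, while Hora succeeds nine times out of ten.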
Now if p is about .01 — that is, there is one chance in a hundred that either watchmaker will be interrupted while adding any one part to an assembly — then a straightforward calculation shows that it will take Tempus, on the average, about four thousand times as long to assemble a watch as Hora.
We arrive at the estimate as follows. Hora must complete 111 assemblies per watch (100 subassemblies of ten parts, 10 larger subassemblies, and the final watch), while Tempus must complete only one — a ratio of 1/111 in Tempus's favor. But each interruption costs Tempus about 20 times as much work as it costs Hora (on the average about 100 parts lost against about 5). And Tempus completes an uninterrupted assembly only about 44 times in a million attempts ((0.99)^1000 ≈ 44 × 10^−6), while Hora succeeds nine times out of ten ((0.99)^10 ≈ 0.9), so Tempus needs roughly 20,000 times as many attempts per completed assembly. Multiplying these three ratios: (1/111) × (100/5) × 20,000 ≈ 4,000.
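The product of the three ratios can be checked directly; a small sketch in Python, again assuming p = 0.01:

```python
p = 0.01

# Ratio 1: Hora must complete 111 assemblies per watch, Tempus only 1.
assemblies = 1 / 111

# Ratio 2: work lost per interruption, Tempus (~100 parts) vs Hora (~5).
work_lost = 100 / 5

# Ratio 3: attempts per completed assembly. Tempus succeeds with
# probability (1-p)^1000 and Hora with (1-p)^10, so Tempus needs
# (1-p)^10 / (1-p)^1000 = (1-p)^-990 times as many attempts.
attempts = (1 - p) ** 10 / (1 - p) ** 1000

effort_ratio = assemblies * work_lost * attempts
print(f"Tempus / Hora effort ratio: {effort_ratio:.0f}")  # near 4,000
```

Using the exact attempt ratio rather than the rounded 20,000 gives a figure a little under 3,800, still Simon's "about four thousand."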
What lessons can we draw from our parable for biological evolution? Let us interpret a partially completed subassembly of k elementary parts as the coexistence of k parts in a small volume — ignoring their relative orientations. The model assumes that parts are entering the volume at a constant rate, but that there is a constant probability, p, that the part will be dispersed before another is added, unless the assembly reaches a stable state. These assumptions are not particularly realistic. They undoubtedly underestimate the decrease in probability of achieving the assembly with increase in the size of the assembly. Hence the assumptions understate — probably by a large factor — the relative advantage of a hierarchic structure.
Although we cannot, therefore, take the numerical estimate seriously, the lesson for biological evolution is quite clear and direct. The time required for the evolution of a complex form from simple elements depends critically on the numbers and distribution of potential intermediate stable forms. In particular, if there exists a hierarchy of potential stable "subassemblies," with about the same span, s, at each level of the hierarchy, then the time required for a subassembly can be expected to be about the same at each level — that is, proportional to 1/(1−p)^s. The time required for the assembly of a system of n elements will be proportional to log_s(n), that is, to the number of levels in the system. One would say — with more illustrative than literal intent — that the time required for the evolution of multi-celled organisms from single-celled organisms might be of the same order of magnitude as the time required for the evolution of single-celled organisms from macromolecules. The same argument could be applied to the evolution of proteins from amino acids, of molecules from atoms, of atoms from elementary particles.
A whole host of objections to this oversimplified scheme will occur, I am sure, to every working biologist, chemist, and physicist. Before turning to matters I know more about, I shall mention three of these problems, leaving the rest to the attention of the specialists.
First, in spite of the overtones of the watchmaker parable, the theory assumes no teleological mechanism. The complex forms can arise from the simple ones by purely random processes. Direction is provided to the scheme by the stability of the complex forms, once these come into existence. But this is nothing more than survival of the fittest — i.e., of the stable.
Second, not all large systems appear hierarchical. For example, most polymers — e.g., nylon — are simply linear chains of large numbers of identical components, the monomers. However, for present purposes we can simply regard such a structure as a hierarchy with a span of one — the limiting case. For a chain of any length represents a state of relative equilibrium.
Third, the evolution of complex systems from simple elements implies nothing, one way or the other, about the change in entropy of the entire system. If the process absorbs free energy, the complex system will have a smaller entropy than the elements; if it releases free energy, the opposite will be true. The former alternative is the one that holds for most biological systems, and the net inflow of free energy has to be supplied from the sun or some other source if the second law of thermodynamics is not to be violated. For the evolutionary process we are describing, the equilibria of the intermediate states need have only local and not global stability, and they may be stable only in the steady state — that is, as long as there is an external source of free energy that may be drawn upon.
Because organisms are not energetically closed systems, there is no way to deduce the direction, much less the rate, of evolution from classical thermodynamic considerations. All estimates indicate that the amount of entropy involved in the formation of a one-celled biological organism is trivially small. The "improbability" of evolution has nothing to do with this quantity of entropy, which is produced by every bacterial cell every generation. The irrelevance of quantity of information, in this sense, to speed of evolution can also be seen from the fact that exactly as much information is required to "copy" a cell through the reproductive process as to produce the first cell through evolution.
The existence of stable intermediate forms exercises a powerful effect on the evolution of complex forms — one that may be likened to the dramatic effect of catalysts upon reaction rates and steady-state distribution of reaction products in open systems. In neither case does the entropy change provide us with a guide to system behavior.