Complexity (Encyclopedia of Science and Religion)
Whereas cosmology explores the boundaries of the very large, and quantum theory the nature of the very small, complexity theory aims to understand the emergence and development of order at every level, including the medium-sized world. To the riddles of the macroscopic and the microscopic are added the puzzles of complex pattern formation in the semistable dynamical systems known from everyday life.
Semistable systems are usually nonlinear, so small inputs may trigger dramatic changes. Examples are volcanoes and tornadoes, embryological and ecological evolution, traffic systems, and stock markets. These are not new areas of research, but the computerization of science since the 1970s has made possible new formal approaches to the study of dynamical systems. The question here is not so much "What are the constituents of nature (quarks, protons, electrons, etc.)?" but rather "How does nature work?"
Complexity theory, however, is not the name of a single theory comparable to, say, Albert Einstein's Theory of Relativity. There hardly exists one overarching "law of complexity" waiting to be discovered. Rather, complexity research is an umbrella term for a wide variety of studies on pattern formation, some more general, some arising under specific organizational conditions. The field builds on thermodynamics, information theory, cybernetics, evolutionary biology, economics, systems theory, and other disciplines. Since complexity research consistently crosses the boundaries between the inorganic and the organic, the natural and the cultural, it is likely to influence the science-religion dialogue significantly.
There is no consensus on a general definition of complexity. The complex is usually defined in contrast to the simple, but the distinction between simple and complex is a relative one. What is simple in one frame of reference may be complex in another. Walking downstairs, for example, is simple from the perspective of a healthy person, but physiologically it is highly complex. On the other hand, chaos theory shows that complex phenomena can be described by simple nonlinear equations.
An exact measure of algorithmic complexity has been available since the 1960s. In the Kolmogorov-Chaitin model, the complexity of a digital code consisting of 0s and 1s is measured by the length of the shortest computer program needed to describe it. Even a long series of digits (e.g., 01010101010101010101 . . . ) can be compressed into a compact description: "write 01 x times." By contrast, a complex code is a series without a discernible pattern; in the worst case, the series would simply have to be repeated by the computer program (e.g., 1001110010011000011 . . . ). Such a series is by definition random. However, one can never know with certainty whether a series that one sees as random could be further compressed. This is an information-theoretical version of Gödel's incompleteness theorem, discovered by Gregory Chaitin.
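The contrast between compressible and incompressible codes can be illustrated with an off-the-shelf compression routine as a rough, computable stand-in (true algorithmic complexity is itself uncomputable, so any practical compressor only gives an upper bound):

```python
import random
import zlib

def compression_ratio(s: str) -> float:
    """Compressed length relative to original length: low for
    patterned strings, higher for patternless ones."""
    data = s.encode("ascii")
    return len(zlib.compress(data, 9)) / len(data)

regular = "01" * 5000                       # "write 01 x times"
rng = random.Random(0)
irregular = "".join(rng.choice("01") for _ in range(10000))

# The patterned series compresses to a tiny fraction of its length;
# the random series resists compression far more (it still shrinks a
# little, since each ASCII character here carries only one bit of choice).
print(compression_ratio(regular), compression_ratio(irregular))
```

The same comparison also illustrates Chaitin's point: the compressor certifies that `regular` is simple, but it can never certify that `irregular` is truly incompressible.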
Similarly, C. H. Bennett suggested measuring a structure's degree of complexity by its logical depth, defined as the time needed (measured in computational steps) for the shortest possible program to generate that structure. Both Chaitin and Bennett presuppose Claude Shannon's mathematical concept of information: The more disordered a message is, the higher its informational content. While Chaitin's basic definition has the advantage of being extremely economical, Bennett's definition is capable of measuring the discrete operational steps involved in problem solutions. Neither of these formal definitions of complexity, however, can distinguish organized complexity from pure randomness. The main interest of complexity studies, though, is to understand the self-organized complexity that arises in the creative zones between pure order and pure randomness.
To catch the idea of organized complexity, it may be useful to distinguish between descriptive, causal, and ontological aspects of complexity of natural and social systems. Systems that require different sorts of analyses have been called descriptively complex (Wimsatt, 1976). Fruit flies, for example, require a variety of descriptions, such as physical descriptions of their thermal conductivity, biochemical descriptions of their constitution, morphological descriptions of their anatomical organs, functional descriptions, and so on. This idea of descriptive complexity lends support to an explanatory pluralism, which emphasizes the need for different types of explanation at different levels.
Systems, however, can also be pathway complex while simple in structure if their causal effects are highly sensitive to environmental conditions. A hormone is a natural-kind entity with an easily specifiable molecular composition, but since the effects of the hormone depend on a variety of bodily constellations (which cannot be finitely determined), the causal trajectory of the hormone is complex. Systems theory and organicist proposals in biology have focused on this aspect of complexity.
The most difficult aspect to define is ontological complexity. An element-based definition characterizes complexity by a large number of variegated elements (Bak, 1997, p. 5). This definition centers on the fact that many large-scale systems (mountains, geological plates, etc.) do not allow for an analytical treatment of their microphysical states. A relation-based definition of complexity focuses instead on the multiple couplings of a system in relation to its environments (Luhmann, 1995, p. 238). The human brain, with its high number of flexible neurons, exemplifies that more possibilities of coupling exist than can be actualized in a life history. Since the capacity for complex interactions with the environment is usually increased by operational subsystems, organizational features can be added to the definition of complexity. An organization-based definition of complexity thus emphasizes the hierarchical structure of interacting levels. Analogously, a performance-based definition focuses on the capacity for self-organizing activity. Systems are thus ontologically complex if they (1) consist of many variegated elements (in terms of sizes and types), (2) have a high capacity for relating to themselves and their environments, (3) are highly organized in subsystems, multilevel structures, and internal programs, and (4) can perform self-organized activities through flexible couplings to the environment. On this scheme it is possible to evaluate different aspects of complexity. A volcano will be more complex than an amoeba on (1) elements and perhaps on (2) relations, but far less complex on (3) hierarchical order and (4) self-organizing activity.
On this scheme, the complex can also be distinguished from the merely complicated (Cilliers, 1998). Even "primitive" natural entities such as genes may be ontologically more complex than sophisticated artificial systems such as airplanes. A Boeing 747 jet consists of highly specified elements, related to one another in prescribed ways, and there exists a clear recipe for how to assemble the elements into a unified system, which in turn has a predesigned purpose: being able to take off, fly, and land safely. The Boeing is a highly complicated machine but not terribly complex. In this sense, the complex is more than the simple but also more than the complicated.
Complexity studies fall into two main families of research, one more conceptual (organicism, emergentism, and systems theory) and another more formalistic (information theory, cybernetics, and computational complexity). Both types of research continue to interact in the understanding of complex phenomena: a conceptual preunderstanding of complex phenomena guides the construction of computational models, which afterwards have to be tested against real-world situations.
Complexity studies did not start with computers. The idea that the whole is more than the sum of the parts goes back to Plato's notion of divine providence (Laws 903 B-C), and in the Critique of Judgment (1790) Immanuel Kant describes a naturalized version of the same idea of self-adjustment: "parts bind themselves mutually into the unity of a whole in such a way that they are mutually cause and effect of one another" (B 292).
Embryologists from Karl Ernst von Baer and Hans Spemann up to C. H. Waddington embraced organicism as a middle course between vitalism and reductionism. In organicism a materialist ontology ("there exists nothing but matter") was combined with the observation that new properties are causally effective within higher-order wholes. Molecules are not semipermeable, but cell membranes are, and without this capacity organisms cannot survive. In the 1920s writers such as C. Lloyd Morgan, C. D. Broad, and Samuel Alexander developed organicism from an empirical research program into a metaphysical program of emergent evolutionism. The point here was that in the course of evolution higher-order levels are formed in which new properties emerge. Whereas the solidity of a table is a mere "resultant" of solid-state physics, the evolution of life is rife with "emergent" properties (for example, metabolism) that require new forms of description and eventually have real causal feedback effects on the physical system (the atmosphere) that nourished life in the first place.
After World War II, general systems theory combined organicist intuitions with cybernetics. Ludwig von Bertalanffy replaced the traditional whole/part distinction with the distinction between systems and environments. Systems are constitutively open to environmental inputs and are bound to develop beyond equilibrium under selective pressures. Systems theory thereby established itself as a theory combining thermodynamics and evolutionary theory. Systems are structures of dynamic relations, not frozen objects.
In the 1960s Heinz von Foerster and others introduced the theory of self-referential systems, according to which all systems relate to their environments by relating to their own internal codes or programs. A brain does not respond to cats in the world, but only to the internal firings of its own neurons: "Click, click is the vocabulary of neural language" (von Foerster). In this perspective, closure is the precondition of openness, not its preclusion. In this vein, the biologists Humberto Maturana and Francisco J. Varela developed a constructionist research program of autopoietic (self-productive) systems. The sociologist Niklas Luhmann has further emphasized how systems proceed by self-differentiation and can no longer be analyzed by reference to global physical features of the world-as-a-whole. In this perspective, each system needs to reduce, by its own internal operations, the complexities produced by other systems. Different systems (for example, biological, social, psychic) operate with different codes (energy, communication, consciousness), and even though they coevolve they cannot communicate with one another on neutral ground. The fleeting experience of consciousness, for example, remains coupled to physiological processes and to social communication, yet has its own irreducible life.
Computational complexity presupposes the idea of algorithmic compression and embodies the spirit of cybernetics. The dictum of Norbert Wiener that "Cybernetics is nothing if it is not mathematical" (1990, p. 88) could also be said of computational complexity.
The field of cybernetics was developed after World War II by John von Neumann, Ross Ashby, Norbert Wiener, and others. Central to cybernetics is the concept of automata, defined as machines that are open to informational input and produce an output modified by an internal program. In cybernetic learning machines, the output functions are fed back into the input functions, so that the internal program can be tested by trial and error. However, the measure of success or failure is still fixed by preset criteria.
The cybernetic automata were the direct precursors of cellular automata, used in the artificial life models designed by John Conway in 1970 and elaborated by Chris Langton and others from the late 1970s onward. Cellular automata use individual-based modeling: "Organisms" are placed in square cells in a two-dimensional grid, and their "actions" (die or divide) are specified by the number of living cells in their immediate neighborhood. In this way, the positive feedback of breeding can be modeled as well as the negative feedback of competition. The result is self-reproducing loops generated by very simple rules.
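Conway's rules can be stated in a few lines. The sketch below is a standard formulation of his Game of Life (not tied to any particular source in this article): a dead cell with exactly three living neighbors divides, a living cell with two or three living neighbors survives, and every other cell dies.

```python
import itertools

def step(live: set) -> set:
    """One generation of Conway's Game of Life on an unbounded grid.
    `live` is the set of (x, y) coordinates of living cells."""
    neighbour_counts = {}
    for (x, y) in live:
        for dx, dy in itertools.product((-1, 0, 1), repeat=2):
            if (dx, dy) != (0, 0):
                cell = (x + dx, y + dy)
                neighbour_counts[cell] = neighbour_counts.get(cell, 0) + 1
    # birth on exactly 3 neighbours; survival on 2 or 3
    return {cell for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker": three cells in a row oscillate with period 2.
blinker = {(0, 0), (1, 0), (2, 0)}
print(step(step(blinker)) == blinker)  # True
```

Even this tiny rule set produces gliders, oscillators, and self-sustaining patterns, which is precisely the point of the artificial life program.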
With the establishment of the Santa Fe Institute in New Mexico in 1984, a multidisciplinary center for computational complexity was formed. Physicist Murray Gell-Mann, computer scientist John Holland, and others introduced the idea of complex adaptive systems. As opposed to simple adaptation (as in a thermostat), there are no preset goals for complex adaptive systems. Like cellular automata, complex adaptive systems are individually modeled systems, but complex adaptive systems also involve "cognition." Complex adaptive systems are able to identify perceived regularities and to compress regular inputs into various schemata. Unlike cybernetic learning machines, there may be several different schemata competing simultaneously, thus simulating cognitive selection processes. In this manner self-adaptation coevolves with adaptation beyond a preset design. Whereas Gell-Mann uses complex adaptive systems as a general concept, Holland uses the term only for interacting individual agents. Complex adaptive systems agents thus proceed by a limited set of interaction rules, governed by simple stimulus-response mechanisms such as (1) tags (e.g., if something big, then flee; if something small, then approach), (2) internal models (or schemata), and (3) rules for connecting building blocks to one another (e.g., combining eyes and mouth into face recognition). The result of these local mechanisms, however, is the emergence of global properties such as nonlinearity, flow, diversity, and recognition.
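The competition among internal schemata can be sketched as follows. All names, rules, and the toy environment here are hypothetical illustrations of the general mechanism, not Holland's actual classifier system: an agent keeps several candidate models, lets the currently strongest one make predictions, and reinforces or weakens it according to its success.

```python
class Agent:
    """Minimal sketch of a complex adaptive agent: competing schemata
    compress the input stream, and credit flows to whichever schema
    predicts best -- selection among internal models, not a preset goal."""
    def __init__(self, schemata):
        self.schemata = schemata
        self.credit = {name: 1.0 for name in schemata}

    def predict(self, stimulus):
        # the currently strongest schema makes the prediction
        name = max(self.credit, key=self.credit.get)
        return name, self.schemata[name](stimulus)

    def learn(self, name, prediction, outcome):
        # reinforce or weaken the schema that was actually used
        self.credit[name] += 1.0 if prediction == outcome else -0.5

# two hypothetical tag-style schemata for reacting to creatures by size
schemata = {
    "flee-if-big": lambda size: "flee" if size > 5 else "approach",
    "always-flee": lambda size: "flee",
}
agent = Agent(schemata)
# toy environment: fleeing big creatures and approaching small ones pays
for size in [2, 8, 3, 9, 1, 7]:
    correct = "flee" if size > 5 else "approach"
    name, pred = agent.predict(size)
    agent.learn(name, pred, correct)
print(max(agent.credit, key=agent.credit.get))  # the fitter schema wins
```

The point of the sketch is the architecture: no external designer fixes the winning schema in advance; it emerges from the agent's history of interactions.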
Insofar as complex patterns are generated by simple mechanisms, computational complexity can be seen as a reductionist research paradigm; in contradistinction to physical reductionism, however, the reduction is to interaction rules, not to physical entities. But insofar as higher-level systems can be shown to exert a "downwards" feedback influence on lower-level interaction rules, computational complexity may also count as an antireductionist research program. The issues of reductionism versus antireductionism, bottom-up versus top-down causality, are still debated within the computational complexity community. In either case, it is information, not physics, that matters.
Computational complexity and real-world complexity
The spirit of computational complexity is not to collect empirical evidence and "reflect" reality, but to "generate" reality and explore virtual worlds of possibility. Computational complexity is nonetheless empirically motivated and aims to understand real-world complexity by computer modeling. The aspiration is to uncover deep mathematical structures common to virtual worlds and real-world dynamical systems.
Mathematical chaos theory is an example of a computer-generated science that has succeeded in explaining many dynamical patterns in nature. Yet the relation between chaos theory and computational complexity is disputed. While chaotic systems (in the technical sense) are extremely sensitive to initial conditions, complex systems are more robust (that is, they can start from different conditions and still end up in almost the same states). Accordingly, chaos theory can predict the immediate next states but not long-term developments, whereas complexity theory can describe long-term prospects but cannot predict the immediately following steps. Moreover, chaotic systems do not display the kind of evolutionary ascent and learning characteristic of complex adaptive systems, but oscillate or bifurcate endlessly. It therefore seems fair to say that chaos theory is only a small pane in the much larger window of complexity studies. Chaos, in the colloquial sense of disorder, is everywhere in complex systems (and so are fractals and strange attractors), but the equations of chaos in the technical sense (specific Lyapunov exponents, etc.) cannot explain self-organized complexity.
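The contrast between chaotic sensitivity and robust convergence can be illustrated with the logistic map, a standard textbook example: at parameter r = 4 the map is chaotic and nearby trajectories diverge exponentially, while at r = 2.5 it has an attracting fixed point at 0.6 toward which very different starting points converge.

```python
def logistic(x, r=4.0):
    """One step of the logistic map x -> r*x*(1-x)."""
    return r * x * (1 - x)

# Chaos (r = 4): two starting points differing by 1e-10 diverge
# until their gap is of the order of the attractor itself.
a, b = 0.2, 0.2 + 1e-10
max_gap = 0.0
for _ in range(60):
    a, b = logistic(a), logistic(b)
    max_gap = max(max_gap, abs(a - b))
print(max_gap > 0.1)  # True: long-term prediction is hopeless

# Robustness (r = 2.5): an attracting fixed point at 0.6, so very
# different starting points end up in (almost) the same state.
c, d = 0.1, 0.9
for _ in range(60):
    c, d = logistic(c, r=2.5), logistic(d, r=2.5)
print(abs(c - d) < 1e-9)  # True: the long-term state is predictable
```

The two runs mirror the asymmetry described above: the chaotic system is locally predictable but globally opaque, the robust system the reverse.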
There are also connections between thermodynamics and complexity theory. Beginning in the 1960s, the chemist Ilya Prigogine studied the so-called dissipative structures that arise spontaneously in open systems through which energy is dissipated. While classical thermodynamics described isolated systems in which nonhomogeneities tend to even out over time, Prigogine studied nonequilibrium processes of "order out of chaos" (chaos in the nontechnical sense). Famous examples are the convection patterns of Bénard cells formed spontaneously under heating and the beautiful chemical clocks of the Belousov-Zhabotinsky reaction. While the law of entropy (formulated by Rudolf Clausius in 1865) still holds for the universe as a whole, the formation of local orders is produced by nonequilibrium thermodynamics. The averaging laws of statistical mechanics are not contradicted, but they simply do not explain the specific trajectories that develop beyond thermodynamical equilibrium.
In the wake of Prigogine, a new search for the thermodynamical basis of evolution began (Wicken, 1987). Prigogine's bifurcation diagrams showed striking similarities to evolutionary trees. Reaching back to the seminal work of D'Arcy Wentworth Thompson in On Growth and Form (1917), many began to think that the interplay of selection and mutation does not suffice to explain the evolutionary tendency towards complexification. Evolution may be driven not only by gene selection but also by prebiotic laws of physical economy.
Since the 1970s, theoretical biologist Stuart Kauffman has constructed computational models of self-organizing principles at many levels. Motivated by the almost ubiquitous tendency of chemical systems to produce autocatalytic systems, Kauffman theorizes that life may have emerged quite suddenly through phase transitions in which chemical reactions catalyze one another far below the threshold of the RNA-DNA cycle. Kauffman uses a similar model for simulating the empirical findings of François Jacob and Jacques Monod, who showed that genes switch on and off depending on the network in which they are situated. In the simplest model of Boolean networks, each "gene" is coupled randomly to two other genes and has only two possible states, on or off (states that are determined by the states of the two other genes). Running such small systems recurrently (and later much larger networks), Kauffman was able to show that the number of state cycles (attractors) increases with the number of genes. Moreover, the relation is roughly constant: the number of state cycles scales as the square root of the number of genes, and Kauffman points out in The Origins of Order (1993) that in real-world species one finds roughly the same relation between the number of genes and the number of cell types. Agents in coupled systems thus seem to tune themselves to the optimal number of couplings with other agents. In addition, when investigating fitness landscapes of interacting species at the ecological level, Kauffman finds the principle of "order for free." Evolutionary innovations tend to happen "at the edge of chaos," between the strategy of evolutionarily stable orders and the strategy of the constant evolutionary arms race. In Investigations (2000), Kauffman pursues a search for laws by which the biosphere is coconstructed by "autonomous agents" able to run up against the stream of entropy.
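A random Boolean network in the spirit of Kauffman's model can be simulated exhaustively for small gene numbers. The wiring and truth tables below are randomly generated illustrations, and the square-root relation itself only appears statistically over large ensembles of networks, not in any single run:

```python
import random

def random_boolean_network(n, k=2, seed=1):
    """Build an n-gene network where each gene reads k random genes
    through a random Boolean function; returns the update map."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    def step(state):
        return tuple(
            tables[g][sum(state[i] << b for b, i in enumerate(inputs[g]))]
            for g in range(n)
        )
    return step

def attractors(step, n):
    """Exhaustively find all state cycles of a small network."""
    found = set()
    for s in range(2 ** n):
        state = tuple((s >> i) & 1 for i in range(n))
        seen = set()
        while state not in seen:           # iterate until a revisit:
            seen.add(state)                # the revisited state lies
            state = step(state)            # on an attractor cycle
        cycle = [state]
        nxt = step(state)
        while nxt != state:
            cycle.append(nxt)
            nxt = step(nxt)
        found.add(frozenset(cycle))
    return found

step = random_boolean_network(n=8)
cycles = attractors(step, 8)
print(len(cycles))  # a handful of attractors among 256 possible states
```

The deterministic dynamics guarantee that every one of the 256 trajectories falls into some cycle; the interesting finding is how few and how short those cycles typically are.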
Kauffman hereby acknowledges the impossibility of prestating finitely what will come to be within the vast configuration space of the biosphere.
The theory of self-organized criticality formulated by Per Bak and his colleagues starts from empirically confirmed regularities (such as the Gutenberg-Richter law of earthquakes). Many systems show slow variation over long periods, rare catastrophic events over short periods, and critical phases in between. The building up of sandpiles shows these phase transitions, but so do earthquakes, extinction rates, and light from quasars. Bak's point is that such systems are genuinely self-organizing, since they (1) are robust and do not depend on specific initial conditions, (2) emerge spontaneously over time (with no external designer), and (3) are governed by the same mathematical principles in stationary, critical, and catastrophic states. Bak has made both real-world experiments and simplified computer models of self-organized criticality, but he believes that self-organized criticality is only a first approximation of stronger explanations of nature's tendency to build up balances between order and disorder.
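The sandpile picture can be sketched with the Bak-Tang-Wiesenfeld model: grains are dropped one at a time, and any site holding four grains topples, sending one grain to each neighbor (grains at the edge fall off). The grid size and number of drops below are arbitrary illustration choices.

```python
import random

def drive(grid, n, steps, seed=0):
    """Drop `steps` grains at random sites of an n-by-n sandpile and
    return the size (number of topplings) of each resulting avalanche."""
    rng = random.Random(seed)
    sizes = []
    for _ in range(steps):
        x, y = rng.randrange(n), rng.randrange(n)
        grid[x][y] += 1
        size = 0
        stack = [(x, y)]
        while stack:
            i, j = stack.pop()
            if grid[i][j] < 4:
                continue
            grid[i][j] -= 4          # topple: shed one grain per neighbour
            size += 1
            stack.append((i, j))     # the site may need to topple again
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < n and 0 <= nj < n:
                    grid[ni][nj] += 1
                    stack.append((ni, nj))
        sizes.append(size)
    return sizes

n = 10
grid = [[0] * n for _ in range(n)]
sizes = drive(grid, n, 5000)
# after a slow build-up the pile reaches a critical state: mostly tiny
# avalanches punctuated by rare large ones, whatever the drop sequence
print(max(sizes), sorted(sizes)[len(sizes) // 2])
```

The three marks of self-organized criticality all show up here: the critical state is reached from any initial grid, no external designer tunes it, and the same toppling rule governs quiet and catastrophic episodes alike.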
Relevance for the science-religion discussion
While organicist programs of noncomputational complexity have played a major role in the science-religion dialogue since the seminal works of Ian Barbour, Arthur Peacocke, and others, the relevance of computational complexity for theology largely remains to be explored. The following issues are therefore to be taken more as pointers than as conclusions:
- The sciences of complexity study pattern formations in the midst of the world rather than in a hidden world beyond imagination. The features of organized complexity resonate with the experiences of being-part-of-a-whole, experiences that since Friedrich Schleiermacher's On Religion (1799) have been taken to be essential to religious intuition.
- While presupposing a robust naturalism, complexity theory suggests that "information" is as seminal to nature as are the substance and energy aspects of matter. Complexity theory may thus give further impetus to the dematerialization of the scientific idea of matter in the wake of relativity theory.
- By focusing on relations and interactions rather than on particular objects, complexity theory supports a shift in worldview from a mechanical clockwork view of the world to an emergentist view of the world as an interconnected network, where flows of information take precedence over localized entities. Complexity theory also offers a road for understanding the evolution of coevolution. By balancing the principle of individual selection with principles of self-organization, the focus on individual genes is supplemented by the importance of interconnected living organisms, a view closer to ethical and religious sentiments than the inherited view of the omnipotence of selection.
- Even though natural evils (from earthquakes to selection) remain a challenge to religions that presuppose a loving almighty God, the costs of evolutionary creativity are now being placed in a wider framework of evolution. If the same underlying dynamics of self-organized criticality produce stability, criticality, and catastrophes alike, and the constructive aspects of nature cannot exist without the destructive aspects, a theodicy of natural evils may be facilitated.
- The idea of complex adaptive systems gives biological learning and human culture (including science, ethics, and religion) a pivotal role in the understanding of what nature is, and what makes human and animal life grow and flourish. In addition, since complexity theory consistently crosses the boundaries between physics, biology, and the cultural sciences, theologians and human scientists may be prompted to rethink human culture (including religion) in terms of the creative interactions between the inorganic, the organic, and the cultural.
- From an external scientific perspective, computational complexity may be used to explain a variety of religious phenomena that arise at the critical interface between adaptation and self-adaptation, such as the interaction between religious groups, individual conversion experiences, and so on. The first computer models in this area have already been completed.
- From an internal religious perspective, complexity theory offers religious thought a new set of thought models and metaphors, which (when adopted) can stimulate the heuristics of theology when complex phenomena are redescribed from the perspective of religious symbolism. Self-organization, coupled networks, and adaptation by self-adaptation are candidates for such religious self-interpretation. The principles of complexity are in particular consonant with the idea that a divine Logos is creatively at work in the pattern formations of nature and drives nature towards further complexification.
- The computational complexity idea of self-organization is a challenge to the Enlightenment idea of a divine designer of all natural processes. Self-organization is also a challenge to the creationist Intelligent Design movement, which gives priority to the idea of "original creation" and tends to perceive novelties as perversions of pre-established designs. However, self-organizational processes never happen from scratch, but always presuppose a framework of laws and natural tendencies that could well be said to be "designed" by God. While a design of specific evolutionary outcomes is obsolete in light of self-organized complexity, the coordination of laws leading towards self-organization and coevolution may be explained by a divine metadesign.
- Since emergence takes place in the merging of coupled systems, theology may escape the alternative between an interventionist God, who acts by breaking natural laws, and a God who only sustains the laws of nature uniformly over time. In higher-organized systems, new informational pathways are continuously tried out in adventure-like processes. If the local interaction rules and the overall probability patterns are constantly changed over time, the actual pathways of large-scale coupled systems are not reducible to the general laws of physics. Special divine interaction with the evolving world can thus no longer be said to "break laws" in an interventionist manner, since there are no fixed laws to break in coupled systems.
- The seminal idea of self-organization may help overcome the idea that God and nature are contraries, so that God is powerless, if nature is powerful, and vice versa. A more adequate view may be to understand God as the creator who continuously hands over creativity to nature so that natural processes are the signs of a divine self-divestment into the very heart of nature's creativity. On this view, God is at work "in, with, and under" natural and social processes, and self-organization takes place within a world already created and continuously gifted by God.
See also AUTOMATA, CELLULAR; AUTOPOIESIS; CHAOS THEORY; CYBERNETICS; EMERGENCE; INFORMATION THEORY; INTELLIGENT DESIGN; SYSTEMS THEORY; THERMODYNAMICS, SECOND LAW OF
Bak, Per. How Nature Works: The Science of Self-Organized Criticality. Oxford: Oxford University Press, 1997.
Ball, Philip. The Self-Made Tapestry: Pattern Formation in Nature. Oxford: Oxford University Press, 1999.
Bennett, C. H. "Logical Depth and Physical Complexity." In The Universal Turing Machine: A Half-Century Survey, ed. Rolf Herken. Oxford: Oxford University Press, 1988.
Chaitin, Gregory. Algorithmic Information Theory. Cambridge, UK: Cambridge University Press, 1987.
Cilliers, Paul. Complexity and Postmodernism: Understanding Complex Systems. London: Routledge, 1998.
Clayton, Philip. God and Contemporary Science. Grand Rapids, Mich.: Eerdmans, 1997.
Cowan, George A.; Pines, David; and Meltzer, David, eds. Complexity: Metaphors, Models, and Reality. Cambridge, Mass.: Perseus Books, 1994.
Emmeche, Claus. The Garden in the Machine: The Emerging Science of Artificial Life. Princeton, N.J.: Princeton University Press, 1994.
Gell-Mann, Murray. The Quark and the Jaguar: Adventures in the Simple and the Complex. New York: W. H. Freeman, 1994.
Gilbert, Scott F., and Sarkar, Sahotra. "Embracing Complexity: Organicism for the 21st Century." Developmental Dynamics 219 (2000): 1.
Gregersen, Niels Henrik. "The Idea of Creation and the Theory of Autopoietic Processes." Zygon 33, no. 3 (1998): 333–367.
Gregersen, Niels Henrik, ed. From Complexity to Life: On the Emergence of Life and Meaning. New York: Oxford University Press, 2002.
Holland, John. Hidden Order: How Adaptation Builds Complexity. Reading, Mass.: Addison-Wesley, 1995.
Holland, John. Emergence: From Chaos to Order. Oxford: Oxford University Press, 1998.
Kauffman, Stuart. The Origins of Order: Self-Organization and Selection in Evolution. New York: Oxford University Press, 1993.
Kauffman, Stuart. At Home in the Universe: The Search for Laws of Self-Organization and Complexity. New York: Oxford University Press, 1995.
Kauffman, Stuart. Investigations. New York: Oxford University Press, 2000.
Luhmann, Niklas. Social Systems, trans. John Bednarz Jr. with Dirk Baecker. Stanford, Calif.: Stanford University Press, 1995.
Maturana, Humberto R., and Varela, Francisco J. The Tree of Knowledge: The Biological Roots of Human Understanding, rev. edition. Boston: Shambhala, 1992.
Peacocke, Arthur. Theology for a Scientific Age: Being and Becoming – Natural, Divine and Human, enlarged edition. London: SCM Press, 1993.
Rasch, William, and Wolfe, Cary, eds. Observing Complexity: Systems Theory and Postmodernity. Minneapolis, Minn.: University of Minnesota Press, 2000.
Russell, Robert John; Murphy, Nancey; and Peacocke, Arthur A., eds. Chaos and Complexity: Scientific Perspectives on Divine Action. Berkeley, Calif.: Center for Theology and the Natural Sciences; Notre Dame, Ind.: Notre Dame University Press, 1995.
Russell, Robert John; Murphy, Nancey; Meyering, Theo; and Arbib, Michael A., eds. Neuroscience and the Human Person: Scientific Perspectives on Divine Action. Berkeley, Calif.: Center for Theology and the Natural Sciences; Notre Dame, Ind.: Notre Dame University Press, 1999.
Schmidt, Siegfried J., ed. Der Diskurs des radikalen Konstruktivismus. Frankfurt am Main, Germany: Suhrkamp, 1988.
Solé, Richard, and Goodwin, Brian. Signs of Life: How Complexity Pervades Biology. New York: Basic Books, 2000.
Stengers, Isabelle. La vie et l'artifice: Visage de l'émergence. Paris: La Découverte, 1997.
Thompson, D'Arcy Wentworth. On Growth and Form (1917). New York: Dover, 1992.
Waldrop, M. Mitchell. Complexity: The Emerging Science at the Edge of Order and Chaos. New York: Simon and Schuster, 1992.
Wicken, J. S. Evolution, Information, and Thermodynamics: Extending the Darwinian Paradigm. New York: Oxford University Press, 1987.
Wiener, Norbert. God and Golem, Inc.: A Comment on Certain Points Where Cybernetics Impinges on Religion (1964). Reprint, Cambridge, Mass.: MIT Press, 1990.
Wimsatt, William C. "Complexity and Organization." In Topics in the Philosophy of Biology, eds. Marjorie Grene and Everett Mendelsohn. Dordrecht, Netherlands: Reidel, 1976.
NIELS HENRIK GREGERSEN