Review of “From Complexity to Life”

Niels Henrik Gregersen (ed.), From Complexity to Life: On the Emergence of Life and Meaning.  Oxford: Oxford University Press, 2003.

© Technological Forecasting and Social Change, 2004

This is an important book, perhaps even a landmark.  It is an outgrowth of a unique symposium in 1999, sponsored by the John Templeton Foundation and convened at the Santa Fe Institute, which brought together an illustrious group of scholars from various disciplines to consider some of the deepest questions in science, metaphysics, and theology.  The participants included cosmologist/physicist Paul Davies, information theorist Gregory Chaitin, quantum physicist and complexity theorist Charles H. Bennett, biophysicist/complexity theorist Stuart Kauffman, biophysicist/biochemist Harold Morowitz, cellular biologist Werner Loewenstein, mathematician/physicist Ian Stewart, biochemist/theologian Arthur Peacocke, philosopher of science and “intelligent design” advocate William Dembski, and the distinguished theology-and-science research professor Niels Gregersen, who also served as the volume’s editor.

The scientific community is well aware of the sometimes vehement denunciations of religion by some of its own members – Richard Dawkins, Francis Crick, Edward O. Wilson, and Steven Weinberg come readily to mind.  Likewise, many scientists are acutely aware of the vociferous attacks on science – and Darwinism in particular – by the Creationists and others.  Biochemist Michael Behe’s Darwin’s Black Box (1996) is an especially notable example.  Much less frequent, or visible, are serious attempts to engage in a science-theology dialogue, one that explicitly seeks a middle ground – a scientific world view that is compatible with the postulate of intelligent design and a Designer in the universe, on the one hand, and, on the other, a theological stance that is consistent with the canons of modern science and the accumulating scientific evidence about evolution.  The Templeton/Santa Fe Institute symposium represented a major effort to explore this middle ground.

The meta-theoretical “strategy” (as it were) that guided this effort involved the use of complexity theory and what Paul Davies – in a masterful introductory synthesis – calls an “emergentist” world view.  His term refers to the fundamental claim that wholes are more than the sum of their parts; they cannot be derived from the laws of physics alone.  In other words, complexity in the universe is real and not simply an epiphenomenon.  So the fundamental question that has challenged many complexity theorists – and many theologians as well – is: where did all this complexity come from?  And why?  Can a designer God (or perhaps a pantheistic “immanence”) be seen to be at work in this process?

A centerpiece of the symposium was Stuart Kauffman’s vision – as articulated most recently in his book Investigations (2000) – of biological evolution as an inherently self-organizing process.  Life arises spontaneously, he claims, and complexity evolves naturally in accordance with what he provisionally calls the “fourth law of thermodynamics” – an innate tendency of life to explore the “adjacent possible” opportunities for building greater complexity.  Kauffman also posits that this dynamic ultimately leads to the emergence of “autonomous agents” that are “able to act on their own behalf in an environment.”  Kauffman does not deny the role of natural selection in evolution, but he downgrades its importance and assigns it the role of “fine-tuning” a process that is fundamentally orthogenetic.  Kauffman characterizes it as “order for free,” although he acknowledges that the core – the motor that drives this process – is still “mysterious.”

Needless to say, this vision leaves much room for advancing metaphysical explanations.  And Niels Gregersen, in his concluding chapter (along with a complementary chapter by Arthur Peacocke), provides a sophisticated and compelling argument for the role of intelligent design in evolution.  Gregersen admonishes us to “think of self-organization as the apex of divine purpose….We should see God as continuously creating the world by constituting and supporting self-organizing processes….Self-organizing systems are here seen as prime expressions of God’s continuous creativity.”  Gregersen also sees God as allowing for a degree of “freedom” and emergent “autonomy” in evolution, as Kauffman suggested.

Although this formulation represents a major contribution to the science-theology dialogue, there are also some problems with it.  Some of the problems are rooted in the science of complexity itself, at least as currently constituted.  For instance, how do you define, and measure, complexity?  As Davies concedes in his introduction: “The study of complexity is hampered by the lack of a generally accepted definition.”  Indeed, there are many different ways of defining the term, and (to paraphrase Rudyard Kipling) every single one of them is right.  (See the discussion of this issue in my 1998 article, “Complexity Is Just a Word!”)  Unfortunately, the Templeton/Santa Fe symposium participants were partial to the definitions that have been developed by physicists, computer scientists, and information theorists, but this is ultimately an unsatisfactory approach to defining biological complexity.

Specifically, Chaitin defines complexity in his chapter as the antithesis of computational randomness.  So algorithmic complexity, as he calls it, can be quantified in terms of the number of bits in a minimal program required to specify a given (ordered) output.  Charles Bennett, in his chapter, takes a somewhat similar approach.  Rejecting the use of functional, “life-like” properties as being too difficult to define and measure, he advances the notion of “logical depth” in a computational process as the most promising approach for developing a general theory of complexity.
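To make the formalism concrete (the notation below is a standard textbook rendering, not a formula taken from the volume itself), the algorithmic complexity of an output x, relative to a chosen universal computer U, is the length in bits of the shortest program p that produces it:

\[
K_U(x) \;=\; \min \{\, |p| \;:\; U(p) = x \,\}
\]

On this measure a string consisting of a million repetitions of “01” is simple, however long it is, because a very short program suffices to generate it; the measure says nothing about what, if anything, the string does.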

The problem is that this approach only works if organisms and their environments (and the interactions between them) can be reduced to a set of computer algorithms.  But in fact, an organism is much more than simply an embodied algorithm.  Even the genome doesn’t work like a computer.  Functional design and “purposive” cybernetic organization and behavior are the most basic and quintessential characteristics of living systems, and any definition of biological complexity that ignores these functional/engineering characteristics is insufficient.  Moreover, viable alternative definitions are available that are more suited for measuring complexity in living systems.  One is described in Corning and Kline (1998, footnote 6).  It stresses the number of cybernetic feedback processes that are associated with a given system.  Another, complementary approach has been proposed by biologist Eörs Szathmáry and his colleagues (2001).  Their methodology focuses on the number of functional relationships and interactions in a given system.

A related problem in complexity theory, with perhaps more serious implications for the objectives of the Templeton/Santa Fe volume, is the confusion over how to define information.  As Davies notes, information has played a central role in biological evolution.  But what is information, and how do you measure it?  Equally important, how does it arise?  Chaitin defines it in statistical terms; the fundamental “unit” of information is a binary “bit.” Thus, according to Bennett, algorithmic information represents the size in bits of the most compact computer program required to generate a given “object” (say an organism?).  Dembski, following Davies, characterizes living organisms as the embodiment of “specified complexity” (as distinct from purely physical complexity) – which is in turn a product of information.  Yet Dembski – like the information theory pioneer Claude Shannon (and innumerable information theorists over the past 50 years) – defines information in statistical terms as a measure of improbability, even though the examples he uses all have important functional properties as well.
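In Shannon’s statistical framework, which Dembski’s measure echoes, the information carried by an outcome x is simply its improbability on a logarithmic scale (again, a standard formulation rather than one quoted from the chapter):

\[
I(x) \;=\; -\log_2 P(x) \ \text{bits}
\]

An outcome with a probability of 1/1,024 therefore carries ten bits, whether it is a run of coin tosses or a signal on which an organism’s survival depends; the measure itself is indifferent to function or meaning.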

Even more disconcerting is the claim by some physicists, echoed by various theorists in this volume, that information is a physical entity in the universe – like energy – that originated with the Big Bang and must somehow be “conserved.” We are told that information is embedded in energy and that sunlight transfers its information to our eyes.  Other theorists suggest that information is in some way the opposite of “entropy” – i.e., available energy or physical order, depending on how loosely you define the term entropy.  (Ian Stewart details some of the objections to this approach in an appendix.)  Still others equate information with genetic instructions.  But this mechanistic view of how the genome operates is outdated and inapposite.  The burgeoning science of genomics is pursuing a more complex, systemic, feedback-dependent model.
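The equation of information with “negative entropy,” it should be added, usually rests on nothing more than a formal parallel (sketched here for clarity, not quoted from the volume) between Shannon’s statistical entropy and Boltzmann’s thermodynamic entropy:

\[
H \;=\; -\sum_i p_i \log_2 p_i \qquad \text{versus} \qquad S \;=\; k_B \ln W
\]

The two expressions look alike, but the first measures uncertainty in a probability distribution while the second counts the microstates W compatible with a physical macrostate; how much weight that resemblance can bear is part of what Stewart’s appendix calls into question.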

The basic, intractable problem with applying various statistical and information theory formulations to living systems (except in certain limited contexts) was pointed out by the systems scientist Anatol Rapoport almost immediately after Shannon’s seminal paper (which actually referred only to “communications theory” not “information”) was published in 1948.  These approaches are essentially blind to the functional/meaning aspect of biological information (as Shannon himself conceded).  They are therefore of little use in understanding the role that information has played in the origin and evolution of life on Earth.  To illustrate: a single binary bit might be assigned to do nothing more than control the movement of an electron from point A to point B inside a computer.  Yet one of its cousins might serve as the signal to initiate a nuclear war.  In other words, all bits are not created equal.

A hint of an alternative, functional approach to information can be found in Kauffman’s chapter.  Following the argument in Leo Szilard’s legendary critique, Kauffman points out that a key feature of Maxwell’s hypothetical “demon” (the fanciful creature that physicist James Clerk Maxwell invented to illustrate some basic thermodynamic principles) was that the demon utilized purposeful information – “know how” – to extract work.  Szilard’s conclusion was that information is costly to acquire and use and must therefore be included in the thermodynamic calculus for the demon experiment.  In other words, living systems utilize purposeful, cybernetic “control information,” which I define as “the capacity to control the capacity to do work.”  Control information is eminently measurable, but its usefulness is not fixed; it is defined both by the user and the user’s specific environment.  (For more details about this concept, see Corning and Kline 1998; also Corning 2001).
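Szilard’s point can be made quantitative.  In his single-molecule version of the thought experiment (the figure below is the standard result, not a number drawn from Kauffman’s chapter), one bit of “know how” about which side of the partition the molecule occupies lets the demon extract at most

\[
W_{\max} \;=\; k_B T \ln 2 \;\approx\; 3 \times 10^{-21} \ \text{joules at room temperature,}
\]

and acquiring or erasing that bit costs at least as much, so the demon’s thermodynamic books balance.  What matters for living systems is not the bit itself but the work it brings under control, which is just what the concept of control information is meant to capture.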

The most serious problem with the Templeton/Santa Fe volume, though, has to do with what could perhaps be described as its theoretical “heart.”  Stuart Kauffman’s evolutionary vision is inspiring, but it is also highly speculative.  We are told by Kauffman that life arises spontaneously; it self-organizes orthogenetically; it complexifies in accordance with an inherent “law” of diversification; and it miraculously produces “autonomous agents” (aka organisms or living systems) that go about doing thermodynamic work and reproducing themselves.  Kauffman acknowledges that he presently has no direct evidence for this scenario; it amounts to a promissory note.  But there are also some unacknowledged difficulties.

To be specific, what Kauffman characterizes as a fourth law of thermodynamics – an inherent diversifying tendency – is fully accounted for in mainstream Darwinian terms as an inherent variability in living organisms (and their environments) that is subject at all times to differential survival and reproduction based on context-specific functional criteria.  Darwin characterized it as a law of variation.  As for the postulate of self-organization, the evidence is overwhelming that biological organization is predominantly controlled by purposeful control information.  Autonomous self-organizing processes certainly do exist in the natural world.  But, contrary to the widespread assumption that self-organization and natural selection are alternative explanations for complexity in living systems, there is much evidence that they are complementary.  Self-organizing processes may facilitate and introduce economies into the process of constructing living organisms, but the results are always subject to the final editorial “pruning” of natural selection.  Self-organization survives only if it “works” in relation to the ongoing problem of survival and reproduction (on this issue, see Camazine et al., 2001).

Kauffman’s image of “autonomous agents” is also troubling.  As a general rule, living organisms are hardly autonomous.  Their basic “purpose” has been “pre-programmed” by evolution.  They are shaped and constrained by the functional, control information contained in the genome.  They are subject at all times to the inescapable challenges associated with survival and reproduction.  And they are enmeshed in a more or less complex system of interdependencies and feedbacks, both with other organisms and with their environments.  Thus, autonomy in living organisms is a matter of degree, and it is in any case an emergent (functional) product of evolution via natural selection, not a free lunch.  It is subject at all times to differential selection.

What these and other problems with this symposium volume highlight is a major oversight in the basic plan for the conference.  No fully accredited evolutionary biologist was included on the roster.  As a result, a partisan and controversial rendering of the evolutionary process was featured.  Indeed, some caricatures and serious misstatements about Darwinian theory were proffered but not seriously challenged.  To cite a few examples: There is no such thing as “order for free” in evolution.  Biological complexity has “bioeconomic” costs (witness Maxwell’s demon), and these must be offset by equivalent or greater benefits in order for an organism to thrive and reproduce itself.  Complexity is always contingent.

Similarly, natural selection was characterized by some of the conference participants as an algorithm, or a “rule.”  This is flatly wrong.  In fact, natural selection is an “umbrella” term.  It refers to whatever functional influences – as opposed to fortuitous effects or law-like physical forces – are responsible in a given context for differential survival and reproduction.  What is missing (or certainly muted) in this volume is the ground-zero premise of evolutionary biology – namely, that survival and reproduction is the basic, continuing, inescapable problem for all living organisms.  Therefore, the proximate bioeconomic problem of meeting basic survival and reproductive needs is an ongoing challenge, in the natural world and human societies alike.

Perhaps the most serious misstatement in this volume, however, is Dembski’s claim that Darwinian evolutionary theory is incapable of accounting for biological complexity.  As noted earlier, Dembski, echoing Davies, points out that living organisms are characterized by “specified complexity,” which he asserts cannot be produced with a Darwinian algorithm [sic].  Dembski also adopts the dubious idea (borrowed from the late Stephen Jay Gould) that there is an inherent tendency toward simplicity in nature.  Dembski concludes that complexity can only arise through an exogenous intelligence.

On the contrary, specified complexity can only mean complexity that is organized by functional, control information, and this form of information is unequivocally a product of natural selection.  Moreover, the so-called “Synergism Hypothesis” posits that it is the functional (adaptive) advantages associated with synergistic effects of various kinds that have been responsible for the progressive evolution of biological complexity over time.  The functional synergies that arise from various forms of organized cooperation in the natural world are often (not always) favored by natural selection.  In effect, the Synergism Hypothesis involves a bioeconomic theory of complexity.  However, biological complexity is also costly and is therefore always contingent; it must pay its way (see Corning 1983, 2003; also Maynard Smith and Szathmáry 1995, 1999).

In the end, what salvages the “case” that this volume seeks to advance is the final chapter by editor Niels Gregersen.  By tacitly adopting a more sophisticated and balanced understanding of evolutionary biology, Gregersen deftly transcends the shortcomings and misconceptions (and even some internal contradictions) that might otherwise have undermined the organizers’ basic objective.  In effect, Gregersen implicitly recognizes the need to accommodate to the Darwinian evolutionary paradigm.  He calls on theologians to “move beyond” their traditional, often stereotyped concepts and models.  The postulate of design at the macro-level (so to speak) need not exclude the possibility of an autonomously creative evolutionary process and even “chance” factors in evolution, he says.  Gregersen stresses particularly the importance of the so-called anthropic principle, namely, the remarkable array of “cosmic coincidences” that are necessary preconditions for life and that fortuitously converged at a particular time and place (or places) in the history of the universe. “Explaining the framework of the world as such [in terms of a Designer] does not always explain the particular features emerging within that framework.”

Gregersen argues that we can understand God as the “creator of creativity” and that God intended to allow living organisms to flourish on their own but in the context of a God-given process.  I find myself quite sympathetic to this view.  It seems to me to be able to accommodate the Darwinian paradigm (properly understood), yet it provides a framework for a true middle-ground position – what might be called a faith-based evolutionary biology.  Perhaps the Templeton Foundation will ultimately be inspired to take on this “ultimate challenge” and sponsor a conference on “Darwinism and Design.”

References
Behe, M. (1996) Darwin’s Black Box: The Biochemical Challenge to Evolution. New York: Simon and Schuster.
Camazine, S. et al. (2001) Self-Organization in Biological Systems.  Princeton, NJ: Princeton University Press.
Corning, P.A. (1983) The Synergism Hypothesis: A Theory of Progressive Evolution.  New York: McGraw-Hill.
Corning, P.A. (1998) “Complexity is Just a Word!”  Technological Forecasting and Social Change, 58:1-4.
Corning, P.A., and S.J. Kline. (1998) “Thermodynamics, Information and Life Revisited, Part II:  Thermoeconomics and Control Information.”  Systems Research and Behavioral Science, 15:453-482.
Corning, P.A. (2001) “Control Information: The Missing Element in Norbert Wiener’s Cybernetic Paradigm?”  Kybernetes, 30(9/10): 1272-1288.
Corning, P.A. (2003) Nature’s Magic: Synergy in Evolution and the Fate of Humankind.  New York: Cambridge University Press.
Kauffman, S.A. (2000) Investigations.  New York: Oxford University Press.
Maynard Smith, J., and E. Szathmáry. (1995) The Major Transitions in Evolution.  Oxford: W.H. Freeman.
Maynard Smith, J., and E. Szathmáry. (1999) The Origins of Life: From the Birth of Life to the Origin of Language.  Oxford: Oxford University Press.
Szathmáry, E. et al. (2001) “Can Genes Explain Biological Complexity?”  Science, 292:1315-1316.
