Philip Clayton and Paul Davies: the re-emergence of emergence
Philip Clayton and Paul Davies, ed., The Re-Emergence of Emergence: The Emergentist Hypothesis from Science to Religion (Oxford: Oxford University Press, 2006), 344 pp. (review copy courtesy of OUP)
A guest-review by Ross McKenzie
The concept of emergence in science has been attracting considerable attention recently. In particular, several books have appeared that are oriented towards a popular audience, including:
- J. Holland, Emergence: From Chaos to Order (2000)
- S. Johnson, Emergence: The Connected Lives of Ants, Brains, Cities, and Software (2002)
- H. Morowitz, The Emergence of Everything: How the World Became Complex (2002)
- R. B. Laughlin, A Different Universe: Reinventing Physics from the Bottom Down (2005).
So, what is emergence? It depends who you ask. Broadly, it's the idea that the whole is greater than the sum of the parts. For example, the carbon atoms in diamond and in graphite (the "lead" in pencils) are identical: both materials are built from the same building blocks. Yet graphite is black and soft, while diamond is transparent and hard. Properties such as colour and hardness cannot be ascribed to individual atoms. Rather, these properties are "emergent"; they are properties of collections of atoms. Consciousness is often given as the ultimate example of an emergent phenomenon.
What, then, is the "emergentist hypothesis"? It is a form of strong emergence which first arose in philosophy and is sometimes equated with vitalism. It claims that there are emergent phenomena, such as consciousness, which can neither be reduced to nor understood in terms of lower-level phenomena. Hence, mind and brain are two distinct entities. In his excellent introduction to this volume, Philip Clayton describes four key features of this emergentist hypothesis: ontological physicalism, property emergence, the irreducibility of emergence, and downward causation.
Weak emergence, on the other hand, is a much milder position, one that many scientists (especially in biology, chemistry, and condensed matter physics) would support. Essentially, they acknowledge that collective systems have emergent properties which cannot be reduced purely to the properties of lower levels. Furthermore, although the principles (e.g., symmetry breaking) which describe these emergent phenomena can in principle be deduced from theories describing the constituents, in practice it is virtually impossible to deduce or predict them from the lower-level theories. Hence, scientific progress is made from the top down. For example, progress in theoretical chemistry is made by formulating emergent principles and concepts (such as aromaticity and electronegativity) from chemical experiments and then seeing how such principles might follow from the laws of quantum physics.
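This distinction between lower-level rules and higher-level behaviour can be illustrated with a toy simulation (my own sketch, not an example from the book): Conway's Game of Life. Each cell obeys one purely local rule, yet the system exhibits collective patterns, such as the "glider", that are best described in their own higher-level vocabulary.

```python
from collections import Counter

def step(live):
    """Advance one generation of Conway's Game of Life.
    `live` is a set of (x, y) coordinates of live cells."""
    # Count how many live neighbours each candidate cell has.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell lives next generation if it has exactly 3 live
    # neighbours, or if it is alive now and has exactly 2.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "glider": no individual cell moves, yet the pattern as a whole
# translates one cell diagonally every four generations.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)
assert state == {(x + 1, y + 1) for (x, y) in glider}
```

"Glider", like "aromaticity", is an emergent concept: it is nowhere in the update rule, and in practice one discovers it by watching the system from the top down, not by deriving it from the rule.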
Although this volume conveniently brings together work from a diverse range of disciplines, it's unfortunate that the section on theology does not really present a broad range of perspectives. All three authors in this section (Peacocke, Gregersen, Clayton) offer a perspective that is deeply influenced by process thought, and their concern is mostly with philosophical issues rather than with the actual content of theology. For a different perspective, readers might find it helpful to supplement these essays with Alister McGrath's account in his recent book, The Order of Things: Explorations in Scientific Theology (2006), chapter 5.
Readers may also find it helpful to consider work on emergence by scientists such as Laughlin, Anderson and Hoffmann, as well as other work on emergence in condensed matter physics and chemistry. Such work is not explored in this volume, but I believe it holds significant promise for the dialogue between science and theology.
10 Comments:
sweet.
Nancey Murphy, of course, has some pertinent discussions in considerations of the mind/body/soul problem.
you are loading up my booklist faster than my credit card can breathe. :)
Yeah, Nancey Murphy is great - and Philip Clayton's Mind and Emergence: From Quantum to Consciousness (2006) is very good.
Hi - thanks for the review. The idea of 'downward causation' is really interesting, though I've never quite decided if it actually makes sense. It would be interesting to know if there is any strong scientific basis for some kind of 'downward causation'.
I can't speak to "downward causation" per se, but certain features of quantum mechanics manifest the limitations of a gross conception of causation defined by stuff pushing-pulling against stuff. The delayed-choice experiment and other two-slit interference experiments (e.g., Englert, Scully, and Walther), and EPR-type experiments such as Hardy's would be ones to consider. This does not mean, however, that QM should *ever* be used as validation for any of the boutique spiritualities of third-rate gnosticism so common today. Sorry, that was a preemptive strike. I'll step off my soapbox now...
Some theorists believe that a functional quantum computer will require engineering that does not segregate hardware (concrete) and software (abstract). Here's a clumsy example. Suppose your CPU was not a complex system of teeny tiny transistors and passive elements wired together on a silicon substrate, to which you applied electrical signals sequentially according to instructions that are encoded and stored elsewhere. Rather, suppose you were able to write the program code *on* the silicon (or whatever material would actually work in this integrative way) and, because the formal distinctiveness and arrangement of the code was *itself* functional, the hardware would then begin executing those instructions, i.e., it would become a self-operating system in which hardware and software were not independent. Even more crudely, suppose you could write the instruction to "spin one radian per second" in this functional code on a piece of wood and, in the right environment and under the right conditions, the wood would indeed spin at one radian per second. And even more remarkably, if you broke off a splinter of the wood, the splinter would still execute the instruction, positioned as if the whole piece of wood were still there. Think fractals and electromagnetic feng shui :-) Very speculative, of course, but that's what quantum entanglement invites!
Ben, sorry for this question, since I haven't had a chance to check out the book yet, but is this at all similar to what Moltmann describes as "Gestalt" in his Coming of God?
Ever read Isaac Asimov? He illustrates it pretty well...
Hi Dave: the parallel to Moltmann's "Gestalt" is an interesting question. The simple answer is: I don't know! I've never thought about Moltmann's view in relation to emergence -- there could be a real parallel here, even though Moltmann hasn't engaged directly with scientific work on emergence.
In God and Science (1996, SCM) Arthur Peacocke uses downward causation to express the idea that "God is the ultimate Boundary Condition of all-that-is" (p.19). God interacts causally with the world-as-a-whole without intervening in lower level events. To me this depersonalises God and fuzzies the boundary between creator and creation...any thoughts? I'm going to have to check out McGrath's take on this.
Thanks for the review Ross (and Ben!)
Thanks for the response Ben. I guess this is something I'll have to do some more investigation on! I'm completely unfamiliar with this new strain of thought (I'm really bad at science--I did awfully in Biology, Chemistry, and Physics in high school (all D's), and only performed well in the more obscure Geology and Astronomy in college!). Thanks again.
Oh, by the way, to all interested, Moltmann will be engaging with science in an upcoming conference at Duke for the Wesleyan Theological Society (along with Randy Maddox and James K.A. Smith, among others). Here's a link to the poster if you're interested. Peace.