Yes, I've been arguing on the internet again. I guess I'll never learn.
I try not to spend too much of my time debating other people's religion -- it's very hard to stop once you start. In general these days I take a live-and-let-live approach: if you believe in God, that's fine, we can agree to disagree. Provided, that is, that you aren't doing one of two things: (a) accusing others of wrongdoing on insufficient grounds, or (b) setting yourself up as a guru dispensing wisdom. Either of those, I take as a licence to ask probing questions until the requisite evidence is forthcoming.
And, of course, if you do want to talk about whether God exists and why I think he doesn't, then I'm ready and willing to reply. This is more or less what happened recently, after someone I know posted a Facebook post that fit (b), above, pretty well.
I started off with Richard Dawkins' argument from The God Delusion. I must admit that when I was a Christian, I wouldn't have been able to read The God Delusion. Dawkins' frequent use of mockery to underscore points would have made me too angry to notice that his argument is, in fact, pretty solid. Here it is, boiled down to the bones for an article in the Huffington Post:
To be sure, we do need some kind of explanation for the origin of all things. Physicists and cosmologists are hard at work on the problem. But whatever the answer - a random quantum fluctuation or a Hawking/Penrose singularity or whatever we end up calling it - it will be simple. Complex, statistically improbable things, by definition, don't just happen; they demand an explanation in their own right. They are impotent to terminate regresses, in a way that simple things are not. The first cause cannot have been an intelligence - let alone an intelligence that answers prayers and enjoys being worshipped. Intelligent, creative, complex, statistically improbable things come late into the universe, as the product of evolution or some other process of gradual escalation from simple beginnings. They come late into the universe and therefore cannot be responsible for designing it.

All the Christian replies I have seen to this take much the same tack, and my Facebook interlocutor was no exception. He argued:
In fact Dawkin’s argument has been addressed numerous times almost no philosopher of religion theist or atheist I know of considers it of merit.

That is to say: the chance that a God might begin to exist is indeed absurdly low, but (on this view) that tells us nothing about the chance of his already existing. Not to put too fine a point on it, this is where the argument goes wrong.
First, you state
“Richard Dawkins' objection to the existence of God is that the chance of such a phenomenally complex entity existing without a prior selective evolutionary process producing him is too absurdly small to be worth contemplating (far smaller than the likelihood that there is somebody waiting for me with a gun around the next corner, a possibility on which I cheerfully bet my life dozens of times a day).”
This is confused, as whats improbable is that an “phenomenally complex entity existing” could come into being “ without a prior selective evolutionary process producing him” not the existence of such a being itself. Moreover the appeal to evolutionary processes here indexes this probability to known facts of science. In otherwords given what we know about scientific processes its improbable a complex being can come into existence without a prior evolutionary process.
So this argument really only applies to physical beings subject to the laws of physics that come into existence. Seeing God as understand by Jews and Christians is not a physical being that came into existence this proves very little.
The astronomer Fred Hoyle, in a passage widely quoted by creationists, compared evolution to the assembly of a Boeing 747 by a tornado in a junkyard. The comparison was based on a misunderstanding, but we'll come back to that. The force of Hoyle's image rests on our instinctive grasp of the impossibility of such a thing.
But what's impossible about it? Presuming that the junkyard did indeed contain pieces of all the materials needed to make a Boeing 747, why is it still absurd to suppose that a tornado might create one?
Let's switch images for a moment. Imagine this video clip: A wet floor, strewn with glass shards. Suddenly the moisture rises out of the floor and collects itself in the centre; the shards leap together, fusing as they meet, to form a drinking-glass. Glass and water rise up to the tabletop, where they knock a passer-by's arm out of the way and come to rest.
Or there was the video I saw at the Art School's end-of-year exhibition a few years ago, of a young woman in a bath of thick purple liquid apparently trying to sponge herself clean -- until the end of the video, when the purple collected briefly in a cloud around her, and then she stepped, painted purple from head to foot, out of the clean clear water.
In neither case is the riddle very difficult: the video is being played backwards! But how do we know? Why don't glass shards spontaneously assemble themselves into drinking-glasses? Why doesn't mucky water spontaneously clean itself?
You can see that this is the same question as with the Boeing 747 and the tornado. And in each case, the answer is the same: because the target state is too small. There are millions (of millions of millions of millions) of possible arrangements of the shards of glass, and practically all of them are just random scatterings. To call the fraction of them that form drinking-glasses "tiny" would be pathetically inadequate. Technically, that fraction is larger than zero, but you would have to wait much longer than the age of the universe for it to become large enough to be worth bothering with. The same goes for the paint molecules in the bath, and the Boeing parts in the junkyard.
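The "tiny target" claim is just counting. As a toy model (the piece count is illustrative, nothing like the real number of degrees of freedom of a shattered glass), suppose an object has 50 distinct pieces and exactly one ordering of them counts as assembled:

```python
import math

pieces = 50  # toy number; a real drinking-glass has vastly more degrees of freedom

# Number of possible orderings of the pieces
arrangements = math.factorial(pieces)

# Fraction of all arrangements that hit the one "assembled" target state
target_fraction = 1 / arrangements

print(f"{arrangements:.3e} arrangements")
print(f"target fraction ≈ 10^{math.log10(target_fraction):.0f}")
```

Fifty pieces already give about 3 × 10^64 orderings, so the target fraction is around 10^-64 -- and that is before we let the pieces vary in position and orientation as well as order.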
"But you're still talking about things coming into existence," readers will argue. "That doesn't tell us anything about the probability of something already existing." In each case, we know that the improbable object -- the drinking-glass, the clear water, the Boeing 747 -- really does exist. It just doesn't come into existence at random.
Let me approach it this way. Isn't there another problem when you start thinking about the probability of something already existing? Either it does exist, in which case the probability is 1, or it doesn't exist, in which case the probability is 0. Right? No. When we talk about "the probability of something existing", we mean "the probability of our guess that it exists being correct". And that's where that tiny target state comes in.
With a drinking-glass, a bathful of clean water, or a Boeing 747, the probability of our guess being correct is tremendously enlarged -- not because any of them encompass a large number of possible states of matter, but by our prior knowledge. We already know there are such things as utensils, clean water, and aeroplanes. Given that knowledge, the existence of a particular instance of any of them is not a stretch.
That's more or less where Hoyle went wrong with his analogy. Some states of matter, such as gas clouds, are predictable because they are forgiving; most possible arrangements of matter within a given space count as gas clouds. Others are predictable though highly specific because once one exists -- however unlikely that first appearance was -- it can make more; the existence of one makes the existence of others a trivial prediction.
Presumably the copies won't all be perfect. Presumably most of those that are different from the parent will be worse at making copies of themselves; for which very reason, their numbers will rapidly dwindle. Occasionally, mutations will arise that are better at making copies of themselves. Rare though these mutation events certainly will be, when they do happen copies of the mutants will soon crowd out the older forms. You'll end up with swarms of complex entities that -- were it not for self-replication -- would be so incredibly improbable that we would confidently rule them out of existence altogether. Some of these are the replicants themselves, such as elephants and kowhai trees and ichneumon wasps; others are by-products of the replicants' self-replication processes, such as beehives and beaver dams and Boeings.
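That cumulative process can be sketched in a few lines, in the spirit of Dawkins' "weasel" program from The Blind Watchmaker (the target string, population size, and mutation rate here are arbitrary choices of mine, not anything from his book):

```python
import random

random.seed(1)  # fixed seed so the run is repeatable

TARGET = "A DRINKING GLASS"  # an arbitrary 'improbable' target state
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def fitness(s):
    """How many characters match the target."""
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(s, rate=0.05):
    """Copy the string, occasionally miscopying a character."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in s)

# Start from a random jumble -- the junkyard
parent = "".join(random.choice(ALPHABET) for _ in TARGET)

generations = 0
while parent != TARGET:
    generations += 1
    # Each generation the parent makes 100 imperfect copies;
    # the copy closest to the target founds the next generation.
    brood = [parent] + [mutate(parent) for _ in range(100)]
    parent = max(brood, key=fitness)

print(f"reached {TARGET!r} in {generations} generations")
# A single random shot at the same 16-character string would succeed
# with probability 27**-16, roughly 1 in 10**23.
```

Cumulative copying-with-selection reaches the target in a few hundred generations at most; single-step random assembly would take longer than the age of the universe. That asymmetry is the whole point of the paragraph above.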
But God -- God, as understood by Jews and Christians, which is evidently the kind of God we are talking about, is what they call a "supernatural" entity; I have yet to see a definition of "supernatural" that doesn't, ultimately, reduce to "something that never leaves any evidence of its existence (so take that, science, ha ha)". By definition, nothing and nobody caused him to begin to be. There is no entity already known to exist that makes God's existence a trivial prediction. And, being supernatural, he also doesn't fit into the other category of predictable entities; there is no possible arrangement of matter that counts as God!
My correspondent, as you saw, thinks that Dawkins' argument requires that God be subject to the laws of physics. He enlarges on this in the second part of the argument:
Second you state

“The usual answer to this is that God doesn't have to be complex, he's extremely simple -- "pure thought" or something. To which I reply, information capacity and complexity are two different ways of expressing the same mathematical facts. Either God is complex and Dawkins' argument applies, or he has an extremely limited information capacity and cannot be called God. (Or, third option, he evolved from simpler beginnings by natural selection.)”

This argument again is confused, First, as Dawkin’s defines complexity a being is complex if it’s a material enitity with parts. So, he cannot suddenly appeal to mathematical complexity without committing the fallacy of equivocation. If mathematical complexity were the issue then the first premise would have to be that its highly improbable that mathematical truths could exist prior selective evolutionary process producing him and that premise would be false. Its actually logically and mathematically impossible for these truths to not exist.

"As Dawkins defines complexity, a being is complex if it's a material entity with parts." This is false. Dawkins' omission of an explanation of what he means by "complex" is indeed a weak point in the argument of The God Delusion. An entity is complex if it has a high information content, and Dawkins explains what that means, in a different context, in his article The Information Challenge.
Let me draw out the relevant points from that article: First, information content is independent of the medium carrying it. The letters you are reading started as pressure on a keyboard, became a series of variations in electrical voltage, became an arrangement of tiny switches in a server somewhere, then back to electrical fluctuations through several machines, before becoming a stream of light falling on your retinas... but, at every step, they're the same letters. So what does it mean to say that things have changed because the medium carrying the information is "supernatural"?
Not a heck of a lot, it turns out. Information can be defined as reduction of prior uncertainty. Here, Dawkins can explain it better than me:
The technical definition of “information” was introduced by the American engineer Claude Shannon in 1948... The measure he came up with was ingenious and intuitively satisfying. Let’s estimate, he suggested, the receiver’s ignorance or uncertainty before receiving the message, and then compare it with the receiver’s remaining ignorance after receiving the message. The quantity of ignorance-reduction is the information content. Shannon’s unit of information is the bit, short for “binary digit”. One bit is defined as the amount of information needed to halve the receiver’s prior uncertainty, however great that prior uncertainty was (mathematical readers will notice that the bit is, therefore, a logarithmic measure).

Eight bits make a byte, which amounts to a reduction of uncertainty by a factor of 256 (2x2x2x2x2x2x2x2). Because information theory is vital to the functioning of computers, most of us encounter the technical terms of information theory (such as "gigabyte") mainly in the context of dealing with computers; it is necessary, therefore, to dispel the misconception that information theory is a theory about computers. It isn't. Computers just happen to be the first artificial objects whose function is to handle information on a large scale; nature beat us to it, first with genomes (which work very like computers on the inside) and then with brains (which don't, but which computers are designed, increasingly well, to work with).
...An expectant father watches the Caesarian birth of his child through a window into the operating theatre. He can’t see any details, so a nurse has agreed to hold up a pink card if it is a girl, blue for a boy. How much information is conveyed when, say, the nurse flourishes the pink card to the delighted father? The answer is one bit -- the prior uncertainty is halved. The father knows that a baby of some kind has been born, so his uncertainty amounts to just two possibilities -- boy and girl -- and they are (for purposes of this discussion) equal. The pink card halves the father’s prior uncertainty from two possibilities to one (girl). If there’d been no pink card but a doctor had walked out of the operating theatre, shook the father’s hand and said “Congratulations old chap, I’m delighted to be the first to tell you that you have a daughter”, the information conveyed by the 17 word message would still be only one bit.
One bit is the amount of information conveyed by the answer to a yes-no question: what we IT boffins call a "Boolean". It reduces uncertainty by 50%; that is to say, a single bit of information on any subject doubles the probability that our guess is right. To put it another way again, if a system has an information capacity of one bit, our probability of guessing its state correctly -- absent that information -- is 0.5.
Two bits is the amount of information conveyed by the answer to two Boolean questions, or one four-option multi-choice question; our probability of guessing correctly is 0.25. Three bits, and that probability is halved again, to 0.125. Four bits, 0.0625, and so on.
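The bits-to-probability bookkeeping in the last few paragraphs is nothing more than powers of two; a minimal sketch:

```python
import math

def guess_probability(bits):
    """Chance of correctly guessing a system's state with no information,
    given its information capacity in bits."""
    return 2.0 ** -bits

def bits_needed(options):
    """Information needed to pick one of `options` equally likely states."""
    return math.log2(options)

print(guess_probability(1))  # 0.5    -- the pink-card / Boolean case
print(guess_probability(2))  # 0.25   -- one four-option multi-choice question
print(guess_probability(4))  # 0.0625
print(bits_needed(256))      # 8.0    -- one byte
```

Each extra bit of capacity halves the chance of an uninformed guess being right; that is all "logarithmic measure" means here.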
How much information does it take to specify the blueprint of a Boeing 747? I don't know, but I would guess tens of megabytes. Let's lowball it at a single megabyte, which is about 8 million bits. So the raw chance of a Boeing 747 existing -- the probability that we had guessed right, if we predicted that a particular one existed, without the crucial information that Boeings and the system for producing them already exist -- would be one in about 2^(8 million). My computer won't go there, but that works out to roughly 2.5 million zeroes after the decimal point before you get to anything that isn't a zero.

To reiterate: To say that a Boeing 747 contains megabytes of information, and to say that the probability of its existing "by itself" has millions of zeroes after the decimal point, are two ways of saying the same thing. It so happens that a Boeing 747 is a physical entity with parts, but that is not what makes it impossible for a tornado in a junkyard to create one. What makes that impossible is the number of yes-no questions it would take to specify the blueprint.
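The conversion from bits to decimal digits is just a change of logarithm base; for one megabyte of information:

```python
import math

bits = 8 * 1024 * 1024  # one megabyte of information, in bits

# 2**bits has floor(bits * log10(2)) + 1 decimal digits, so 1/2**bits has
# floor(bits * log10(2)) zeroes after the decimal point before the first
# nonzero digit.
zeroes = math.floor(bits * math.log10(2))
print(zeroes)  # 2525222 -- roughly 2.5 million zeroes
```

The number itself is far too big to write out, but its size is a one-line calculation.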
Second, Christians have traditionally understood God as simple, that’s correct, which of course shows Dawkins commits a straw man when he says God is complex. The problem is your argument that God cannot be complex fails. Your claim that God is complex again equivocates, what your argument shows is that God has awareness of complicated mathematical information. But the fact a being has awareness of complicated information does not follow the being is complex, that is composed of multiple parts.

A point on the side: Dawkins does not claim that "Christians think God is complex"; that would indeed be a straw-man. Dawkins claims that, for God to do what Christians claim he does (answer prayers, forgive sins, and so on) he would have to be complex, whether Christians accept that or not. That is not a straw-man but what is called a reductio, a perfectly legitimate argument drawing out the implications of one's opponent's position.
Of course it’s true that if God has a physical brain, given what we know of physical brains it would have to be a complex physical entity. But again that simply shows Dawkins apparently understood God to be an evolved physical entity, which is a stupid caricature of how God is understood in Christian or Jewish traditions. None of this is new btw you can read it any of the numerous philosophical critiques of Dawkin’s book.
It may not follow from the fact that a being has awareness of complex information that it is composed of multiple parts; but Dawkins' argument does not depend on God being composed of multiple parts. That "stupid caricature" exists only in his opponents' heads. The argument depends purely on God's information capacity as such. If God's information capacity is infinite, as several parties to the debate insisted, then the probability of his existence is exactly zero -- we can be absolutely certain that he does not exist. These are two different ways of saying the same thing.
So perhaps the fact that no theologian takes Dawkins seriously says more about theology than it says about Dawkins. Let's face it, people are going to be drawn to the study of religion partly by the fact that there are no tricky equations to solve. That being the case, Dawkins really should have explained information theory and complexity explicitly in his book, as he did in the article I have referenced. I hope that this little note has helped to remedy that lack.