## Calculating the Probability of Abiogenesis

There is a misconception prevalent in society today: that the Darwinian theory of evolution by way of natural selection acting on random variation explains the origin of life. Ignoring for the moment whether or not the theory accurately explains the origin of the species, the fact of the matter is that the Darwinian theory says nothing with respect to how life began. Thus, proponents of naturalism are forced into the position of conjoining Darwinism with the theory of abiogenesis—the view that organic molecules and, subsequently, simple life forms first originated from inorganic substances. Naturally (excuse the pun), naturalists must posit that this process was purely unguided; all of the inorganic components of the hypothetical first living cell (hereafter, FLC) arranged themselves by chance (i.e., without the help of a guiding intelligence). Many uncritically embrace this theory, all the while oblivious to the insurmountable mathematical improbability it faces. In this post I shall lay out this improbability and offer some comparative calculations to demonstrate its unreasonableness.

In his book *Signature in the Cell*¹, Dr. Stephen Meyer explains how building a living cell requires not only specified information, but a vast amount of it—and the probability of this amount of specified information arising by chance is vanishingly small. But how small? What exactly is the probability that the information necessary to build the first living cell would arise by chance alone? During Darwin’s time, it was virtually impossible to quantify the answer to this question because biologists did not know exactly how much information was necessary to build and maintain the simplest living cell. But beginning in the 1990s, scientists began to do “minimal complexity” experiments in which they tried to reduce cellular function to its simplest form. Thus, biologists have been able to make increasingly informed estimates of the minimum number of proteins and genes that a hypothetical FLC might have needed to survive.

Mycoplasma genitalium, a tiny bacterium that inhabits the human urinary tract, is the simplest extant cell. It requires 482 proteins to perform its necessary functions and 562,000 nucleotide bases of DNA (just under 1,200 base pairs per gene) to assemble those proteins. Based upon the most conservative minimal-complexity experiments, some scientists speculate (but have not yet demonstrated) that FLC might have been able to survive with as few as 250–400 genes. Of course, building a functional cell would require much more than just the genetic information that directs protein synthesis; at the very least, it would also need a sizable preexisting suite of proteins and RNA molecules—polymerases, tRNAs, ribosomal RNAs, mRNAs, synthetases, ribosomal proteins, etc.—to process and express the information stored in DNA. (Building such a cell also would have required other preexisting components. But I will be generous to naturalism and grant them all for the sake of the argument.)

To calculate this probability, scientists typically use a slightly indirect method. First they calculate the probability of a single functional protein of average length arising by chance alone. Then they multiply that probability by the probability of each of the other necessary proteins arising by chance. The product of these probabilities gives the probability that all the proteins necessary to service a minimally complex cell would come together by chance. Further, since measures of probability and information are related (albeit inversely), the probability that a particular gene would arise by chance is roughly the same as the probability that its corresponding gene product (the protein that the gene encodes) would do so. For that reason, the relevant probability calculation can be made either by analyzing the odds of arranging amino acids into a functional protein or by analyzing the odds of arranging nucleotide bases into a gene that encodes that protein.

Consider the way this combinatorial problem might play itself out in the case of proteins in a hypothetical prebiotic soup. To construct even one short protein molecule of 150 amino acids by chance within the prebiotic soup, there are several combinatorial problems—probabilistic hurdles—to overcome. Let’s assume that FLC needs only 250 genes to survive, and let’s further assume that each of the proteins these genes encode is a mere 150 amino acids long. At each position along the chain, any one of the 20 protein-forming amino acids could occur with equal probability. What’s more, adjacent amino acids must form a peptide bond, or else the chain won’t fold into a functional protein. The problem is that amino acids form peptide and non-peptide bonds with equal probability. Thus, with each amino acid addition, the probability of forming a peptide bond is 1/2. Therefore, for a chain 150 amino acids long, the probability of forming all 149 bonds as peptide bonds is $(1/2)^{149}$, or about 1 in $10^{45}$.

Next, every amino acid found in proteins (except one) has a mirror image of itself: there is a left-handed version and a right-handed version. Functional proteins tolerate only left-handed versions. Yet, in abiotic amino acid production, right-handed isomers and left-handed isomers occur with roughly equal frequency. Therefore, the probability of attaining only L-form isomers in a chain of 150 amino acids is $(1/2)^{150}$, or again approximately 1 in $10^{45}$. Therefore, the probability of randomly forming a chain of 150 amino acids in which every bond is a peptide bond and every amino acid is a left-handed isomer is approximately 1 in $10^{90}$.

Finally, there is a third, and most important, independent requirement: the amino acids—like alphabetical characters in a meaningful sentence—must be in functionally specified sequential arrangements. Because there are 20 biologically occurring amino acids, the chance of getting any particular one of them at a specific site is 1/20. Molecular biologists have found that in many cases proteins can tolerate several of the 20 different amino acids at a given site along the chain without destroying function. Douglas Axe was able to experimentally demonstrate² that, even accounting for this tolerance, the odds of a random sequence 150 amino acids in length yielding a functional protein are about 1 in $10^{74}$.

Therefore, the probability that a 150-amino-acid chain assembled by random interactions in a prebiotic soup would form a functional protein is the product of the three individual probabilities:

Probability of only peptide bonds = 1 in $10^{45}$

Probability of only L-form isomers = 1 in $10^{45}$

Probability of achieving correct amino acid sequencing = 1 in $10^{74}$

Total = 1 in $10^{164}$

Therefore, if the FLC requires 250 proteins that are, on average, 150 amino acids in length, and each individual protein has a probability of 1 in $10^{164}$, then the probability of all 250 necessary proteins forming to service a minimally complex cell is 1 in $10^{41,000}$!
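These figures can be checked with a short calculation. The sketch below is my own illustration; it works in log₁₀ space, since a number like $10^{-41,000}$ underflows an ordinary floating-point value:

```python
import math

# Per-protein hurdles for a 150-amino-acid chain, as log10 probabilities
log_p_peptide = 149 * math.log10(1 / 2)   # all 149 bonds are peptide bonds
log_p_chiral = 150 * math.log10(1 / 2)    # all 150 residues are L-isomers
log_p_sequence = -74                      # Axe's functional-sequence estimate

# One functional protein: exponents add when probabilities multiply
log_p_protein = log_p_peptide + log_p_chiral + log_p_sequence
print(f"one protein: 1 in 10^{-log_p_protein:.0f}")   # ≈ 1 in 10^164

# 250 such proteins for a minimally complex cell
log_p_cell = 250 * log_p_protein
print(f"250 proteins: 1 in 10^{-log_p_cell:.0f}")     # ≈ 10^41,000 (to the nearest round figure)
```

Multiplying 250 copies of roughly $10^{-164}$ gives roughly $10^{-41,000}$, which is the figure used in the rest of the post.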

1 in $10^{41,000}$. Here are a couple of ways to demonstrate just how astronomical this number is:

First, in order to win the Mega Millions jackpot, you must correctly select 5 numbers from a pool of 56, while also selecting 1 additional number from a pool of 46. The probability of winning once? 1 in 175,711,536. Considering there are 2 drawings per week, you would need to win roughly 4,973 consecutive drawings to achieve something with a probability comparable to that of the first living cell forming by chance. In other words, you have a better chance of winning the jackpot in every consecutive drawing for the next 48 years!
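A quick script confirms the order of magnitude (the exact drawing count depends on rounding; the odds are those of the 5-of-56 plus 1-of-46 format described above):

```python
import math

# Odds of a single Mega Millions jackpot: choose 5 of 56, plus 1 of 46
one_win = math.comb(56, 5) * 46
print(one_win)  # 175711536

# How many consecutive wins have combined odds of about 1 in 10^41,000?
wins = 41_000 / math.log10(one_win)   # ≈ 4,973 consecutive drawings
years = wins / (2 * 52)               # two drawings per week ≈ 48 years
print(round(wins), round(years))
```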

Finally, all matter in the universe is composed of subatomic particles, and these particles are virtually infinitesimal in size. For instance, the number of particles in a 16-ounce glass of water is much greater than the number of glasses it would take to scoop all of the water out of all the oceans on the earth. In other words, there are lots of particles in even a small glass of water. These particles are scattered throughout our observable universe. How big is our observable universe? Light travels at a speed of 186,282 miles per second, and yet it would take light approximately 93,000,000,000 years to go from one edge of the observable universe to the other. How many particles are scattered throughout this astronomical volume of space? About $10^{80}$. This number defies the imagination, but compared to $10^{41,000}$, $10^{80}$ is infinitesimal. What’s more, each of these particles can change its position $10^{45}$ times per second; each such change is known as an “event.” Even if the universe were a hundred million times older than its estimated age of roughly $10^{17}$ seconds, only about $10^{25}$ seconds would have elapsed since the Big Bang. Therefore, the total number of events that could have occurred (the product of the number of particles, the rate at which each can change its position, and the seconds elapsed) is $10^{150}$. That is every event that has ever occurred, and then some.
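Since Python integers have arbitrary precision, this event-count bound can be computed exactly from the three numbers just cited:

```python
# Upper bound on the number of events since the Big Bang
particles = 10**80        # subatomic particles in the observable universe
events_per_sec = 10**45   # maximum position changes per particle per second
seconds_elapsed = 10**25  # generous upper bound on elapsed time

total_events = particles * events_per_sec * seconds_elapsed
print(total_events == 10**150)  # True
```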

Compare $10^{150}$ to $10^{41,000}$. If I told you that you had one chance to guess which specific 273 events—out of every event that has ever occurred since the beginning of the universe—I am thinking of, you would have a better chance of picking all 273 on your first guess than a living cell has of forming by chance! Remember, there have been $10^{150}$ events (a 1 followed by 150 zeros). Anyone want to bet that they can pick all 273 in one guess?
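The figure of 273 events comes from asking how many independent 1-in-$10^{150}$ guesses have a combined improbability matching 1 in $10^{41,000}$:

```python
# Each guess at one specific event succeeds with probability 1 in 10^150,
# so k independent guesses all succeed with probability 1 in 10^(150 * k).
# Solve 150 * k = 41,000 for the number of guesses:
k = 41_000 / 150
print(k)  # 273.33..., hence "273 events"
```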

## The Solution to Fatalism

Have you ever heard someone ask, “If God is omniscient (i.e., all-knowing) and therefore already knows what I am going to choose, then how am I free to choose?” What the question is fundamentally asking is, How can God’s omniscience and human libertarian free will possibly coexist?

Before answering the question, let us first define some terms. The standard definition of omniscience says that, for any person $S$, $S$ is omniscient if and only if $S$ knows all true propositions and believes no false proposition. What this entails is that God, being omniscient, must know the truth value of all future tense propositions (e.g., Jones will sign with the Saints); He would be said to have foreknowledge of the events described by these propositions.

On the standard definition of libertarianism, a person $S$ has libertarian freedom of the will if and only if, when given a choice to do either $x$ or $\neg x$, nothing determines the choice that $S$ will make; he simply exercises his own causal powers to either choose $x$, or to refrain from choosing $x$.

Finally, fatalism is the view that everything that happens does so necessarily (i.e., it could not possibly have happened otherwise than it did in fact happen).

With those definitions in mind, let us now get back to the question that was posed. For many people, it would seem that omniscience implies fatalism: if God already knows what I’m going to choose, then I can’t possibly choose differently because, if I were to, it would entail that God would be wrong about what He believed that I would choose (i.e., He would have believed a false proposition); which would thus entail that He is not in fact omniscient.

Is there any way to reconcile free will and omniscience and solve this “dilemma?” Yes, there is. For starters, what needs to be understood is that logic of this sort commits a modal fallacy: it is what medieval philosophers called confusing the necessitas consequentiae (the necessity of the consequence, i.e., of the inference) with the necessitas consequentis (the necessity of the consequent). Formally, the reasoning is as follows:

1. $\square(P \implies Q)$
2. $P$
3. $\therefore \square Q$

Let $P$ represent the proposition “God believes Jones will sign with the Saints.” Let $Q$ represent the proposition “Jones will sign with the Saints.” When we plug these variables into the argument, we get:

1. Necessarily, if God believes that Jones will sign with the Saints, then Jones will sign with the Saints.
2. God believes that Jones will sign with the Saints.
3. Therefore, necessarily, Jones will sign with the Saints.

Why is this reasoning a fallacy of modal logic? Because the proper inference is (3′) Therefore, Jones will sign with the Saints. Did you catch the difference between (3) and (3′)? The latter excludes the word necessarily. Indeed, all that follows from the premises is that Jones will sign with the Saints; it does not follow that he will do so necessarily. When the time comes to make the choice whether or not to sign with the Saints, Jones is completely free to do as he pleases—he can either sign, or not sign. Thus, his choice—no matter what it is—will be a contingent truth, not a necessary truth.
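To make the modal scope explicit, the invalid and valid forms can be set side by side, using the same symbols as above:

```latex
\begin{align*}
&\text{Invalid:} & \square(P \implies Q),\; P &\;\vdash\; \square Q
  && \text{(modal scope fallacy)} \\
&\text{Valid:}   & \square(P \implies Q),\; P &\;\vdash\; Q
  && \text{($Q$ is true, but only contingently)}
\end{align*}
```

The necessity operator governs the whole conditional, not its consequent; detaching $Q$ by modus ponens yields only $Q$, never $\square Q$.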

What premise 1 entails is that if God believes that Jones will sign with the Saints, then he will sign with the Saints. But it is not necessary that God believe this. Indeed, the source of God’s belief here is the free choice that Jones will make when the time comes. Suppose when the time comes Jones will choose to sign; God will have known that all along. But suppose instead that when the time comes Jones chooses not to sign; all that follows is that God would have believed that all along instead. Hence, there is no dilemma.

## Extending the BGV Theorem to Cosmogonic Models Positing an Infinite Contraction

You all will have to forgive me for making such a technical post as this blog’s first; it just so happens that this has been the topic of my recent research. Sorry.

For those of you who do not know, a kinematic incompleteness theorem (http://arxiv.org/abs/gr-qc/0110012v2) was proven in 2003 that (in very non-technical terms) demonstrates that any universe which has been, on average, expanding throughout its history cannot be eternal to the past, but must have a past spacetime boundary (i.e., a beginning). Well, it just so happens that our universe has been, by all appearances, expanding throughout its history.

However, there have been many attempts to craft highly speculative cosmogonic models that evade the Borde-Guth-Vilenkin theorem (hereafter, BGV). One such model posits an infinite contraction prior to a bounce, followed by a subsequent expansion phase. It is my contention that such a model does not in fact evade the theorem, and in this post I shall endeavor to articulate why I believe BGV conflicts with such spacetimes. I want to first lay out some basics:

1. A spacetime is past-incomplete if there is a null (or timelike) geodesic, maximally extended to the past, that is finite in length (i.e., of finite affine length).

2. As long as the expansion rate averaged over the affine parameter $\lambda$ along a geodesic is positive ($H_{av} > 0$), BGV proves that there will be causal geodesics that, when extended to the past of an arbitrary point, reach the boundary of the inflating region of spacetime in a finite proper time $\tau$ (a finite affine length, in the null case).

3. The measure of temporal duration from $t = -\infty$ is a quantity that is actually infinite ($\aleph_0$) rather than potentially infinite ($\infty$).

Now, if the velocity of a geodesic observer $O$ (relative to comoving observers in an expanding congruence) in an inertial reference frame is measured at an arbitrary time $t_0$ to be any finite nonzero value, then she will necessarily reach the speed of light at some time $t_c$, and the interval $(t_0, t_c)$ will have a finite value. If an infinite contraction preceded the “bounce” (and indeed it must do so necessarily if a beginning is to be avoided), then the time coordinate $t$ will run monotonically from $-\infty$ as spacetime contracts during $t < 0$, bounces at $t = 0$, then expands for all $t > 0$.

Thus, if what I have argued above is correct, then the implications are unmistakable. As long as:

1. $O$ is a non-comoving geodesic observer;
2. $O$ is in an inertial reference frame;
3. $O$ is moving from $t = -\infty$;
4. $O$ has been tracing a contracting spacetime where $H_{av} < 0$;

it therefore follows that the relative velocity of $O$ will grow larger and larger as she approaches the bounce at $t = 0$. Moreover, since we know that $O$ will reach the speed of light in a finite proper time $\tau$, coupled with the fact that the interval $(-\infty, 0)$ is infinite, we can be sure that she will reach the speed of light well before ever making it to the bounce, and therefore the spacetime cannot be geodesically complete.
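A toy numerical sketch (my own illustration, not from the BGV paper) shows the behavior being exploited here. In an FRW spacetime, a free non-comoving observer’s peculiar momentum scales as $p \propto 1/a(t)$, so as the scale factor $a$ shrinks during contraction, her speed relative to comoving observers climbs toward the speed of light:

```python
import math

def speed(p):
    """Speed (in units where c = 1) of a unit-mass particle with momentum p."""
    return p / math.sqrt(1.0 + p * p)

a0, p0 = 1.0, 0.1  # scale factor and peculiar momentum at a reference time
for a in [1.0, 0.5, 0.1, 0.01, 0.001]:  # contraction: a decreases toward the bounce
    p = p0 * a0 / a                     # momentum blueshifts as a shrinks
    print(f"a = {a:7.3f}   v = {speed(p):.6f}")
# v climbs monotonically toward v = 1 (the speed of light) as the
# contraction proceeds, which is why an infinite contraction cannot
# accommodate a complete non-comoving geodesic.
```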