# writer, speaker, content creator

## You Know What Doesn’t Stand Up to Logic? The Book of Genesis.

In Religion on January 5, 2010 at 10:38 am

Addendum: Seph has corrected my reasoning in this post in the comments section. It turns out the probability does not actually reach one. I shouldn't try to post about math, but I still think it's utterly unreasonable to assume that a set of humans, over an infinite amount of time, wouldn't try to do something interesting.

There are lots of things that bug me about the Bible. Lots of things. (And yes, this atheist has actually read every single word of the New Jerusalem edition that resides at my parents' house. I was a precocious teenager…)

Today’s Dinosaur Comics reminds me of a particular annoyance of mine: Adam and Eve are essentially in a no-win situation in Genesis. Setting aside issues of Biblical literalism and evolution for the time being, the math just doesn’t work.

Here's the problem: at this point, Death hasn't been introduced into the world yet, so Adam, Eve, and everything around them are immortal and can exist for an eternity. Also, the Tree of Knowledge is just sitting right there, and every day there's a chance that they might eat from it.

So, we've got a system where at every moment there is a probability that something might happen, as well as an infinite amount of time. Over an infinite amount of time, the probability of anything (except something with zero probability) becomes one. Therefore, over an infinite amount of time, it is a mathematical certainty that one of them will eat the fruit. (And it does just say "fruit" in the text; it's only an apple by tradition.)

Think about it: if you roll a six-sided die an infinite number of times, rolling a four at some point becomes certain. Over an infinite amount of time, the probability of humanity being expelled from the garden likewise becomes certain.
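The die claim is easy to check numerically. A quick sketch (the function name is mine, not from the post; this is just standard probability for independent rolls):

```python
# Chance of seeing at least one four in n rolls of a fair die:
# 1 - (5/6)**n, which climbs toward 1 as n grows.
def p_at_least_one_four(n: int) -> float:
    return 1 - (5 / 6) ** n

for n in (1, 10, 100):
    print(n, p_at_least_one_four(n))
```

After 100 rolls the probability of never having seen a four is already down around one in a hundred million.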

In conclusion, religion is kind of silly. One could point out things like this all day, but that would just be kind of cheap and misanthropic.

1. There was actually a long debate about this in a freshman-year literature class I took (talking about Paradise Lost). There's certainly nothing illogical about it, though. Unfair, perhaps, but not illogical. Maybe God's just mean?

That being said, it's not true that with infinite time the probability of anything happening becomes one.

Consider the following: some event (eating the fruit?) happens with probability 1/4 on the first day. It happens with probability 1/16 the second day, assuming it hasn't happened on the first day, and then 1/64 the third day, and so forth (i.e., on a given day i, the probability of the fruit being eaten is 1/(4^i), assuming they hadn't eaten it on a previous day).

In that case, the probability that the fruit is ever eaten is just under 1/3 (the value, courtesy of my officemate C, is approximately 0.31146246288), which is neither zero nor one.
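That number can be reproduced in a few lines: the probability the fruit is never eaten is the product of the per-day "not today" probabilities, so a sketch (function name mine) just multiplies those out:

```python
# P(fruit is ever eaten), where the chance on day i (given no earlier
# success) is 1/4**i: one minus the product of the "not today" terms.
def p_ever_eaten(days: int) -> float:
    p_never = 1.0
    for i in range(1, days + 1):
        p_never *= 1 - 0.25 ** i
    return 1 - p_never

print(p_ever_eaten(60))  # ~0.31146, well short of certainty
```

The product converges extremely fast, so 60 days is already far more than enough to see the limit.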

So, no, humanity need not be expelled from the garden of Eden. Maybe God just enjoys infinite series.

2. Okay, so why does the probability of the fruit being eaten go down with time? If you have a die, the chance of it showing a three is 1/6 every single time. Why is this system different?

3. Why does the probability of eating the fruit have to be the same every day? Maybe the fruit looks less appealing every day as the novelty value wears off.

4. Well, if anything you'd think the probability would increase. The fruit is always there, but God's prohibitions become more and more distant with each passing day.

What I'm curious about, though, is what you said about how it's not true that the probability of anything happening over infinite time is one.

I keep thinking about a die, an everyday sort of random number generator. One would assume that over infinite time that die would generate an infinite number of ones, twos, threes, fours, fives, and sixes. So, because we know it will generate an infinite number of threes, the probability of just one three occurring is one, right?

5. Might increase, might decrease, who knows? I'm not going to debate that.

That's not the way the math works, though. For a die, the probability of rolling a three on any given day is 1/6. For two days, the probability that you roll a three on either day is the probability that you roll two dice and get at least one three (11/36). For three days, it's 91/216 (the details of why don't really matter).

For any finite number of days, there is a probability less than one that you roll a three. As the number of days approaches infinity, that probability gets arbitrarily close to one (but is never actually one, since you can't "reach" infinity).

Indeed, in any case where you have a constant probability of an event occurring, the probability of it occurring approaches one with an infinite number of trials (but, again, never reaches one, 'cause you can't "reach" infinity).

In the case I present, though, the probability of the event happening on the first day is 1/4. The probability of the event happening on the second day is the probability it didn't happen on the first day (3/4) times the probability it happens on the second day (1/16), or 3/64. The probability that it happens on either of the first two days is then 1/4+3/64=19/64. By the same logic, the probability it happens on one of the first three days is 1261/4096. And so on.

With this particular function, it turns out that as the number of days approaches infinity, the probability of the event ever happening approaches 0.31146246288, not 1. (It'll never actually reach that value, since you can't reach infinity, but it'll get arbitrarily close).
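The day-by-day figures above (19/64, 1261/4096) can be verified exactly with rational arithmetic rather than floats; a minimal sketch:

```python
from fractions import Fraction

# Probability the fruit has been eaten by day n, where the per-day
# (conditional) probability on day i is 1/4**i.
def p_by_day(n: int) -> Fraction:
    p_not_yet = Fraction(1)
    for i in range(1, n + 1):
        p_not_yet *= 1 - Fraction(1, 4**i)
    return 1 - p_not_yet

print(p_by_day(1))  # 1/4
print(p_by_day(2))  # 19/64
print(p_by_day(3))  # 1261/4096
```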

It's a weird thing about math that you can add together infinitely many positive numbers and get a finite result. For example, for n = 1 to infinity, the sum of 1/2^n is one, which is kinda cool, since you're adding an infinite number of terms together.
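The geometric series is easy to watch converge; a quick sketch of its partial sums:

```python
# Partial sums of 1/2 + 1/4 + 1/8 + ... approach exactly 1.
def geometric_partial_sum(terms: int) -> float:
    return sum(1 / 2**n for n in range(1, terms + 1))

print(geometric_partial_sum(10))  # 0.9990234375
print(geometric_partial_sum(50))  # very nearly 1.0
```

Each partial sum falls short of 1 by exactly 1/2^n, which is why the limit is 1 but no finite sum ever gets there.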

But I digress.

6. I am glad there was discussion on this. The old, atrophied part of my brain that could handle math of this nature insisted that the probability could never hit 1/1. It is kind of like the gambler's fallacy. The fact that Eve (or Adam) didn't eat the apple 4,328 days in a row doesn't make it any more or less likely that it will be eaten on the 4,329th day, all other things being equal. Although, as Seph has shown, over enough time it is likely to be eaten.

7. Your edit is close, but not quite right. If the probability of eating the apple is constant then, given an infinite number of days, they will indeed eat the apple. On any given day, there's some (small) probability that they won't eat the apple. But over an infinite number of days, they will eat it.

But if the probability isn't constant over time, then the probability they eat the apple can be non-zero but less than one, even given an infinite number of days. Which is weird and non-intuitive, I agree. But still true.

Beau: You're right. There's some linguistic ambiguity that makes expressing this a little challenging. A mathematician would say that "the probability of eating the apple by day n approaches one as n approaches infinity." You can't really say that "the probability of eating the apple over an infinite number of days is one", since you can't experience an infinite number of days (infinity is not a number). You can say that the probability becomes arbitrarily close to one, given a sufficiently long (finite) period of time.

With some functions, this is true. With others (like my example), the probability it approaches need not be one. Your head asplode yet?
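The two cases being contrasted here can be put side by side as limits (using p for a constant per-day probability; the notation is mine, not from the comment):

```latex
% Constant per-day probability p > 0: eating the fruit is certain in the limit.
\lim_{n \to \infty} \Bigl[ 1 - (1 - p)^n \Bigr] = 1

% Shrinking per-day probability 1/4^i: the limit falls short of one.
\lim_{n \to \infty} \Bigl[ 1 - \prod_{i=1}^{n} \Bigl(1 - \frac{1}{4^i}\Bigr) \Bigr] \approx 0.31146
```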

8. Hmm…I accept the idea WITH a geometric series, but logic says the probability of eating fruit from a tree in a garden that you're stuck in would not actually approach zero, which means it wouldn't be a geometric series, which means they'd totally eat it eventually.

9. fredgoat: yeah, I can't imagine that the probability would follow that pattern, either. I just enjoy being mathematically rigorous.

10. This is of course all assuming that the desire to eat the apple is consistent throughout. What if they forget the apple exists?