The Probabilities of Accidental Creation -vs- Complexity
 
PREFACE

We are all familiar with the idea that "if we keep on trying" we may eventually "get it right."

If no intelligence were in this universe, then maybe there could be endless occurrences of random accidents that would just keep on happening until something functional (complex) was produced.  We will explore those logical concepts later, on a companion web page.  On this web page, let's explore the mathematical concepts, possibilities, and constraints, and answer questions like:

What are the Mathematical Probabilities of Random / Accidental Creation -vs- Complexity?

What would it take to Create Something Complex from Random Accidents?

What are the odds that Something Complex comes into being, as the result of Random Accidents?

What are the probabilities (chances) of creating useful complexity either randomly or accidentally?

Complexity .. Measuring Complexity .. Quantifying Complexity .. Mathematics that describes Complexity

Typical statements involving concepts and degrees of complexity:

Simple:  Something that is very simple (not complex) should happen easily or frequently.
Complex: Something that is very complex will probably never happen randomly or accidentally.

Simple:  The odds are high that it could happen.
Complex: The odds are very low that it will ever happen.  The chances of it ever happening are very low.

Simple:  It happens every few "tries."  For example, flipping 3 pennies all at once comes up (from left to right) "heads, heads, tails" on the average of once every 8 times (once every 8 tries).  8 tries = 2*2*2 = 2^3 = 2^(3 pennies) = 2^(3 bits of info).
Complex: The odds/chances of it happening are so low that it won't happen in a "billion years."  The odds are so low that I would not bet any money on it, nor my future, nor my eternity.
 
Don't mind the math.  You probably learned all of it in school, but have since forgotten it.  The important thing here is what the math is telling us, the results.  Read through the math to get the gist of what is going on, but don't worry about any math details you are rusty on.  You can always get someone else later to explain and refresh your memory on the math details.
 
 
Introduction to Measuring Complexity

From a "theory of evolution" perspective, everything that happens must be a truly random event or accident (with no plan, rhyme, or reason, or help, or intelligence from God).  So let's rely on a purely mathematical analysis of what happens.  (i.e. Let's rely on pure science; nothing religious about it.)

How do we measure complexity?  In our computer era, complexity is easily measured by the number of "bits of information" that it takes to describe the complexity of a created thing.  Complexity may include such things as spatial layouts, chemical or biological composition of materials used,  functional characteristics, feedback loops, programmed instructions, necessary information, et cetera.
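As a concrete illustration, here is a minimal Python sketch of this kind of measurement (the file path is only a placeholder; point it at any file you actually have):

    # Measure "complexity" the way this page does:
    # size in bytes times 8 bits per byte.
    import os

    path = r"C:\WINDOWS\notepad.exe"   # placeholder path -- use any file you have
    size_bytes = os.path.getsize(path)
    print(f"{size_bytes:,} bytes = {size_bytes * 8:,} bits of information")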
 
 
Measuring the complexity of something very complex ( Windows XP )

An easy place to start is to measure the complexity of something very complex like the Windows XP operating system.

By doing a right-click on C:\WINDOWS and selecting "Properties", my PC tells me that my Windows XP operating system has a size of about 2.5 GB.  Since each byte contains 8 "bits of information", we can say that about 20 billion "bits of information" are required to describe the complexity of this Windows XP operating system.  ( 2.5 GB = 2.5 GigaBytes ≈ 2.5 Billion Bytes.  2.5 Billion Bytes times 8 bits per byte = 20 billion "bits of information." )

The probability (odds, chances) of randomly or accidentally exactly duplicating this 20 billion "bits of information" is 1 in 2^(20 billion) = 1 in 10^(20 billion x 0.301029995664) = 1 in 10^(6,020,599,913) =
1 in "1 followed by 6,020,599,913 zeros".

We can conclude, with as close to absolute certainty as we can imagine, that this 20 billion "bits of information" is not going to be randomly or accidentally duplicated (or come into being that way).

Hopefully, we have grasped the concept of describing "complexity" in terms of its "bits of information."
 
 
Measuring the complexity of something that seems relatively simple (but surprisingly, is not)

Now, using a very simple example, let's explore how difficult it would be to accidentally, or randomly, create something that has any significant degree of complexity.

In computer lingo, we know that each letter can be represented by a byte, and that each byte can be represented by 8 "bits of information."  
word to create:  "example"  =  7 letters  =  7 bytes

value of bits (8 bits per byte):
0110 0101   0111 1000   0110 0001   0110 1101   0111 0000   0110 1100   0110 0101   = 56 "bits of information"
   e           x           a           m           p           l           e

heads/tails ( h = heads = 1,  t = tails = 0 ):
thht thth   thhh httt   thht ttth   thht hhth   thhh tttt   thht hhtt   thht thth   = 56 "bits of information"
 

Each "bit of information" has a required value of 0 or 1, shown in the table above.

Now if we randomly (accidentally) flip each of the 56 pennies, the pennies can then be checked to see if they match the "thht thth  thhh httt  thht ttth  thht hhth  thhh tttt  thht hhtt  thht thth" sequence shown in the table above.  If we have a match, we have randomly (accidentally) created a representation of the 7-letter word "example".  If there is no match, we must randomly (accidentally) flip each of the 56 pennies again, and again, until we randomly (accidentally) create a match.
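Here is a minimal Python sketch of this penny experiment.  One assumption to note: matching all 56 pennies would take about 2^56 tries, far too many to simulate, so the demo below matches only the first two letters, "ex" (16 pennies, about 65,536 tries expected):

    # Flip pennies at random until they match the heads/tails pattern of a word.
    import random

    def to_flips(word):
        """Heads/tails pattern for a word: h = 1, t = 0, 8 ASCII bits per letter."""
        bits = "".join(f"{ord(c):08b}" for c in word)
        return bits.replace("1", "h").replace("0", "t")

    target = to_flips("ex")          # 16 pennies; "example" would need all 56
    tries = 0
    while True:
        tries += 1
        flips = "".join(random.choice("ht") for _ in target)
        if flips == target:
            break
    print(f"matched {len(target)} pennies after {tries:,} tries "
          f"(expected about 2^{len(target)} = {2 ** len(target):,})")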
 
 
 
Flip how many times to get a match?

Our next question is likely to be:  On the average, how many times do we have to flip all 56 pennies to get a match?

In our case, on the average, we would have to flip all 56 pennies 2^56 times to get a match (2^56 times on the average per match).  Can you believe how many times we have to flip all 56 pennies, just to randomly (accidentally) create (a representation of) a simple 7-letter word like "example"?  Believe it.  It's all in the mathematics, and the math is pure science (not religion).
 
 
 
If we flip once a second, how long would it take?

Let's ask still another question.
If the 56 pennies were all flipped once every second, on the average, how much time would elapse between matches?  (i.e. How long would it take, on the average, to get a match?)

The answer is:  2^56 = 72,057,594,037,927,936 flips, and at one flip of all 56 pennies per second, that is 72,057,594,037,927,936 seconds, which is about 2.28 billion years.

Can you believe how long it would take to flip all 56 pennies (at one flip of all 56 pennies each second), just to randomly (accidentally) create (a representation of) a simple 7-letter word like "example"?  2.28 billion years?  Believe it.  It's all in the mathematics, and the math is pure science (nothing religious about it).  ( Note:  If you want to introduce the maximum bit compression into this analysis, see the explanation at the end of this web page.  Using an uncompressed 7-letter word like "example" here will give us approximately the same results as a compressed 12-letter word like "conventional", as shown at the end of this web page. )
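A quick Python check of this elapsed-time arithmetic (a sketch, assuming a 365.25-day year):

    # 2^56 flips at one flip of all 56 pennies per second.
    tries = 2 ** 56                           # 72,057,594,037,927,936
    seconds_per_year = 365.25 * 24 * 3600     # about 31,557,600
    print(f"{tries:,} seconds = {tries / seconds_per_year:.2e} years")
    # -> about 2.28e+09 years, i.e. 2.28 billion years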
 
Increasing the complexity by adding "bits of information"

What if we want to increase the complexity of what is going to be randomly (accidentally) created?

A simple rule of thumb is:  For each 10 "bits of information" we start with or add to the complexity of something, we must add 3 more zeros to the number of times all the pennies have to be flipped to get a match (because each 10 bits multiplies the number of tries by 2^10 = 1,024 ≈ 1,000).

The formula we used for the Windows XP analysis above is repeated here:

    average number of tries = 2^(# bits) = 10^(# bits x 0.301029995664)

Also, since each byte contains 8 "bits of information", we can use 8-bit bytes of information in the formula:

    average number of tries = 2^(8 x # bytes)
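To see the formula and the rule of thumb in action, here is a minimal Python sketch (the function name zeros_in_tries is illustrative, not from this page):

    # Each extra 10 bits multiplies the tries by 2^10 = 1,024, i.e. adds
    # about 3 zeros, since 10 x log10(2) = 3.0103.
    import math

    def zeros_in_tries(bits):
        """Approximate decimal exponent of 2^bits."""
        return bits * math.log10(2)

    for bits in (56, 66, 76):
        print(f"{bits} bits -> about 10^{zeros_in_tries(bits):.1f} tries")
    # -> 10^16.9, 10^19.9, 10^22.9 : three more zeros per 10 bits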
 
 
Human DNA has 3 billion DNA base pairs, which include some 20,000 protein-coding genes

What are the chances that human DNA came into being by random accidents?

According to Wikipedia, the human genome is the genome of Homo sapiens, which is stored on 23 chromosome pairs.  The haploid human genome occupies a total of just over 3 billion DNA base pairs, and has a data size of approximately 750,000,000 bytes.

Also, Wikipedia states that the haploid human genome contains an estimated 20,000 - 25,000 protein-coding genes.  In fact, only about 1.5% of the genome codes for proteins.  For the moment, let's focus on only that 1.5%:  3 billion base pairs x 0.015 = 45 million base pairs, and at 2 "bits of information" per base pair (each base is one of 4 possibilities), that is 90 million "bits of information."  Using our formulas above, we can calculate that the chances (the odds) that human DNA's 20,000 protein-coding genes came into being by random accidents are 1 in 10^(90,000,000 x 0.301029995664) = 1 in "1 followed by 27,092,700 zeros" (tries, or attempts, or times).  Once again, because of the immense complexity of DNA, we can conclude, with as close to absolute certainty as we can imagine, that these 90 million "bits of DNA information" are not going to randomly or accidentally come into being.
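A short Python check of this DNA arithmetic (a sketch; the 2-bits-per-base-pair figure follows from each base being one of 4 possibilities):

    # 1.5% of 3 billion base pairs, at 2 bits of information per base pair.
    import math

    base_pairs = 3_000_000_000
    coding_bits = base_pairs * 0.015 * 2      # = 90,000,000 bits
    zeros = coding_bits * math.log10(2)
    print(f"{coding_bits:,.0f} bits -> 1 chance in 10^{zeros:,.0f}")
    # -> 90,000,000 bits -> 1 chance in 10^27,092,700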

( Note:  Above, we have been focusing on "bits of information" that represent the genetic code for all human beings.  However, in DNA detective work, we would focus on the differences in the human genetic code that would give us a unique DNA match or signature for each individual. )
 
Conclusions

As shown in the various analyses above, mathematically speaking, the probability of something very complex being randomly (accidentally) created is as close to zero as we can imagine or fathom.

Stated another way:  It is mathematically totally improbable that anything very complex ever came into being by accident or by any other random happening.

"Theory of evolution" "experts" are simply deemed "experts", by referring to themselves, and to each other, as "experts." If we seek the truth about the "theory of evolution" by thinking it through for ourselves, we will ultimately discover that it is neither a "fact," nor is it even true.  It is a "theory" full of flaws, and, as we have just calculated, a "theory" full of mathematical improbabilities. Conclusion:  Mathematically speaking, the "theory of evolution" is full of mathematical improbabilities.
 
 
 
 
 
 
 
 
 
 
"Complexity"  to  "Number of Tries / Attempts / Times"  conversion  TABLE

The table below enables us to quickly convert the complexity of something (measured in "bits of information") to the average number of tries / attempts needed to randomly (accidentally) create something of that complexity.     In this table, binary (computer) terminology is used where:

# TiB | # GiB | # MiB | # KiB | # bytes |  # bits | Average number of tries = 2^(# bits)
------+-------+-------+-------+---------+---------+--------------------------------------
      |       |       |       |         |       1 | 2 E+0   =  2
      |       |       |       |         |       2 | 4 E+0   =  4
      |       |       |       |         |       3 | 8 E+0   =  8
      |       |       |       |         |       4 | 1.6 E+1   =  16
      |       |       |       |         |       5 | 3.2 E+1   =  32
      |       |       |       |         |       6 | 6.4 E+1   =  64
      |       |       |       |         |       7 | 1.28 E+2   =  128
      |       |       |       |       1 |       8 | 2.56 E+2   =  256
      |       |       |       |         |       9 | 5.12 E+2   =  512
      |       |       |       |         |      10 | 1.024 E+3   =  1,024
      |       |       |       |       2 |      16 | 6.554 E+4   =  65,536
      |       |       |       |         |      20 | 1.049 E+6   =  1,048,576
      |       |       |       |       3 |      24 | 1.678 E+7   =  16,777,216
      |       |       |       |         |      30 | 1.074 E+9   =  1,073,741,824
      |       |       |       |       4 |      32 | 4.295 E+9   =  4,294,967,296
      |       |       |       |       5 |      40 | 1.100 E+12  =  1,099,511,627,776
      |       |       |       |       6 |      48 | 2.815 E+14  =  281,474,976,710,656
      |       |       |       |         |      50 | 1.126 E+15  =  1,125,899,906,842,620 *
      |       |       |       |       7 |      56 | 7.206 E+16  =  72,057,594,037,927,900 *
      |       |       |       |         |      60 | 1.153 E+18  =  1,152,921,504,606,850,000 *
      |       |       |       |       8 |      64 | 1.845 E+19  =  18,446,744,073,709,600,000 *
      |       |       |       |         |      70 | 1.181 E+21
      |       |       |       |       9 |      72 | 4.722 E+21
      |       |       |       |      10 |      80 | 1.209 E+24
      |       |       |       |      11 |      88 | 3.095 E+26
      |       |       |       |         |      90 | 1.238 E+27
      |       |       |       |      12 |      96 | 7.923 E+28
      |       |       |       |         |     100 | 1.268 E+30
      |       |       |       |      20 |     160 | 1.462 E+48
      |       |       |       |      30 |     240 | 1.767 E+72
      |       |       |       |      40 |     320 | 2.136 E+96
      |       |       |       |      50 |     400 | 2.582 E+120
      |       |       |       |      60 |     480 | 3.122 E+144
      |       |       |       |      70 |     560 | 3.774 E+168
      |       |       |       |      80 |     640 | 4.562 E+192
      |       |       |       |      90 |     720 | 5.516 E+216
      |       |       |       |     100 |     800 | 6.668 E+240
      |       |       |       |     200 |   1,600 | 4.446 E+481
      |       |       |       |     300 |   2,400 | 2.965 E+722
      |       |       |       |     400 |   3,200 | 1.977 E+963
      |       |       |       |     500 |   4,000 | 1.318 E+1,204
      |       |       |       |     600 |   4,800 | 8.790 E+1,444
      |       |       |       |     700 |   5,600 | 5.861 E+1,685
      |       |       |       |     800 |   6,400 | 3.908 E+1,926
      |       |       |       |     900 |   7,200 | 2.606 E+2,167
      |       |       |       |   1,000 |   8,000 | 1.738 E+2,408
      |       |       |     1 |   1,024 |   8,192 | 1.091 E+2,466
      |       |       |     2 |   2,048 |         | 1.190 E+4,932
      |       |       |     3 |   3,072 |         | 1.298 E+7,398
      |       |       |     4 |   4,096 |         | 1.415 E+9,864
      |       |       |     5 |   5,120 |         | 1.544 E+12,330
      |       |       |     6 |   6,144 |         | 1.684 E+14,796
      |       |       |     7 |   7,168 |         | 1.837 E+17,262
      |       |       |     8 |   8,192 |         | 2.004 E+19,728
      |       |       |     9 |   9,216 |         | 2.185 E+22,194
      |       |       |    10 |         |         | 2.384 E+24,660
      |       |       |    20 |         |         | 5.682 E+49,320
      |       |       |    30 |         |         | 1.354 E+73,981
      |       |       |    40 |         |         | 3.228 E+98,641
      |       |       |    50 |         |         | 7.695 E+123,301
      |       |       |    60 |         |         | 1.834 E+147,962
      |       |       |    70 |         |         | 4.372 E+172,622
      |       |       |    80 |         |         | 1.042 E+197,283
      |       |       |    90 |         |         | 2.484 E+221,943
      |       |       |   100 |         |         | 5.922 E+246,603
      |       |       |   200 |         |         | 3.507 E+493,207
      |       |       |   300 |         |         | 2.077 E+739,811
      |       |       |   400 |         |         | 1.230 E+986,415
      |       |       |   500 |         |         | 7.282 E+1,233,018
      |       |       |   600 |         |         | 4.312 E+1,479,622
      |       |       |   700 |         |         | 2.553 E+1,726,226
      |       |       |   800 |         |         | 1.512 E+1,972,830
      |       |       |   900 |         |         | 8.954 E+2,219,433
      |       |       | 1,000 |         |         | 5.302 E+2,466,037
      |       |     1 | 1,024 |         |         | 4.264 E+2,525,222
      |       |     2 | 2,048 |         |         | 1.819 E+5,050,445
      |       |     3 | 3,072 |         |         | 7.755 E+7,575,667
      |       |     4 | 4,096 |         |         | 3.307 E+10,100,890
      |       |     5 | 5,120 |         |         | 1.410 E+12,626,113
      |       |     6 | 6,144 |         |         | 6.015 E+15,151,335
      |       |     7 | 7,168 |         |         | 2.565 E+17,676,558
      |       |     8 | 8,192 |         |         | 1.094 E+20,201,781
      |       |     9 | 9,216 |         |         | 4.664 E+22,727,003
      |       |    10 |       |         |         | 1.989 E+25,252,226
      |       |    20 |       |         |         | 3.957 E+50,504,452
      |       |    30 |       |         |         | 7.871 E+75,756,678
      |       |    40 |       |         |         | 1.566 E+101,008,905
      |       |    50 |       |         |         | 3.114 E+126,261,131
      |       |    60 |       |         |         | 6.195 E+151,513,357
      |       |    70 |       |         |         | 1.232 E+176,765,584
      |       |    80 |       |         |         | 2.451 E+202,017,810
      |       |    90 |       |         |         | 4.875 E+227,270,036
      |       |   100 |       |         |         | 9.698 E+252,522,262
      |       |   200 |       |         |         | 9.405 E+505,044,525
      |       |   300 |       |         |         | 9.121 E+757,566,788
      |       |   400 |       |         |         | 8.846 E+1,010,089,051
      |       |   500 |       |         |         | 8.579 E+1,262,611,314
      |       |   600 |       |         |         | 8.320 E+1,515,133,577
      |       |   700 |       |         |         | 8.068 E+1,767,655,840
      |       |   800 |       |         |         | 7.825 E+2,020,178,103
      |       |   900 |       |         |         | 7.588 E+2,272,700,366
      |       | 1,000 |       |         |         | 7.359 E+2,525,222,629
      |     1 | 1,024 |       |         |         | 9.630 E+2,585,827,972
      |     2 | 2,048 |       |         |         | 9.274 E+5,171,655,945
      |     3 | 3,072 |       |         |         | 8.932 E+7,757,483,918
      |     4 | 4,096 |       |         |         | 8.601 E+10,343,311,891
      |     5 | 5,120 |       |         |         | 8.283 E+12,929,139,864
      |     6 | 6,144 |       |         |         | 7.977 E+15,514,967,837
      |     7 | 7,168 |       |         |         | 7.682 E+18,100,795,810
      |     8 | 8,192 |       |         |         | 7.398 E+20,686,623,783
      |     9 | 9,216 |       |         |         | 7.125 E+23,272,451,756
      |    10 |       |       |         |         | 6.862 E+25,858,279,729
      |    20 |       |       |         |         | 4.708 E+51,716,559,459
      |    30 |       |       |         |         | 3.230 E+77,574,839,189
      |    40 |       |       |         |         | 2.217 E+103,433,118,919
      |    50 |       |       |         |         | 1.521 E+129,291,398,649
      |    60 |       |       |         |         | 1.044 E+155,149,678,379
      |    70 |       |       |         |         | 7.161 E+181,007,958,108
      |    80 |       |       |         |         | 4.913 E+206,866,237,838
      |    90 |       |       |         |         | 3.371 E+232,724,517,568
      |   100 |       |       |         |         | 2.313 E+258,582,797,298
      |   200 |       |       |         |         | 5.351 E+517,165,594,596
      |   300 |       |       |         |         | 1.238 E+775,748,391,895
      |   400 |       |       |         |         | 2.863 E+1,034,331,189,193
      |   500 |       |       |         |         | 6.623 E+1,292,913,986,491
      |   600 |       |       |         |         | 1.532 E+1,551,496,783,790
      |   700 |       |       |         |         | 3.544 E+1,810,079,581,088
      |   800 |       |       |         |         | 8.197 E+2,068,662,378,386
      |   900 |       |       |         |         | 1.896 E+2,327,245,175,685
      | 1,000 |       |       |         |         | 4.386 E+2,585,827,972,983
    1 | 1,024 |       |       |         |         | 1.776 E+2,647,887,844,335
    2 | 2,048 |       |       |         |         | 3.155 E+5,295,775,688,670

* Values marked with an asterisk are only accurate to 15 digits; hence the trailing zeros.
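For anyone who wants to reproduce rows of this table, here is a minimal Python sketch (the function name tries_for_bits is illustrative; ordinary floating-point arithmetic limits the largest exponents to roughly 15 significant digits, in the same spirit as the table's own note):

    # Express 2^bits as "mantissa E+exponent" without computing the huge number.
    import math

    def tries_for_bits(bits):
        e = bits * math.log10(2)                 # decimal exponent of 2^bits
        mantissa = 10 ** (e - math.floor(e))     # leading digits
        return f"{mantissa:.3f} E+{math.floor(e):,}"

    print(tries_for_bits(56))        # -> 7.206 E+16
    print(tries_for_bits(8 * 1024))  # 1 KiB -> 1.091 E+2,466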
 
 
 
 Introducing the maximum bit compression into the 7-letter word "example" analysis

For those of you who are familiar with using an 8-bit byte to represent 256 (2^8 = 256) printable symbols, you may be thinking that we should eliminate (compress out) many or all of the 256 printable symbols that are not letters of the English alphabet.

This leads to a compressed # bits per letter = log2(26) ≈ 4.7, instead of the uncompressed 8 bits in a byte (remember, 2^8 = 256, whereas 2^4.7 ≈ 26).

If we define a compression factor for 26 symbols (CF = .587 554 964 767 637), it fits the following equations:

    CF = log2(26) / 8 = 4.700 439 718 / 8 = .587 554 964 767 637

    compressed # bits = CF x (uncompressed # bits)

If we introduce a 12-letter word called "conventional" and compress its bits, then we are right back to approximately the same number of bits (and hence zeros) as our uncompressed 7-letter word called "example":  12 letters x 4.7 bits per letter ≈ 56.4 bits, versus 7 letters x 8 bits per letter = 56 bits.  Therefore, using an uncompressed 7-letter word like "example" gives us approximately the same results as a compressed 12-letter word like "conventional".  Consequently, compression, if possible, may somewhat reduce the number of zeros (in the exponent describing the number of tries or times), but usually not enough to change any conclusions we might draw from an analysis of uncompressed "bits of information".
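A short Python check of this compression arithmetic (a sketch following the CF definition above):

    # 26 letters need log2(26) ~ 4.70 bits per letter instead of 8.
    import math

    bits_per_letter = math.log2(26)    # 4.700439718...
    cf = bits_per_letter / 8           # ~ .587 554 964 767 637, as above
    print(f"CF = {cf}")
    print(f'compressed "conventional":  {12 * bits_per_letter:.1f} bits')
    print(f'uncompressed "example":     {7 * 8} bits')
    # 56.4 bits vs 56 bits -- approximately the same complexity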
 