wow…but I think I prefer the legends LOL.
Yeah, facts are uncomfortable, right?
It isn’t impossible. In fact it is a very likely outcome if you have 10^52 (or more) trials. This is something that is often overlooked by the “fine tuners”. A 1 in 10^50 event would be very unlikely if you had only a single trial, and that is why they only consider a single trial, instead of a trial every fraction of a second, at every location, all throughout time (where appropriate). Without taking this into consideration, a 1 in 10^50 outcome might never happen, or it might happen 10^100 times a day. You don’t know this value; no one does. Instead you just used the number 1 because it gave you the answer you wanted. The probability you listed is a fiction.
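Just to put a rough number on that, here is a minimal Python sketch of the point, using the standard rare-event approximation 1 - (1 - p)^n ≈ 1 - e^(-np); the trial counts are made-up illustrations of mine, not estimates of anything physical:

```python
import math

def prob_at_least_one(p, n):
    """Probability of at least one success in n independent trials of
    probability p each, via the rare-event approximation 1 - exp(-n*p)."""
    return -math.expm1(-n * p)

p = 1e-50                        # a "1 in 10^50" event
for n in (1, 1e45, 1e50, 1e52):  # made-up trial counts, for illustration only
    print(f"trials = {n:g}:  P(at least once) ≈ {prob_at_least_one(p, n):.3g}")
```

With p = 10^-50, one trial gives essentially zero, 10^50 trials give about a 63% chance, and 10^52 trials make the event a near certainty.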
Don’t be that up yourself…my remark was very much tongue in cheek.
Ditto.
20 characters.
Just a rehash of the argument from complexity or whatever it’s called.
Regardless, it doesn’t prove the existence of any deity, all it says is we’ve got some investigating to do (that being what open minds do)
UK Atheist
What I find interesting about the idea of complexity… “Complexity is an emergent property of systems following simple rules.” How complex is the bonding of two hydrogen atoms to one of oxygen? It seems to me that the entire universe is based on simple chemistry that we have, as yet, not quite understood and may not understand. After all, our physics is not the physics that occurs prior to the Planck time. Is it complexity or simply a lack of comprehension? Why do we get to call that which we do not yet understand ‘complex’?
I always wonder about the argument from complexity. The universe didn’t start out complex; it may have started out with only a scalar energy field that drove inflation, and with unified forces.
We understand why it is complex and how it got that way. Nothing to do with God or gods. Everything to do with thermodynamics. Am I wrong about this?
Crudely, it seems increases in entropy are associated with increases in complexity. Pretty much everyone agrees that low entropy means low complexity. Pretty much everyone agrees that when you then increase entropy, complexity will increase. When we start to consider the behavior near the end of the spectrum, when entropy is approaching its maximum (i.e. you are in danger of creating a black hole), the models become harder and harder to verify (test) and sometimes seem to create contradictions. So when the entropy is huge: who knows.
The conflation of entropy with complexity is sometimes naive and non-rigorous. The scientific literature documents, for example, that assembly of phospholipids into ordered structures is associated counter-intuitively with an increase in entropy, the reverse of what the naive would expect.
Since disordered microstates require more information to describe than ordered states, the Kolmogorov complexity of the former is greater than that of the latter, and if entropy were synonymous with Kolmogorov complexity, it would always increase whenever a system moved to a more disordered state.
Phospholipid self-assembly into micelles, bilayer sheets and liposomes constitutes a documented counterexample.
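Kolmogorov complexity isn’t computable exactly, but compressed length is a common rough stand-in for it. Here is a minimal sketch of the ordered-versus-disordered point above, using zlib compression as that proxy; the two-symbol “microstates” are purely illustrative:

```python
import random
import zlib

def compressed_size(s: str) -> int:
    """Length in bytes of the zlib-compressed string, a crude but
    computable upper-bound proxy for Kolmogorov complexity."""
    return len(zlib.compress(s.encode()))

n = 10_000
ordered = "AB" * (n // 2)                                    # a highly ordered string
disordered = "".join(random.choice("AB") for _ in range(n))  # a disordered one

print("ordered:   ", compressed_size(ordered), "bytes")
print("disordered:", compressed_size(disordered), "bytes")
```

The perfectly ordered string compresses to a few dozen bytes while the random one needs over a kilobyte, which is the sense in which the disordered microstate takes more information to describe.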
I never thought about information theory through entropy. My naive thoughts at this point are strongly against that proposition, however time will tell.
Information theory and entropy?
I’ve had this idea that game theory and information theory can both be applied to evolution in a manner that describes increasing “complexity” (which I view as how many “bits” of genetic information per unit organism) in a mathematically rigorous way . . . but I have a very long way to go before I feel like I can run my mouth on the subject (I’m reviewing basic probability theory now, and then I need to get caught up on specific aspects of game theory that I’m not clear about).
For all I know, these ideas may have already been explored . . . and I could look like a crackpot.
The good news is, no, you’re not a crackpot for indulging in such thoughts. You’ll find a brace of scientific papers by Thomas D. Schneider on this very topic.
Just thinking out loud:
The Shannon information contained in a system is the log(base 2) of the number of possible states of that system (in bits). I suppose you could try using something (silly?) like the following:
4 states per base pair (A, C, T, G) and 3 billion base pairs:
log_2[4^(3*10^9)] = 3*(10^9)*log_2[4] = 6 billion bits.
Now that I think about it, that makes perfect sense: 2 bits per base pair. Heh, I guess I did it the hard way. It is 2 bits per pair because there are 4 possible states and you can make 4 numbers out of 2 binary digits (2 bits): 00, 01, 10, 11. How useful this is…I don’t know.
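For what it’s worth, here is the same back-of-the-envelope calculation as a few lines of Python. It assumes exactly what the arithmetic above assumes (4 equally likely, independent bases and roughly 3 billion of them), which is of course a big simplification of a real genome:

```python
import math

bases = 3_000_000_000      # roughly the base-pair count used above
states_per_base = 4        # A, C, G, T

bits_per_base = math.log2(states_per_base)   # = 2 bits
total_bits = bases * bits_per_base

print(f"{bits_per_base:.0f} bits per base pair")
print(f"{total_bits:.1e} bits total")        # ~6e9 bits
print(f"~{total_bits / 8 / 1e6:.0f} MB under the uniform, independent-base assumption")
```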
Thank you very much.
Thank you. I still have a long way to go on the subject.
Well, the problem is that the current understanding of how it naturally originated is probably flawed because it leads to an impossibility. It doesn’t mean there is no natural explanation, but undoubtedly our understanding of this subject is still deficient.
Hallelujah…though you might also note that we know natural phenomena exist as an objective fact, and that the “20,000 proteins and enzymes inside us” in the OP exist as an objective fact; adding inexplicable magic from an unevidenced deity quite obviously violates Occam’s razor.
The existence of a universal deity is not a question of biology but rather a matter of consciousness. When attempting to explain consciousness (no need to reiterate what I’m referring to), the ideas often lead to the concept of a universal sentience. In fact, we ourselves are evidence that such a possibility is feasible.
You might as well be claiming that mermaids are slippery.

When attempting to explain consciousness (no need to reiterate what I’m referring to), the ideas often lead to the concept of a universal sentience.
No they don’t; that’s your subjective opinion. The overwhelming objective evidence demonstrates that consciousness is most likely an emergent property of a physical brain. However, even were this not the case, it would not objectively evidence any deity or anything supernatural.

In fact, we ourselves are evidence that such a possibility is feasible.
I don’t believe you.

The largest of these is titin, in our muscles, composed of 33,450 amino acid residues, all in L form, all with peptide bonds, hence their name, “polypeptides.”
Why start with the largest? Why not the smallest? In evolution, things usually start small and grow over time through numerous trials and extinctions.

which works out to about 1 chance in 10 to the 65,000th power.
That’s if you start the sequence from scratch, a gimmick that creationists like to use. An example is a monkey on a typewriter reproducing Hamlet. A more correct way to see this is a million monkeys on typewriters reproducing the first sentence of the Bible across 1,000 years: lots and lots of failures, but one that is correct. Then repeat for the second sentence, only this time we use 2 million monkeys.
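That difference between typing the whole thing in one shot and keeping partial successes is easy to see in code. Below is a deliberately simplified “locking” variant of the monkeys-at-typewriters demonstration; the target phrase, the 27-letter alphabet and the mutation scheme are illustrative choices of mine, not anything from the thread. Correct letters are kept and only wrong ones are re-rolled:

```python
import random
import string

TARGET = "IN THE BEGINNING"                 # illustrative target phrase
ALPHABET = string.ascii_uppercase + " "     # 26 letters plus space

def single_shot_odds():
    """Number of equally likely strings of the target's length,
    i.e. the odds against typing it correctly in one go."""
    return len(ALPHABET) ** len(TARGET)

def cumulative_selection(seed=0):
    """Keep any position that already matches the target; re-roll only
    the wrong ones each generation (selection preserves small gains)."""
    rng = random.Random(seed)
    guess = [rng.choice(ALPHABET) for _ in TARGET]
    generations = 0
    while "".join(guess) != TARGET:
        generations += 1
        for i, ch in enumerate(TARGET):
            if guess[i] != ch:
                guess[i] = rng.choice(ALPHABET)
    return generations

print(f"one shot: 1 chance in {single_shot_odds():.3e}")
print(f"cumulative selection hit the target in {cumulative_selection()} generations")
```

The one-shot odds blow up exponentially with the length of the target, while the cumulative version typically finishes in under a couple of hundred generations.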

You don’t get an infinite number of tries.
In 10 billion years among 1 trillion trillion planets you do.

You must explain how each small segment somehow benefits the host and how he can then do without it.
Viruses explain it. Prior to the first living cell, protobionts with simple nonfunctioning genomes merged with each other through trillions and trillions of nonsense combinations until one eventually produced the first replicating cell. We know that viruses played a large role in evolution because of endogenous retroviral sequences: small segments in our DNA that are leftover pieces of viruses that accidentally fused some of their genes into the host and became hereditary fossils. Approximately 8% of human DNA is made from viruses.