
Originally Posted by CGM

Originally Posted by Von Milash
CGM, no disrespect, but you're talking so far out of your ass it's pathetic. In fact, you have very eloquently said absolutely nothing.
You can't randomly or spontaneously form DNA, or any other life-bearing molecule.
OK man, you have it your way. But if you think entropy rules out evolution, you are pathetically misinformed, no disrespect.

What, this one?
Entropy (information theory)
From Wikipedia, the free encyclopedia
In information theory, entropy (sometimes known as self-information) is a measure of the uncertainty associated with a random variable. The term by itself in this context usually refers to the Shannon entropy, which quantifies, in the sense of an expected value, the information contained in a message, usually in units such as bits. Equivalently, the Shannon entropy is a measure of the average information content one is missing when one does not know the value of the random variable. The concept was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".
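For concreteness, the standard definition from Shannon's paper: for a discrete random variable X that takes value x with probability p(x), the entropy is

H(X) = -\sum_{x} p(x) \log_2 p(x)

measured in bits when the logarithm is base 2. For example, a fair coin flip has p(heads) = p(tails) = 1/2, so H = -(1/2)\log_2(1/2) - (1/2)\log_2(1/2) = 1 bit.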
Why can't things coexist? Why can't theses coexist? Which leads to the real question: why can't we as people fully coexist yet?
Why couldn't the universe have been created by divine love frequencies, or a God if you need a name, and then left to evolve, as we surely have?