
Entropy, information, and complexity in biology


Larv


I am proposing a thread to discuss the various meanings of complexity, information, and entropy as they pertain to biology. I am bringing these three terms under the same title because they are often compared, conflated, and confused with one another. Perhaps it would serve a useful clarifying purpose to debate their separate and connected meanings.

 

As a point of departure for this discussion I will use a description of the most compact of all structures in the universe: the black hole. Stephen Hawking, in The Universe in a Nutshell (2001, p. 63), provides a relevant definition of the entropy of a black hole:

 

Not only does a black hole have a temperature, it also behaves as if it has a quantity called entropy. The entropy is a measure of the number of internal states (ways it could be configured on the inside) that the black hole could have without looking any different to an outside observer, who can only observe its mass, rotation, and charge.
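To pin down what "number of internal states" means (a gloss I'm adding, not part of Hawking's text): it is the W in Boltzmann's statistical entropy, and the black-hole case is the Bekenstein-Hawking formula, with A the horizon area:

$$S = k_B \ln W, \qquad S_{\mathrm{BH}} = \frac{k_B\, c^3 A}{4 G \hbar}$$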

 

Now, entropy has been famously likened to information, as well as to complexity; Claude Shannon founded information theory on just such an entropy measure. In addition to these formal definitions of entropy and information, there are equally formal definitions of complexity (for starters, see this from the Santa Fe Institute).
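To make Shannon's measure concrete, here is a minimal Python sketch (my own illustration, not anything taken from Shannon or the Santa Fe material) that computes the entropy of a symbol sequence such as a DNA string:

```python
from collections import Counter
from math import log2

def shannon_entropy(sequence):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    counts = Counter(sequence)
    total = len(sequence)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Four equiprobable bases give log2(4) = 2 bits per base, the maximum for DNA.
print(shannon_entropy("ACGTACGTACGT"))  # 2.0
print(shannon_entropy("AAAAAAAAAAAA"))  # -0.0 -- no uncertainty, no information
```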

 

So, I’ll venture a hypothesis (full of straw): that this combined definition of entropy, information, and complexity applies to biology, specifically to genomics, the study of genomes. And, paraphrasing Hawking, I’ll put forward this concept for measuring entropy, information, and complexity in a genome:

 

Not only does a [genome] have a temperature [?], it also behaves as if it has a quantity called [complexity]. The [complexity] is a measure of the number of internal states (ways it could be configured on the inside) that the [genome] could have without looking any different to an outside observer, who can only observe its [phenotypes, genotypes, and sex ratios]. (I'm dangerously speculating on the last insertion.)

 

As such, the numbers and combinations of alleles would make all the difference in a genome’s complexity. And by equating entropy, information, and complexity in this way, I have ruled out the popular myth that humans are the most complex organisms on this planet. There are many genomes more complex than the one we own. (A sketch of how such a score might be computed follows below.)
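To make the genome version concrete, here is a sketch under that assumption, with invented loci and allele frequencies (nothing here comes from real genomic data):

```python
from math import log2

def locus_entropy(allele_freqs):
    """Entropy in bits of one locus's allele-frequency distribution."""
    return -sum(p * log2(p) for p in allele_freqs if p > 0)

# Hypothetical three-locus "genome"; each locus's frequencies sum to 1.
genome = [
    [0.5, 0.5],                 # two alleles, evenly split -> 1.00 bit
    [0.9, 0.1],                 # skewed pair               -> ~0.47 bits
    [0.25, 0.25, 0.25, 0.25],   # four even alleles         -> 2.00 bits
]

# One crude "complexity" score: total bits across loci (~3.47 here).
print(sum(locus_entropy(locus) for locus in genome))
```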

 

Does this help any? Or does it further confuse the application of entropy, information, and complexity to the biological sciences? What are your favorite definitions of these terms?


Although entropy is important to life, lowering entropy is the rule relative to the maintenance and evolution of life. Forming DNA or proteins from smaller molecules lowers entropy relative to those molecules. Life needed to evolve this entropy-lowering mechanism. Attaching a molecule to a transport protein lowers the entropy of the small molecule and even gives ordered direction to where it will go. Packing proteins around the DNA lower the entropy of the DNA. Even proofreading enzymes get rid of the disorder created when there is too much entropy in the DNA.
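To put a standard equation behind this (my gloss, not in the original post): an entropy-lowering step is spontaneous only if the Gibbs free energy still decreases, which is why the cell couples polymerization to ATP hydrolysis:

$$\Delta G = \Delta H - T\,\Delta S \;<\; 0 \quad \text{(required for a spontaneous process)}$$

With $\Delta S < 0$, as when nucleotides are joined into DNA, the coupled reactions must contribute enough negative $\Delta H$, or entropy produced elsewhere, to keep the total $\Delta G$ negative.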

 

If you look at the energy curve of a substrate on an enzyme, the activation energy is lowered so the substrate can react more easily. Increasing entropy is endothermic. As such, by lowering the activation energy, the enzyme reduces the energy available for the substrate to increase its own entropy. This is a good way to prevent side reactions that would occur if we instead added more energy and drove the substrate to higher entropy.
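To see how much a lowered activation energy matters, here is a sketch using the standard Arrhenius rate law with made-up energies (the 75 and 50 kJ/mol figures are illustrative, not measured values):

```python
from math import exp

R = 8.314  # gas constant, J/(mol*K)

def arrhenius_rate(ea_j_per_mol, temp_k, prefactor=1.0):
    """Arrhenius rate law: k = A * exp(-Ea / (R * T))."""
    return prefactor * exp(-ea_j_per_mol / (R * temp_k))

T = 310.0  # roughly body temperature, K
uncatalyzed = arrhenius_rate(75_000, T)  # hypothetical Ea of 75 kJ/mol
catalyzed = arrhenius_rate(50_000, T)    # enzyme lowers it to 50 kJ/mol

# Dropping Ea by 25 kJ/mol speeds the reaction ~16,000-fold at 310 K.
print(catalyzed / uncatalyzed)
```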

 

Evolving from RNA to DNA lowered entropy. This can be inferred from the relative stability of the two under a wide range of conditions: the order in the DNA double helix remains very stable relative to RNA. Evolution could simply be life figuring out ways to lower entropy.

 

Let me give an interesting example. The human mind allows complex language partly because of our memory. Since memory depends on order in the synapses, simply increasing the brain's entropy-lowering ability also allows more stable long-term memory.


Life needed to evolve this entropy-lowering mechanism.

HB, I think the opposite may be true: life evolved through "mechanisms" that increased entropy production. Or maybe it happened both ways...it's all theoretical anyway.

 

While I think you may be correct about entropy at the molecular level, there is another level to consider when talking about biological systems: the level of dissipative systems, wherein entropy is dissipated at elevated rates under “far-from-equilibrium” conditions (thermodynamic equilibrium, that is). This means that living systems support themselves under far-from-equilibrium conditions by consuming disproportionately higher amounts of energy and producing equivalently higher amounts of entropy. As such, large increases of entropy necessarily accompany biological systems.
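The standard bookkeeping here is Prigogine's entropy balance for open systems (my gloss on the usual formalism, not notation from either post):

$$\frac{dS_{\mathrm{sys}}}{dt} = \frac{d_i S}{dt} + \frac{d_e S}{dt}, \qquad \frac{d_i S}{dt} \ge 0$$

Internal production $d_iS/dt$ is never negative, so an organism holds its own entropy down only by exporting entropy ($d_eS/dt < 0$) faster than it generates it, which reconciles the two views above: entropy lowering locally, net entropy production overall.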

 

Added by edit:

 

Watch this cage-of-doom motorcycle stunt video, which is not a bad representation of a dissipative system (the motorcycles) operating under far-from-equilibrium conditions (with respect to at-rest gravitational conditions). There’s a whole lot of exhaust entropy being dissipated there.


If you look at water, which is the primary component of life, it has an extremely high melting and boiling point and a very high thermal capacitance relative to any molecule close to its molecular weight. This is due to hydrogen bonding.

 

What the high MP and BP and the high thermal capacitance of water mean is that it takes a lot of energy to overcome the attractive forces within water. This inhibits high levels of entropy, in favor of much more order. This is important to life. Since water is 80-90 percent of life, it sets the background goal of life, which is to lower entropy, thereby allowing higher and higher levels of order.
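A quick back-of-the-envelope comparison using q = m·c·ΔT shows how unusual water's thermal capacitance is (the specific heats are approximate handbook values, and the choice of comparison liquids is mine):

```python
# Heat needed to warm 100 g of liquid by 10 K, via q = m * c * dT.
# Specific heats are approximate handbook values in J/(g*K).
specific_heat = {
    "water":   4.18,
    "ethanol": 2.44,
    "hexane":  2.26,
}

m, dT = 100.0, 10.0  # grams, kelvin
for liquid, c in specific_heat.items():
    print(f"{liquid}: {m * c * dT:.0f} J")
# water: 4180 J -- nearly twice the others, courtesy of hydrogen bonding
```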


If you look at water, which is the primary component of life, it has an extremely high melting and boiling point and a very high thermal capacitance relative to any molecule close to its molecular weight. This is due to hydrogen bonding.

Well, hydrogen bonds are important to life, of course, but they're also important to dirt. So are covalent bonds and ionic bonds, but I fail to see your point here. Molecular bonding is not unique to life, nor does it explain the copious dissipation of entropy required to sustain life under far-from-equilibrium conditions.

 

What the high MP and BP and the high thermal capacitance of water mean is that it takes a lot of energy to overcome the attractive forces within water. This inhibits high levels of entropy, in favor of much more order. This is important to life. Since water is 80-90 percent of life, it sets the background goal of life, which is to lower entropy, thereby allowing higher and higher levels of order.

Apparently you and I have taken different biology classes, because I've never heard of a "background goal of life," unless it might be survival. The truth is that life does not have a "background goal." Life does not have a purpose or a mission; it is entirely natural and without any teleological features whatsoever. But life is in the business of energy consumption and entropy dissipation, both occurring at very high rates relative to those of non-living structures.

