Science Forums

Does a genome have entropy


Larv

Recommended Posts

My definition of entropy has been posted in another thread: "Entropy is a measure of the number of internal states a system can have without looking any different to an outside observer."

 

My definition is a shameful plagiarization of Stephen Hawking's own definition of entropy, such as the entropy of a black hole: The entropy is a measure of the number of internal states (ways it could be configured on the inside) that a black hole could have without looking any different to an outside observer, who can only observe its mass, rotation, and charge. (2001, The Universe in a Nutshell, p. 63).

 

As such, why couldn't we say that a genome has entropy? A measure of the number of states in which a genome could be configured on the inside would be a measure of its active allelic combinations, because on the outside that genome will look like any undifferentiated member of its species. From the outside we can observe only its general features (I don't know if they would be equivalent to "mass, rotation, and charge," but I'm constantly tempted to speculate).
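For reference, this counting notion of entropy is usually written in the Boltzmann form (a standard formula, not specific to genomes or black holes), where [imath]\Omega[/imath] is the number of internal configurations consistent with what the outside observer sees:

[math]S = k_B \ln \Omega[/math]

On this reading, a genome's [imath]\Omega[/imath] would be the number of allelic configurations that leave the organism looking like a typical member of its species.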

 

If we looked at a genome as a dissipative structure we might be able to effectively model its information structure, too. And if this were defensible then we could discuss genomes in meaningful terms that express important features like complexity, capacity, ascendancy, redundancy, and average mutual information.

 

(So far as I know, this is the first suggestion of how Shannon's information theory could be usefully applied to genomics. I'm still playing around with the idea.)

 

Do you think a genome could have (or dissipate) entropy as I have defined it above?


'Course it has!

 

Quickie from Wiki:

The 3 billion base pairs of the haploid human genome correspond to an information content of about 750 megabytes. The entropy rate of the genome differs significantly between coding and non-coding sequences. It is close to the maximum of 2 bits per base pair for the coding sequences (about 45 million base pairs), and between 1.5 and 1.9 bits per base pair for each individual chromosome, except for the Y chromosome, which has an entropy rate below 0.9 bits per base pair.
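As an illustration of what an "entropy rate in bits per base pair" means, here is a minimal sketch (my own illustrative code, not from the quoted article) that estimates the zeroth-order Shannon entropy of a DNA string from its single-base frequencies; real genome estimates use higher-order models that account for correlations between neighbouring bases.

[code]
from collections import Counter
from math import log2

def shannon_entropy_per_base(seq):
    """Zeroth-order Shannon entropy in bits per base, estimated from
    single-base frequencies (maximum is 2 bits for a 4-letter alphabet)."""
    seq = seq.upper()
    counts = Counter(c for c in seq if c in "ACGT")
    n = sum(counts.values())
    return -sum((k / n) * log2(k / n) for k in counts.values())

# A repetitive stretch scores well below the 2-bit maximum:
print(shannon_entropy_per_base("ACGTACGTACGT" + "T" * 20))
[/code]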

...It is close to the maximum of 2 bits per base pair for the coding sequences (about 45 million base pairs), and between 1.5 and 1.9 bits per base pair for each individual chromosome, except for the Y chromosome, which has an entropy rate below 0.9 bits per base pair.

Adding to this, note that a bit of “information entropy” equals [imath]k_B \ln 2 \approx 9.57 \times 10^{-24} \,\mbox{J/K}[/imath]. In its extreme (i.e., when information is stored with ideal physical efficiency), information entropy and physical entropy are equivalent.
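As a back-of-envelope check (my own arithmetic, assuming the ~750 megabytes ≈ 6 × 10^9 bits quoted above), the physical entropy equivalent of one haploid genome's worth of information is tiny:

[code]
from math import log

k_B = 1.380649e-23               # Boltzmann constant, J/K
entropy_per_bit = k_B * log(2)   # ~9.57e-24 J/K per bit
bits_in_genome = 6e9             # ~750 megabytes, as quoted above

print(entropy_per_bit)                   # ~9.57e-24 J/K
print(bits_in_genome * entropy_per_bit)  # ~5.7e-14 J/K for the whole genome
[/code]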

 

What I find very counterintuitive is that the entropy of information, such as the genome that “builds” a complicated, structured organism, is greater than that of an equal-length hypothetical “meaningless, blank” strand of DNA consisting of the same base pair repeated throughout. Since the number of possible states (ignoring non-gene-affecting physical state differences) of a blank genome is exactly 1, its entropy is exactly [imath]\ln 1 = 0[/imath] (see the Gibbs entropy formula and apply it to this special case).
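For reference, the Gibbs entropy formula being applied to this special case is

[math]S = -k_B \sum_i p_i \ln p_i,[/math]

and with a single possible state ([imath]p_1 = 1[/imath]) the sum collapses to [imath]-k_B \ln 1 = 0[/imath].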

 

This leads to another very counterintuitive consequence: ideally, a genome, or any information storage system, produces work (energy) when information is stored in it, and consumes work when information is erased. This is almost exactly the opposite of what our intuition tells us - intuition drawn from experience in which we turn on such things as electronic computers to store information, and can erase it by turning them off - but it is clearly, demonstrably true.
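The ideal figure behind this is Landauer's bound: erasing one bit requires dissipating at least [imath]k_B T \ln 2[/imath] of work. At room temperature (taking [imath]T \approx 300\,\mbox{K}[/imath] as an example value) that is

[math]W_{\min} = k_B T \ln 2 \approx 2.9 \times 10^{-21}\,\mbox{J per bit erased}.[/math]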


Good responses here and I appreciate them. I would have to call that kind of entropy "entropy in the code," which is interesting. But in my OP I was looking for a different measure of genomic entropy, other than the digital bits and bytes. I'm saying that the entropy of a genome could be expressed by the number of allelic permutations possible within that genome. For example, a genome with only one possible allele for each of its genes (unlikely) would have an entropy of zero.
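A minimal sketch of the measure proposed here (my own illustrative code, with made-up allele counts): take the number of possible alleles at each locus, count the configurations, and log-count them. A genome with only one possible allele at every locus has exactly one configuration and therefore zero entropy.

[code]
from math import log2

def allelic_entropy_bits(alleles_per_locus):
    """Entropy (in bits) of a genome whose i-th locus can carry any one of
    alleles_per_locus[i] alleles, all configurations assumed equally likely."""
    # log of a product = sum of logs, so we never form the huge product itself
    return sum(log2(n) for n in alleles_per_locus)

print(allelic_entropy_bits([1, 1, 1]))     # 0.0 -> single configuration, zero entropy
print(allelic_entropy_bits([2, 3, 4, 2]))  # log2(48), about 5.58 bits
[/code]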


Good responses here and I appreciate them. I would have to call that kind of entropy "entropy in the code," which is interesting. But in my OP I was looking for a different measure of genomic entropy, other than the digital bits and bytes. I'm saying that the entropy of a genome could be expressed by the number of allelic permutations possible within that genome. For example, a genome with only one possible allele for each of its genes (unlikely) would have an entropy of zero.

 

Saying ("entropy entropy in the code") is the same as referring to a subsystem. In other words, to make a long story short, the net result is that a system described by its time-oriented changes of state as a result of both spontaneous dynamics of the system, and as a result of interactions between its subsystems leads to that which is observed: Life and its exceedingly divers evolutionary trends.

 

Put another way, life (and its observed intrinsic diversity) is the result of dynamic interactions and spontaneous changes among subsystems that ultimately induce observable macro-state differences in the system, in accord with the second law of thermodynamics, i.e., the non-decrease of entropy.

 

Yes, then, a genome has entropy. That entropy would not stop there, though. Given two quantum states inside a particular cell, ρ and σ, their von Neumann entropies are S(ρ) and S(σ). That entropy measures how uncertain we are about the particular value of the state (how mixed the actual state is). The joint quantum entropy S(ρ, σ) measures our uncertainty about the combined system that contains both states. Analogously to the classical conditional entropy, one can define a conditional quantum entropy.
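For completeness, the standard definitions referred to here (general quantum-information formulas, not specific to cells) are

[math]S(\rho) = -\mathrm{Tr}(\rho \ln \rho), \qquad S(A|B) = S(\rho_{AB}) - S(\rho_B),[/math]

where the joint entropy [imath]S(\rho_{AB})[/imath] plays the role of the combined entropy above; it is the conditional form [imath]S(A|B)[/imath] that can become negative, as noted next.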

 

Accordingly, and perhaps non-intuitively, the entropy can be negative: see conditional quantum entropy.

 

However, this seems to make no physical sense: on the one hand entropy is a nondecreasing property, and on the other it can be negative (at least within subsystems). If we turn to the quantum relative entropy - a measure of our ability to distinguish two or more quantum states - we see that orthogonal quantum states can always be distinguished from one another through projective measurement. In this context, non-finite quantum relative entropy would be the rule, not the exception, for cells (within subsystems, and reflected in the overall system).
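The distinguishability measure mentioned here is the quantum relative entropy,

[math]S(\rho \| \sigma) = \mathrm{Tr}\left[\rho\,(\ln \rho - \ln \sigma)\right],[/math]

which is non-finite whenever the support of [imath]\rho[/imath] is not contained in that of [imath]\sigma[/imath] - in particular for orthogonal states, which is why perfectly distinguishable states give an infinite value.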

 

The entropy of a system depends on the number of particular quantum states (microstates) which are consistent with the macroscopic state of the system.

 

Food for thought. :eek_big:

 

 

CC


The entropy of a system depends on the number of particular quantum states (microstates) which are consistent with the macroscopic state of the system.

How about this? The entropy of a genomic system depends on the number of particular quantum states (microstates or allelic combinations) which are consistent with the macroscopic state of the system (otherwise observed as the genomic species).


How about this? The entropy of a genomic system....

Everything "has" entropy... at whatever level you choose to set the limits of observation.

 

I'm not sure the entropy of coding or "information" competes on the same material level as the entropy of molecular structures (electronic, steric considerations), but if the information is related to those structures, I suppose it would be a small competing factor.

 

I don't think the small decreases in the entropy of little areas of the physical genome amount to much when the large increases in entropy - generated by the subsystems, informed by the genome, as Coldcreation pointed out so well - are considered.

 

The highly-ordered genome (a microsystem) orchestrates a highly entropic, living, macro-system. In the end, living systems generate a lot more entropy than they "consume."

 

Life is just sorta like an enzyme... for accelerating entropic decay.


In the end, living systems generate a lot more entropy than they "consume."

I don't know what you mean by "consume." But here's a revision of your sentence that I can agree with: In the end, living systems generate a lot more entropy than they would if they were inert systems participating only in equilibrium thermodynamics.

 

Life is just sorta like an enzyme... for accelerating entropic decay.

Here's a shorter way of saying it: Life is a dissipative structure.


I don't know what you mean by "consume." But here's a revision of your sentence that I can agree with: In the end, living systems generate a lot more entropy than they would if they were inert systems participating only in equilibrium thermodynamics.

 

Here's a shorter way of saying it: Life is a dissipative structure.

Yep, it's the equilibrium part that is key....

===

 

[I just meant lowering entropy, by saying "consume" entropy.]

 

You mentioned Clausius' definition.

"Entropy cannot be destroyed. It is a property of all matter. Depending on the type of system, entropy either remains the same or increases with time."

I think that Clausius' definition applies only to theoretical cases - isolated systems at equilibrium - or to the universe as a whole.

But in far-from-equilibrium systems like life, local exceptions to the "never decreasing" part are common.

 

I think life is more than just a dissipative structure. It's a dissipative architecture of dissipative structures of dissipative microsystems of.... ...and evolving into dissipative collectives of dissipative architectures of....

[I'm putting cells at the level of dissipative architectures, more or less; but yes, life is just a complex dissipative system.]

 

But the higher levels of organization and complexity in life's dissipative structures enable a larger, more efficient production of entropy from life's lower organizational levels of dissipative structures - more entropy than the sum of the parts would generate individually - life magnifying entropy's increase. Or is it the other way around - life slows and channels the increase in entropy? Hmmmm, it's too late to think about this anymore.

 

And then there's social dissipative systems of unitary (creature level) dissipative structures of....


 

The highly-ordered genome (a microsystem) orchestrates a highly entropic, living, macro-system. In the end, living systems generate a lot more entropy than they "consume."

 

I don't know what you mean by "consume." But here's a revision of your sentence that I can agree with: In the end, living systems generate a lot more entropy than they would if they were inert systems participating only in equilibrium thermodynamics.

 

Here's a shorter way of saying it: Life is a dissipative structure.

 

Any tenable depiction of spontaneous changes in the entropy of a system must be related to the macroscopic dispersal of energy (or to an increase in the number of accessible microstates, on a molecular scale).

 

So I think what Essay implied (giving him the benefit of the doubt) was that living systems generate a lot more entropy than they "consume" energy. That could be why consume is in quotes. Energy is not actually 'consumed' though. It is more of a dispersal, or dissipative process.

 

On the other hand, Essay might be referring to the 'consuming' of entropy, as opposed to the 'accumulation' of entropy (the increase in entropy), i.e., that entropy is a nondecreasing property. So that would be equivalent to: 'living systems increase entropy more than they reduce it.'

 

Both concepts work in that sentence thanks to the quotes; it's just more complete with the property of energy included.

 

 

CC


On the other hand, Essay might be referring to the 'consuming' of entropy, as opposed to the 'accumulation' of entropy (the increase in entropy), i.e., that entropy is a nondecreasing property. So that would be equivalent to: 'living systems increase entropy more than they reduce it.'

 

Both concepts work in that sentence thanks to the quotes; it's just more complete with the property of energy included.

CC

Thanks, it's the second reading that I was driving at. I didn't want to get into the total energy equaling the enthalpy plus the entropy (or is it minus the entropy?).
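For the record, the relation being alluded to is presumably the Gibbs free energy, and it is minus the (temperature-weighted) entropy:

[math]G = H - TS.[/math]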

 

~ :thumbs_up


So I think what Essay implied (giving him the benefit of the doubt) was that living systems generate a lot more entropy than they "consume" energy.

Huh? How do they do that?

 

That could be why consume is in quotes. Energy is not actually 'consumed' though. It is more of a dispersal, or dissipative process.

No, it is energy that is consumed, not entropy. And what is dissipated is not energy but entropy.


If we took DNA and dissolved it in water, the DNA double helix would have many degrees of molecular freedom, or entropy: it could bend, loop, wiggle, etc. If we added first-order packing proteins, these would limit the degrees of molecular freedom of the DNA, as the DNA winds roughly 1.7 times around each histone core. There would still be molecular entropy within the DNA, but less than when there was no packing. As we increase the order of packing, all the way to condensed chromosomes, the degrees of freedom, or the molecular entropy, of the DNA decrease with each level of packing.

 

Since DNA needs to unpack before it can become active, this implies that active DNA has higher entropy than packed DNA. The more unpacked the DNA is, on average, the higher its average entropy will be. This implies that duplication of the DNA requires the highest level of entropy within the DNA.


Huh? How do they do that?

 

No, it is energy that is consumed, not entropy. And what is dissipated is not energy but entropy.

 

Forget the word "consume."

 

Any tenable depiction of spontaneous changes in the entropy of a system must be related to the macroscopic dispersal of energy, or to an increase in the number of accessible microstates, on a molecular scale.

 

CC


Any tenable depiction of spontaneous changes in the entropy of a system must be related to the macroscopic dispersal of energy...
Pardon me for being dense, but I just don't understand this.

 

...or to an increase in the number of accessible microstates, on a molecular scale.

Why does it have to be on a molecular scale?


Pardon me for being dense, but I just don't understand this.

 

Why does it have to be on a molecular scale?

 

The following link posted by JMJones0424 in the "What is Entropy" thread explains the situation quite well:

 

Disorder - A Cracked Crutch for Supporting Entropy Discussions

 

You'll find pretty much that exact sentence in there. Look at the last paragraph of the section entitled "The Molecular Basis of Thermodynamics." The entire text is very interesting.

 

CC


Here is another genome entropy consideration. It has to do with the entropy within genetic information transfer. The easiest way to understand this is to first consider an analogy: "I heard it through the grapevine." In a gossip grapevine, the information in the original event gains entropy at each junction within the vine, as each person in the vine leaves out, adds, or embellishes the information they receive. The increasing entropy is connected to the spontaneous changes within the transmission. It is not a lossless transmission; it accumulates entropy, or disorder, with respect to the original real event that started the transmission.

 

Relative to the DNA and genome, the genetic information grapevine starts with the information in a gene. This is transferred to RNA, then to protein, then transported to the final destination, where the information is implemented in the field, often in very specific ways. The cell is actually pretty good at genetic information transfer, maintaining relatively low entropy.

 

Consider a hypothetical cell that acts more like a gossip grapevine, i.e., one that increases entropy within its genetic information transfer steps. The gene (Joe) tells the information to the RNA (Sue). She ignores some of the data and adds some of her own to make the story more interesting. She then tells the protein (Alan), and he too adds creative input, changing the information further. Instead of Carl winning $10 on a scratch ticket, he has now quit his job and bought a new house with his massive winnings. Although this would be a better result than the reality of $10, since it is not real the cell would be like Don Quixote, losing touch with cause and effect as it reacts to stimuli that are not even there.
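As a toy illustration of the grapevine idea (entirely my own sketch, with made-up error rates and step names), here is a simulation of a message copied through several transfer steps, each of which randomly corrupts a small fraction of symbols; the fraction of symbols that differ from the original tends to grow with every step, which is the "entropy in the transmission" described above.

[code]
import random

def noisy_copy(seq, error_rate, alphabet="ACGU"):
    """Copy seq, replacing each symbol with a random one at the given rate."""
    return "".join(
        random.choice(alphabet) if random.random() < error_rate else base
        for base in seq
    )

random.seed(0)
message = "ACGU" * 25                    # a 100-symbol toy "gene"
steps = ["gene->RNA", "RNA->protein", "protein->field"]
current = message
for step in steps:
    current = noisy_copy(current, error_rate=0.05)
    diffs = sum(a != b for a, b in zip(message, current))
    print(f"{step}: {diffs} of {len(message)} symbols differ from the original")
[/code]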

