Entropy: From 19th Century Steam Engines to 21st Century Cell Biology
Earlier posts on Cancer Ecology Commentary explored
the role of epigenetic regulation as a window for understanding lineage
plasticity, the ability of cells within an organ to reverse their fate, either
as a compensatory response to injury that facilitates repair or as a
potentially deleterious shift toward non-homeostatic phenotypes and
behaviors that oncologists refer to as the cancer hallmarks. We framed
this disrupted regulation as a product of epigenetic
entropy, a form of information entropy first described by Claude Shannon:
the inevitable loss of accurate information, obscured by noise, during
information transfer. From the perspective of lineage plasticity, epigenetic
entropy lowers the barrier to misdirected and potentially pathologic lineage
transitions.
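Shannon's measure can be stated concretely: a source emitting symbols with probabilities p_i has entropy H = -Σ p_i log2(p_i) bits, and noise that flattens a once-sharp distribution drives H upward. A minimal sketch in Python (the function name is my own, for illustration):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A perfectly predictable source carries no surprise, hence no entropy.
print(shannon_entropy([1.0]))        # 0.0
# A fair coin carries one full bit of uncertainty.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# Noise that spreads a once-sharp distribution raises the entropy.
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))  # ~1.36 bits
```

Epigenetic entropy, in this framing, is the same quantity applied to the distribution of epigenomic states: as noise spreads probability across more states, H rises and fidelity falls.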
Given the importance of entropy in understanding epigenetics’
role in carcinogenesis, it may be useful to spend some time studying this property
from the point of view of a physicist. One such exploration comes from Brian
Greene, professor of physics and mathematics at Columbia University and author
of Until the End of Time (Alfred A. Knopf, New York, 2020). The book
describes how our cosmos evolved, the circumstances promoting the
origin of life, and how our universe will someday come to an end,
all as a consequence of the laws of physics and mathematics. Central to this
description is the second law of thermodynamics, which Greene informally summarizes
as the “unavoidable production of waste”.
To help us better understand the nature of entropy, Greene
describes the historical context leading to the Second Law of Thermodynamics beginning
in the first portion of the 19th century, the dawn of the industrial
revolution. At the time there was a quest to better understand the flow of
energy powering what was then the revolutionary development of the steam
engine. At its simplest, the engine consists of a cylinder housing water vapor and
fitted with a piston. When heat is applied, the cylinder absorbs thermal energy
causing the water vapor to expand, pushing the piston outward. In the recovery
phase the cylinder releases heated vapor allowing the piston to return to its original
position. Although the nature of thermal energy flow was misunderstood at the
time, we now understand this transfer in terms of kinetic energy: microscopic
gas particles gain kinetic energy from the energy released by combustion, and
when those energetic particles come into contact with the cylinder, the
particles of the cylinder and of the contained water vapor increase their
kinetic energy as well. Gas and water vapor particles with increasing kinetic
energy, however, are also marked by increasing entropy.
The reason is that microscopic particles with
increasing kinetic energy can adopt an increasing number of microscopic
states that are indistinguishable from one another; the number of such
indistinguishable states within a macroscopic system reflects how
interchangeable its microscopic configurations are. Until the latter portion
of the 19th century, physicists interested in describing real-life phenomena
were limited to Newtonian mechanics, tracing the direction and momentum of
individual microscopic particles using Newton’s second law of motion: the
force acting on an object is the product of that object’s mass and the
acceleration it experiences. To predict the behavior of a system it was
necessary to account for the motion of every component contained within it.
This works well for predicting the course of billiard balls
colliding on a billiard table but would paralyze the physicist trying to account
for the millions of microscopic kinetic interactions comprising a macroscopic
state. A breakthrough occurred when the Austrian physicist Ludwig Boltzmann
realized that by looking at the average behavior of the microscopic particles
(for the steam engine, the temperature, pressure, and volume of the cylinder),
one could obtain, to a good approximation, the behavior of the macroscopic
system, in our case the energetics of the steam engine.
For Boltzmann it was thus the number of interchangeable
microscopic states leaving the macroscopic state unchanged that defines
the level of entropy of a system, the foundation of what is now called
statistical thermodynamics. For example, at the start of the steam engine’s cycle the
cylinder has low kinetic energy, a reduced volume of water vapor with an
accompanying low amount of internal pressure. At that moment there are fewer ways
of rearranging the microscopic water particles to leave the cylinder unchanged.
The engine then begins the cycle with low energy and low entropy. As the cylinder
absorbs heat in the form of kinetic energy its volume expands and its pressure
rises together with a corresponding increase in the number of ways in which
microscopic particles can be rearranged, thereby gaining an increase in
entropy. As the engine absorbs increasing thermal (kinetic) energy, it absorbs
increasing entropy as well. Importantly, for the engine to return to its
initial position, associated with low kinetic energy and entropy, it must
release waste heat, along with the accompanying entropy, into the environment
in order to complete the cycle. If it failed to release that waste heat, the
cylinder would be arrested at mid-cycle; the engine could not return to its
initial state because the internal temperature, pressure, and volume would
remain elevated.
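The counting described in the preceding paragraphs is captured by Boltzmann's entropy formula, S = k_B ln W, where W is the number of microstates compatible with the macrostate and k_B is Boltzmann's constant. A minimal sketch (the function name is mine, for illustration):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in joules per kelvin (exact SI value)

def boltzmann_entropy(microstates):
    """S = k_B * ln(W) for a macrostate realizable in W microstates."""
    return K_B * math.log(microstates)

# A macrostate realizable in only one way has zero entropy...
print(boltzmann_entropy(1))  # 0.0
# ...and entropy grows, logarithmically, as heating opens up more microstates.
print(boltzmann_entropy(10**6) > boltzmann_entropy(10**3))  # True
```

The logarithm is what makes entropy additive: combining two independent systems multiplies their microstate counts, so their entropies add.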
Greene summarizes this cycle as the “entropic two-step”,
any process of energy transfer characterized by the initial absorption of
energy and entropy from the environment allowing useful work to be performed,
in his example forcing the piston forward under increasing pressure. In a
needed second step there is then release of energy and entropy back to the environment.
If we audit the energy and entropy balance sheet over the course of the cycle,
we find first that the total energy is conserved (the First Law of
Thermodynamics): thermal energy absorbed by the engine is transferred to the
piston as work (a force acting through a distance) and then released back into
the environment. The entropy, however, received from and then released back
into the environment, will show a net increase despite the engine’s initially
low-entropy state, consistent with the second law.
In his publication Greene uses this cyclic flow of increasing
and then decreasing energy and entropy to describe the energetics of other
macroscopic systems, notably life forms. He uses the example of eukaryotic
cells, which couple the oxidation of covalent chemical bonds to a stepwise
sequence of reduction reactions that transfer chemical energy. The reduced
intermediates of this sequence provide the chemical energy to form new
high-energy covalent bonds, a type of cellular chemical battery available to
power useful cellular work. Like the steam engine, though, the cell maintains
the orderly, low-entropy state characteristic of life, often for long periods
of time, but always at the expense of releasing waste heat and entropy back
into the surrounding environment, the universe in which it lives.
Greene uses a heuristic to help us understand the probabilistic
nature of entropy by examining the outcome of a bag of pennies released onto a
table. Imagine one such toss of 100 pennies in which all landed heads-up. Such
a result would be exceedingly rare: there is only a single configuration of
the pennies that accounts for it. If another toss left just one penny tails-up
and the other 99 heads-up, that outcome would be 100 times more likely than
all heads, since any one of the hundred pennies could be the one landing
tails-up. Extending this example, two tails-up would be about 5,000 times more
likely than no tails-up, and three tails-up more than 150,000 times more
likely. Continuing this sequence, a toss resulting in a 50/50 distribution of
heads and tails would be roughly 10^29 times more likely than the all-heads
toss. There are innumerable ways the 100 pennies might land while leaving a
50/50 distribution unchanged, compared with the single way an all-heads
distribution can occur. The all-heads toss, being exceedingly rare, would be
highly surprising; a 50/50 split is overwhelmingly likely and thus entirely
expected. Outcomes that occur with low likelihood can be considered low
entropy: they are unique, special, less common. Expected outcomes are not
surprising; they are common and thus of high entropy. On a statistical basis,
high-entropy events are far more likely to occur than low-entropy events. In
our universe, as event follows event along the time course of the cosmos,
higher-entropy states will always be more likely to be observed than
lower-entropy ones. The more events, the higher the chance of increased
entropy, which is one way of restating the second law of thermodynamics.
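The ratios in the penny example can be checked directly: the number of indistinguishable arrangements producing k tails among 100 pennies is the binomial coefficient C(100, k). A quick check using only the Python standard library:

```python
from math import comb

total_pennies = 100

# Number of indistinguishable "microstates" (penny arrangements)
# producing each macrostate (count of tails-up pennies).
ways = {k: comb(total_pennies, k) for k in (0, 1, 2, 3, 50)}

print(ways[1] / ways[0])            # 100.0 -> one tail is 100x likelier than none
print(ways[2] / ways[0])            # 4950.0 -> "about 5,000 times"
print(ways[3] / ways[0])            # 161700.0 -> "more than 150,000 times"
print(f"{ways[50] / ways[0]:.2e}")  # 1.01e+29 -> the 10^29 figure
```

The jump from 100 to 10^29 as the tail count moves from 1 to 50 is the statistical engine behind the second law: high-entropy macrostates simply have astronomically more microstates.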
In his book, Greene tackles an important implication of
the second law: the early universe must originally have been in a
state of low entropy. He describes how this might have arisen at the time of
the ‘Big Bang’ and, in doing so, develops a model for how that low-entropy
beginning would then have evolved through the effects of quantum fluctuations
and gravity, leading to mass agglomeration and then nuclear fusion. I
encourage the interested reader to pursue this further; I believe you’ll find
it rewarding.
With our new probabilistic understanding of entropy, we can
look back at our earlier discussion of epigenetic entropy to better
understand how a complex, low-entropy system such as the epigenome, bursting
with accurate information governing gene network expression, will by chance
alone be subject to corruption of that information set. A highly complex
macroscopic system like the epigenome, charged with directing development and
controlling gene expression, relies on unique sets of epigenomic
configurations, the DNA base methylations and chromatin marks, for reliable
homeostatic cell behavior. Each time the epigenome is ‘opened’, the transition
from heterochromatin to euchromatin within the cell’s nucleus, it is
vulnerable to untoward perturbations causing, by chance alone, new
configurations to appear. There will be a natural tendency, governed by the
second law, for the number of microscopic patterns of DNA methylation and
histone marks to increase, some of which may prove oncogenic. Like a toss of a
bag of one hundred pennies, each cell division, each step in organ development
and repair, and each episode of genetic reprogramming risks the occurrence of
higher-entropy patterns that degrade the fidelity of information transfer.
Over time, disorganization will always be more likely than organization. As a
corollary, the more often organ repair through lineage plasticity is required,
for example in response to environmental stress, or the more often injury
provokes a hyperplastic response marked by increased cell mitosis, the higher
the chance that a low-entropy epigenome will be transformed into one of higher
entropy.
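The argument can be made concrete with a toy simulation (not a biological model; the parameter values are arbitrary choices of mine): start a population of cells sharing one identical ‘methylation pattern’ and flip each site with a small probability at every division, and the number of distinct patterns in the population grows round after round.

```python
import random

random.seed(0)  # reproducible toy run

N_SITES = 50    # methylation sites per cell (arbitrary toy scale)
N_CELLS = 1000  # cells in the population
FLIP_P = 0.01   # chance each site is perturbed at a division

def divide(pattern):
    """Copy a pattern, flipping each site with probability FLIP_P."""
    return tuple(site ^ (random.random() < FLIP_P) for site in pattern)

# Start with every cell sharing one "correct" low-entropy pattern.
cells = [tuple([1] * N_SITES)] * N_CELLS

diversity = []
for generation in range(20):
    cells = [divide(p) for p in cells]
    diversity.append(len(set(cells)))  # number of distinct patterns

# Pattern diversity rises over successive divisions.
print(diversity[0], diversity[-1])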
Acknowledging the danger of extrapolating scientific
principles into the realm of the social and the political, might the
fragmentation we now observe in our American body politic represent an example
of this unceasing rise in ‘social entropy’? As trust erodes and faith in
collective action diminishes, fed by, among other things, the toxic influence
of social media, we experience a dissolution of consensus and compromise. On
the other hand, it might also be that a trustworthy leader, emulating the
entropic two-step, could infuse the electorate with the confidence we seek in
our communities and in one another, the political equivalent of transferring
‘useful’ energy into our system to regain each other’s trust and thereby
lower our social entropy.
In an upcoming post later this summer, Cancer Ecology
Commentary will shift from molecular biology to begin
a series of posts examining the role of biological diversity. An initial post
in August will begin at the community level, investigating strategies for
fostering prosocial behavior across a heterogeneous society’s many subgroups.
To do this we will discuss a review by sociologists Delia Baldassarri and
Maria Abascal of New York University, in conjunction with a summary of the
work of public intellectual Robert Putnam, author of Bowling Alone.