Free Energy and Entropy

A progressive and unrelenting increase in disorder characterizes our world (and universe) as we understand it. Spontaneous processes have an intrinsic tendency to waste or dissipate useful (free) energy. By minimizing the misallocation and misuse of Free Energy, biological systems maximize their ability to resist that disorder, maintaining local order while exporting entropy to their surroundings.

Revisiting the Hierarchy of Sciences…

…at some point, the study of Entropy began with Randomness, which had long been explored as probability and chance.

Within thermodynamic systems, Gibbs Free Energy was studied for chemical reactions at constant temperature and pressure, while Helmholtz Free Energy was the analogue for processes at constant temperature and volume (e.g., reactions in a sealed bomb calorimeter). The understanding evolved to encompass particle statistics, the probabilities of microstates and macrostates, and their equilibria in statistical thermodynamics/mechanics...
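In symbols, the two free energies differ only in whether enthalpy H or internal energy U is used, with T the temperature and S the entropy:

$$
G = H - TS \qquad A = U - TS
$$

In both conventions, a spontaneous process is one that lowers the relevant free energy: ΔG < 0 at constant temperature and pressure, ΔA < 0 at constant temperature and volume.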

Basically, the study of Entropy went from chance and randomness, to phases of matter (solid, liquid, gas), to particles and statistical probabilities, and finally to information.

…and eventually took the form of Information Entropy, following the publication of A Mathematical Theory of Communication by Claude Shannon in 1948, wherein he analyzed the statistics of lost information, or uncertainty. This uncertainty about outcomes came to be measured in binary digits, or bits.
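Shannon's measure of that uncertainty, for a source X whose outcomes x occur with probabilities p(x), is:

$$
H(X) = -\sum_{x} p(x)\,\log_2 p(x) \ \text{bits}
$$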

A fair coin will have an Information Entropy of 1 binary digit (bit) per flip, because the two equally likely outcomes can be labeled Heads = 0 or Tails = 1, and one bit is exactly enough to record which one occurred.

On the other hand, a fixed coin will have no Information Entropy – 0 bits per flip – because the outcome is already known, so no uncertainty exists.
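A minimal sketch in Python (function name hypothetical) that evaluates Shannon's formula for both coins:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

# Fair coin: Heads and Tails equally likely -> 1 bit of uncertainty per flip
print(shannon_entropy([0.5, 0.5]))   # 1.0

# Fixed coin: the outcome is already known -> no uncertainty
print(shannon_entropy([1.0, 0.0]))   # 0.0
```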

No uncertainty --> no anticipation. Is this why the market will buy the rumor, sell the news?

Subsequent advances in Information Theory were combined with Probability theory to spawn a family of entropy measures, such as conditional entropy and joint entropy… along with yet another kind of Free Energy.
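These measures fit together neatly; for example, joint and conditional entropy are linked by the chain rule:

$$
H(X, Y) = H(X) + H(Y \mid X)
$$

In words: the total uncertainty of a pair of outcomes is the uncertainty of the first plus whatever uncertainty about the second remains once the first is known.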
