Thursday 24 May 2018

Paper Summary: Thermodynamics with continuous information flow

Thermodynamics with continuous information flow (2014), J. M. Horowitz and M. Esposito

This paper discusses a way of quantifying flows of information in the same way as flows of energy. Consider two random processes, each of which hops between discrete states. Two independent systems of this type can be coupled together so that the state of the first system, x, affects the transition propensities of the second system, y, and vice versa. At any given instant either x or y can change, but crucially not both. This property defines the system as bipartite. The diagram below (taken from the paper) illustrates the creation of a bipartite system from x and y.


An example of a system of this type is a ligand binding to a receptor that catalyses the creation of an activated complex. It is not possible for a ligand to bind to the receptor and for an activated complex to be created in a single step: the two events must happen sequentially. However, whether or not a ligand is bound to the receptor does affect the rate at which activated complexes are formed.
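A system like this can be sketched in a few lines of code. Below is a toy bipartite process (the rates are made up, not taken from the paper): x and y are single bits, each one's flip rate depends on the current state of the other, and every jump changes exactly one of the two coordinates.

```python
import random

# Toy bipartite jump process: x and y are bits, and each subsystem's
# flip rate depends on the other's state.  The rates are hypothetical.

def moves(state):
    x, y = state
    x_flip_rate = 1.0 if y == 0 else 3.0   # x's propensity depends on y
    y_flip_rate = 2.0 if x == 0 else 0.5   # y's propensity depends on x
    # each allowed move changes exactly one coordinate: bipartite
    return [((1 - x, y), x_flip_rate), ((x, 1 - y), y_flip_rate)]

def gillespie_step(state, rng):
    options = moves(state)
    total = sum(r for _, r in options)
    dt = rng.expovariate(total)            # waiting time until next jump
    pick = rng.random() * total
    acc = 0.0
    for new_state, r in options:
        acc += r
        if pick < acc:
            return new_state, dt
    return options[-1][0], dt

rng = random.Random(0)
state = (0, 0)
for _ in range(1000):
    new_state, dt = gillespie_step(state, rng)
    # the bipartite property: exactly one coordinate changes per jump
    assert (new_state[0] != state[0]) != (new_state[1] != state[1])
    state = new_state
```

The assertion inside the loop is the bipartite condition in executable form: a jump that changed both bits at once would fail it.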


If a system is bipartite, every property of the system can be decomposed into an x-subsystem component and a y-subsystem component, separating out the contributions from transitions involving each subsystem. Of the many useful properties of such a system, the key one for a rigorous thermodynamic analysis is the entropy. The second law of thermodynamics states that the entropy of the universe must increase with time, but this holds only for the universe as a whole: on a more microscopic level the entropy of some parts of the system can decrease, as long as other parts make up the shortfall. Traditionally we break the change in entropy down into the change in entropy of the system itself and the change in entropy of the environment due to the system. If we further decompose these into separate contributions from our two subsystems, an extra term appears for each, quantifying the information flow around the system.



To the x component a term is added which accounts for how much information x learns about y during a single step; similarly, to the y component a term is added which accounts for how much information y learns about x during a step. Splitting the entropy in this way gives a sense of how information moves around the system: it flows from x to y or from y to x. Incorporating the information term allows us to apply the second law to each subsystem individually, rather than only at the level of the whole system. It also lets us understand how a decrease in entropy in one subsystem can be caused by information flowing from the other subsystem, in a manner that is fully consistent with the second law.
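These cancelling flows can be checked numerically in a toy sensor model (made-up rates: x flips freely, y tries to copy x, so information should flow from x to y). Each flow term is a sum over net fluxes weighted by how much the jump shifts the conditional distribution of the other variable. At steady state the mutual information is constant, so the two flows must sum to zero: y learns about x (positive flow) while x's free flipping destroys that correlation (negative flow).

```python
import numpy as np

# Same kind of toy bipartite sensor (hypothetical rates): x flips
# freely, y is pulled toward copying x.

states = [(0, 0), (0, 1), (1, 0), (1, 1)]
idx = {s: i for i, s in enumerate(states)}

def rate(s, t):
    (x, y), (x2, y2) = s, t
    if y == y2 and x != x2:
        return 1.0                        # x-transition, blind to y
    if x == x2 and y != y2:
        return 2.0 if y2 == x else 0.2    # y-transition, copies x
    return 0.0

W = np.zeros((4, 4))
for s in states:
    for t in states:
        if s != t:
            W[idx[t], idx[s]] = rate(s, t)
np.fill_diagonal(W, -W.sum(axis=0))

eigvals, eigvecs = np.linalg.eig(W)
p = np.real(eigvecs[:, np.argmin(np.abs(eigvals))])
p = p / p.sum()                           # steady-state distribution

px = {x: p[idx[(x, 0)]] + p[idx[(x, 1)]] for x in (0, 1)}  # marginal of x
py = {y: p[idx[(0, y)]] + p[idx[(1, y)]] for y in (0, 1)}  # marginal of y

def net_flux(s, t):
    return rate(s, t) * p[idx[s]] - rate(t, s) * p[idx[t]]

# flow due to x-jumps: sum of J * ln[ p(y|x_new) / p(y|x_old) ]
I_dot_x = sum(
    net_flux((0, y), (1, y))
    * np.log((p[idx[(1, y)]] / px[1]) / (p[idx[(0, y)]] / px[0]))
    for y in (0, 1))

# flow due to y-jumps: sum of J * ln[ p(x|y_new) / p(x|y_old) ]
I_dot_y = sum(
    net_flux((x, 0), (x, 1))
    * np.log((p[idx[(x, 1)]] / py[1]) / (p[idx[(x, 0)]] / py[0]))
    for x in (0, 1))

print(I_dot_x, I_dot_y)  # negative and positive; sum is zero at steady state
```

Here the x subsystem's rates are symmetric, so its only source of entropy production is the information it erases; the y subsystem must dissipate at least as much entropy as the information it gains, exactly the Maxwell-demon-style bookkeeping the decomposition is designed to capture.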