Of course, there are a number of issues with this. You need to be confident about your theory that relates measurements to equations, and you also need to be confident about the measurements themselves. But perhaps the most important caveat is that the equations that you can solve by performing measurements are often only interesting in predicting the outcome of the measurements themselves; the whole thing becomes rather incestuous and not obviously useful.

In a recent paper (here), Manoj Gopalkrishnan shows how the complexity of a single chemical reaction can be harnessed to perform a useful computation. The paper is quite involved, but the essence is the following. Let's say I have the chemical species X₁, X₂ and X₃, which can interconvert via the reaction X₁ + X₃ ⇌ 2 X₂, with both left and right sides being equally stable (such a system could, in principle, be created from DNA). If I start with certain initial concentrations of molecules, x₁, x₂ and x₃, the system will evolve to reach some equilibrium in the long-time limit, x'₁, x'₂ and x'₃. This equilibrium represents a maximization of the system's entropy, subject to the constraint that (x₁, x₂, x₃) can only be changed via the X₁ + X₃ ⇌ 2 X₂ reaction.
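To make this concrete, here is a minimal sketch (my own illustration, not code from the paper) of mass-action kinetics for X₁ + X₃ ⇌ 2 X₂ with equal forward and backward rate constants; the rate constant and initial concentrations are arbitrary choices:

```python
# Illustrative mass-action kinetics for X1 + X3 <=> 2 X2 with equal
# forward and backward rate constants (k, dt and the initial
# concentrations are arbitrary choices, not taken from the paper).

def equilibrate(x1, x2, x3, k=1.0, dt=0.001, steps=200_000):
    """Fixed-step Euler integration of the mass-action ODEs."""
    for _ in range(steps):
        # Net forward flux of X1 + X3 -> 2 X2 (negative if reverse wins).
        flux = k * (x1 * x3 - x2 * x2)
        x1 -= flux * dt
        x3 -= flux * dt
        x2 += 2 * flux * dt
    return x1, x2, x3

x1, x2, x3 = equilibrate(0.5, 0.1, 0.4)
# At the steady state, detailed balance holds (x2**2 == x1 * x3),
# while x1 + x2 + x3 and x1 - x3 keep their initial values.
print(x1, x2, x3)
```

Note that each step only moves the concentrations along the single reaction coordinate, so the sums x₁ + x₂ + x₃ and x₁ − x₃ are conserved; these conserved quantities are exactly the constraints under which the entropy ends up maximized.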

Dr Gopalkrishnan shows that the constrained optimization performed by the chemical reaction solves a problem of wider interest. Let's say we have a random variable that can take three distinct values y₁, y₂ and y₃ (in essence, a three-sided die). This die might be biased, with the probabilities of seeing each side not equal to 1/3. Further, we might have a reason to think that the probabilities are constrained by underlying physics, so that only certain combinations of probabilities p(y₁), p(y₂) and p(y₃) are possible. So we know something about our die, but not the specifics. If someone rolls the die several times and presents us with the results, can we estimate the most likely values of p(y₁), p(y₂) and p(y₃)?

Dr Gopalkrishnan shows that, for a certain very general type of constraint on the probabilities (a "log-linear model"), the procedure for finding this "maximum likelihood estimate" of p(y₁), p(y₂) and p(y₃) is *exactly* identical to that performed by an appropriate chemical system in maximizing its entropy under constrained particle numbers. Different log-linear constraints on p(y₁), p(y₂) and p(y₃) translate into different choices of the reaction, which doesn't have to be X₁ + X₃ ⇌ 2 X₂. Given the appropriate reaction, all we have to do is feed in the data (the results of the die rolls) as the initial conditions (x₁, x₂, x₃), and the eventual steady state (x'₁, x'₂, x'₃) gives us our estimate of (p(y₁), p(y₂), p(y₃)).
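As a sanity check on this correspondence (again my own sketch, with made-up die-roll counts), we can compute the maximum likelihood estimate directly for the log-linear family matching the X₁ + X₃ ⇌ 2 X₂ reaction: distributions of the form (1, r, r²)/(1 + r + r²), which are exactly those satisfying p(y₂)² = p(y₁)p(y₃). The answer agrees with the chemical steady state reached from the empirical frequencies:

```python
# Direct maximum-likelihood estimation for the log-linear model
# p(r) = (1, r, r^2) / (1 + r + r^2), whose members satisfy
# p2^2 = p1 * p3.  The counts below are illustrative, not real data.
import math

counts = [50, 10, 40]  # observed rolls of sides y1, y2, y3

def log_likelihood(r):
    z = 1 + r + r * r
    probs = [1 / z, r / z, r * r / z]
    return sum(c * math.log(p) for c, p in zip(counts, probs))

# One-dimensional grid search over the model parameter r.
best_r = max((i / 1000 for i in range(1, 5000)), key=log_likelihood)
z = 1 + best_r + best_r ** 2
p = [1 / z, best_r / z, best_r ** 2 / z]
# This is the same answer the reaction reaches when started from the
# empirical frequencies (0.5, 0.1, 0.4): roughly (0.385, 0.331, 0.285).
print(p)
```

The conserved quantities of the reaction play the role of the sufficient statistics of the log-linear model, which is why feeding the data in as initial concentrations gives the right answer.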

In principle, this argument generalizes to much more complex systems than the illustration provided here, and the principle of maximum likelihood estimation isn't only applicable to biased dice. Of course, this is a long way from a physical implementation, and even further from an actual useful device, but it does illustrate the potential of harnessing the complexity of physical systems rather than trying to force them to approximate digital electronics. Going further in this direction, the chemical system will fluctuate about its steady-state average (x'₁, x'₂, x'₃); it may be that these fluctuations can be directly related to our uncertainty in our estimate of (p(y₁), p(y₂), p(y₃)).*

*In detail, the distribution of (x₁, x₂, x₃) in the steady state may be related to the posterior probability of (p(y₁), p(y₂), p(y₃)) given a flat prior and the data.
