Kinetic roughening of the urban skyline
A neat little paper applying a statistical-mechanical model to city skylines. Kinetic roughening is a nice example of a simple model that illustrates some of the main themes of modern statistical mechanics, including scaling and continuum limits. In the discrete regime, these models consist of a lattice of heights that evolve in time according to rules coupling each site to its neighbours. The roughness is defined as the root-mean-square fluctuation of the heights about their mean. This roughness displays scaling behaviour; in particular, for this paper, after sufficient time has passed the roughness saturates at a value that scales as the system size to some power. The paper analyses a huge database of 10^7 buildings in the Netherlands and calculates this saturation exponent for many cities. Where there is sufficient data, they find that the cities can be grouped into two sets with different exponents, corresponding to the two main universality classes of roughening models. Finally, they remark that it would be interesting to examine each city's building regulations to explain why it falls into a given universality class.
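To make the roughness scaling concrete, here is a minimal sketch of one classic discrete roughening model, the Family model (random deposition with surface relaxation, in the Edwards-Wilkinson universality class). This is an illustration of the general framework, not the paper's own analysis; the function name and parameters are made up for the example.

```python
import math
import random

def simulate_width(L, deposits, seed=0):
    """Family model (random deposition with surface relaxation) on a
    periodic 1D lattice of L sites: each particle lands on a random
    site and settles on the lower of its two neighbours if one of
    them lies below the landing site."""
    rng = random.Random(seed)
    h = [0] * L
    for _ in range(deposits):
        i = rng.randrange(L)
        left, right = h[(i - 1) % L], h[(i + 1) % L]
        if left < h[i] or right < h[i]:
            i = (i - 1) % L if left <= right else (i + 1) % L
        h[i] += 1
    # Roughness: root-mean-square fluctuation of heights about the mean.
    mean = sum(h) / L
    return math.sqrt(sum((x - mean) ** 2 for x in h) / L)

# After long times the roughness saturates at a value growing as
# L**alpha (alpha = 1/2 for this, the Edwards-Wilkinson, class).
```

Running this for increasing L after saturation and fitting a power law to the saturated widths is exactly the kind of exponent measurement the paper performs on skylines.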
Characterization and mitigation of gene expression burden in mammalian cells
In this work, the authors investigate the burden that synthetic circuits impose on mammalian cells, studying the transcriptional and translational burden caused by the sharing of cellular resources. They were able to mitigate the effects of resource limitations using a microRNA-based incoherent feedforward loop (iFFL) motif. They conclude that with burden-aware designs, synthetic circuits subject to resource perturbations can behave with greater accuracy and predictability.
A basic introduction to large deviations: Theory, applications, simulations
This approachable set of lecture notes was written following a 2009 paper reviewing the theory of large deviations (LDT). The techniques of LDT provide a framework for a rigorous formulation of statistical mechanics. Most physicists and engineers have used LDT techniques at some point or another, but might not have been aware of it! In essence, LDT quantifies the rate at which the probability that the sample mean S of n independent, identically distributed random variables (each with mean u) deviates from u is exponentially suppressed with increasing sample size n. This can be viewed as a generalisation of the central limit theorem.
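As a quick numerical illustration of the core statement (a hypothetical sketch, not taken from the notes): for n fair coin flips, the probability that the sample mean exceeds a > 1/2 decays like exp(-n I(a)), where I is the Cramér rate function I(a) = ln 2 + a ln a + (1 - a) ln(1 - a).

```python
import math
import random

def tail_prob(n, a, trials=100_000, seed=1):
    """Monte Carlo estimate of P(S_n >= a), where S_n is the sample
    mean of n fair coin flips (i.i.d. Bernoulli(1/2))."""
    rng = random.Random(seed)
    hits = sum(
        sum(rng.random() < 0.5 for _ in range(n)) / n >= a
        for _ in range(trials)
    )
    return hits / trials

def rate(a):
    """Cramer rate function for Bernoulli(1/2):
    I(a) = ln 2 + a*ln(a) + (1 - a)*ln(1 - a); zero at a = 1/2."""
    return math.log(2) + a * math.log(a) + (1 - a) * math.log(1 - a)

# P(S_n >= a) ~ exp(-n * I(a)): doubling n roughly squares the
# (already small) tail probability's exponential factor.
```

Comparing `tail_prob(20, 0.7)` with `tail_prob(50, 0.7)` shows the suppression growing with n, and -(1/n) ln P approaches I(0.7) for large n.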
Programmed spatial organization of biomacromolecules into discrete, coacervate-based protocells
The coacervate created in this work was able to recruit His-tagged enzymes into the protocell via interaction with nickel ions. The increased local concentration of enzymes accelerated the enzymatic reaction. The proteins could also be released by cleaving a specific site.
DNA programmed chemical synthesis of polymers and inorganic materials
The present review offers a general perspective on how programmable DNA interactions have been exploited in chemical synthesis. It first discusses the possibility of using directed interactions to perform templated polymer synthesis, analogous to how polymers are copied in living cells. It then covers how DNA can direct the conjugation and arrangement of different polymeric materials, with aims as diverse as functionalisation for in vivo applications and the directed assembly of conducting polymers, as well as the assembly of metallic nanoparticles into prescribed arrangements with applications in plasmon-resonance-based sensing. While showing that DNA nanotechnology is a powerful tool for materials-science applications, the review fails to specify the challenges the field must overcome to be applied in other areas (such as MOF, COF or polyoxometalate synthesis), where implementing programmable interactions could become a paradigm shift.
Emergence of low-symmetry foldamers from single monomers
Molecular self-assembly of simple components often gives rise to complex features in dynamic combinatorial libraries. In this article, the authors describe the emergence of large molecules with low symmetry, unlike most previously described systems. When monomers capable of forming disulfide bonds among themselves are equilibrated for several days, different libraries show the formation of predominantly one product, or a very small family of products. In some cases, large molecules with very low symmetry (17 or 23 monomers) were generated. Further analysis by ion-mobility mass spectrometry, NMR and CD spectroscopy showed that these large molecules are not simple 2D circular molecules; rather, they each adopt a unique, complex folded structure. This was confirmed by X-ray crystal structures of the synthesised molecules. The authors conclude that the semi-rigid backbones of the molecules, and the presence of several diverse sites for non-covalent interaction, which can overcome the inherent instability of large macrocycles, are crucial for the spontaneous formation of new complex foldamers.
Fuel-driven transient DNA strand displacement circuitry with self-resetting function
The authors present an enzyme-driven mechanism to allow continuous cycling of nucleic acid strand displacement circuits. The basic idea is to have strands that on their own are uncompetitive at displacing an output from a complex, but which can be ligated to a helper duplex which in turn can be ligated to the complex, allowing displacement to proceed. The ligation is performed by specific ligase enzymes and consumes ATP. Subsequently, restriction enzymes can cut the strands, allowing the system to revert to its initial condition. In principle, such a system could maintain a dynamic steady state of constantly cycled outputs, but the evidence for that in this manuscript is limited. Rather, transient spikes are observed in response to a signal, since there is apparently not enough ATP to sustain the output for a long period. An interesting open question is whether the scheme presented can be generalised to a large strand displacement network.
Feedback regulation of crystal growth by buffering monomer concentration
Many reactions, like crystallisation, need to operate in a very specific reagent concentration regime. However, even if the concentration requirements are initially met, the reagents are consumed as the reaction proceeds and the reaction regime changes. The authors propose a method for maintaining a constant reagent concentration using a buffering species. The mechanism consists of a pool of inactive DNA bricks in equilibrium with their active form via a toehold-exchange reaction. This mechanism is then used to grow a population of DNA nanotubes of regular size. When the active bricks are consumed, the equilibrium shifts towards the formation of new active bricks. However, the buffering power of this method is still limited, and the desired concentration can only be maintained for a few hours. Further increasing the concentration of the buffering species would block the reaction sites of the active monomers, hindering nanotube formation.
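A toy equilibrium calculation (hypothetical numbers and a deliberately simplified reaction scheme, not the paper's quantitative model) illustrates how such a buffer works: an inactive complex I (brick bound to a protector strand) releases active brick A and free protector P with equilibrium constant K, so consuming A shifts the pool towards releasing more.

```python
import math

def active_conc(K, brick_tot, prot_tot, consumed):
    """Equilibrium free active-brick concentration [A] after `consumed`
    bricks have been removed (e.g. incorporated into nanotubes).
    Species: inactive complex I (brick.protector), active brick A,
    free protector P.  Mass balance: I + A + consumed = brick_tot and
    I + P = prot_tot.  Equilibrium: A*P = K*I, which gives a quadratic
    in A solved here with the positive root."""
    b = prot_tot - brick_tot + consumed + K
    return (-b + math.sqrt(b * b + 4 * K * (brick_tot - consumed))) / 2
```

With a large inactive reservoir, e.g. `active_conc(1.0, 1000.0, 3000.0, x)`, the free active concentration changes only modestly even after an amount of monomer far exceeding the free pool has been consumed, which is the buffering effect; the residual drift is one way to see why the buffering power is ultimately limited.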