Wednesday 19 October 2022

Novel techniques in synthetic biology: ultrasound, CRISPR and more

Ultrasensitive ultrasound imaging of gene expression with signal unmixing

Acoustic reporter genes (ARGs) that encode air-filled gas vesicles enable ultrasound-based imaging of gene expression in genetically modified bacteria and mammalian cells, facilitating the study of cellular function in deep tissues. Despite the promise of this technology for biological research and potential clinical applications, the sensitivity with which ARG-expressing cells can be visualised is currently limited. The authors present BURST, a method that improves the cellular detection limit by more than 1,000-fold compared to conventional methods. BURST takes advantage of the unique temporal signal pattern produced by gas vesicles as they collapse under acoustic pressure above a threshold defined by the ARG. BURST can detect ultrasound signals from individual bacteria and mammalian cells, enabling quantitative single-cell imaging.

Dynamic modulation of enzyme activity by synthetic CRISPR-Cas6 endonucleases

RNA scaffolds can increase flux through a given metabolic pathway by keeping the pathway's enzymes in close proximity, increasing the local concentration of pathway intermediates. Here, RNA scaffolds are built using the crRNA-Cas6 system: enzymes of interest are fused to Cas6, which in turn binds a specific RNA loop. By engineering complementarity into the RNA sequence upstream of the Cas6-bound loop, the RNA scaffold assembles via RNA:RNA hybridisation. The authors can then use other input RNA strands to trigger the assembly and disassembly of the scaffold through toehold-mediated strand displacement (TMSD) reactions. With this method, the authors demonstrate controllable scaffold assembly, breakdown and cycles of assembly/disassembly in vivo.

Dynamic modulation of enzyme activity by synthetic CRISPR–Cas6 endonucleases | Nature Chemical Biology

A domain-level DNA strand displacement reaction enumerator allowing arbitrary non-pseudoknotted secondary structures

Domain-level DNA strand displacement CRN simulators: I've read up on Visual DSD and the Peppercorn enumerator. The aims of these computational models are (1) to enumerate the Watson-Crick-bonded domain-level complexes that could form when designing DNA strand displacement circuits, (2) to approximate the kinetics of the system, and (3) to simulate it so that mechanisms can be inspected and designs updated to yield the desired results, whether by inspection or by algorithmic approaches. Classic Visual DSD could describe only a very limited set of toehold-mediated strand displacement and other binding mechanisms. More recently, Visual DSD has been converted to use "Logic DSD" semantics, which can describe a very wide range of customisable reactions, including DNA strand displacements and enzymatic reactions such as ligation and cleavage. It is unclear where to find and how to use the Logic DSD version, as the link provided in the paper no longer works. Classic Visual DSD is incapable of simulating the remote toehold used in handhold-mediated strand displacement.

Peppercorn enumerates bimolecular binding interactions and a multitude of intramolecular reactions for non-pseudoknotted secondary structures of DNA complexes (where kinetics and thermodynamics are well characterised).

Approximate rates for each type of reaction, based on domain lengths, can be generated within the software. Once the complexes have been enumerated and the rate of each reaction has been estimated, a CRN can be constructed. The combinatorics of enumerating all possible domain-level complexes can be challenging, as the number of complexes may explode with an increasing number of domains. The software therefore has built-in coarse-graining methods based on timescale separation: reactions are split into fast and slow. Unimolecular reactions may be fast, slow or negligible, while bimolecular reactions are slow (valid at low concentrations, below ~10 nM, since unimolecular fluxes scale as concentration while bimolecular fluxes scale as concentration squared). A "condensed" CRN featuring fewer species can then be constructed on the basis of timescale separation; it has the same slow dynamics as the original CRN but can be simulated more easily. Care must be taken when sequential bimolecular reactions are required: if all unimolecular reactions are assumed fast compared to bimolecular ones, a subsequent bimolecular reaction may never occur in the condensed CRN. No net production or degradation reactions are allowed in Peppercorn, so all strands are conserved. By default, toeholds shorter than 7 nt are reversible, branch migrations are irreversible, and zero-toehold branch migrations are not included. Peppercorn would be capable of simulating handhold-mediated strand displacement, but this would require implementing custom reactions and custom rates.
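As a toy illustration (my own, not from the paper) of why a condensed CRN reproduces the slow dynamics, the sketch below runs a Gillespie simulation of a full CRN with a slow bimolecular binding step and a fast unimolecular resolution step, and compares it with the condensed CRN in which the fast intermediate C has been removed. All species names and rate values are invented for illustration.

```python
import random

def gillespie(state, reactions, t_end, seed=0):
    # Stochastic simulation: reactions are (rate_const, reactants, products)
    # with mass-action propensities on molecule counts.
    rng = random.Random(seed)
    t = 0.0
    state = dict(state)
    while t < t_end:
        props = []
        for k, reac, prod in reactions:
            a = k
            for s in reac:
                a *= state[s]
            props.append(a)
        total = sum(props)
        if total == 0:
            break
        t += rng.expovariate(total)
        r = rng.random() * total
        for (k, reac, prod), a in zip(reactions, props):
            if r < a:
                for s in reac:
                    state[s] -= 1
                for s in prod:
                    state[s] = state.get(s, 0) + 1
                break
            r -= a
    return state

# Full CRN: slow bimolecular binding, then fast unimolecular resolution.
full = [(1e-4, ("A", "B"), ("C",)),   # slow binding
        (10.0, ("C",), ("D",))]       # fast resolution of intermediate C
# Condensed CRN: the fast intermediate C is coarse-grained away.
condensed = [(1e-4, ("A", "B"), ("D",))]

s1 = gillespie({"A": 50, "B": 50, "C": 0, "D": 0}, full, t_end=1e4)
s2 = gillespie({"A": 50, "B": 50, "D": 0}, condensed, t_end=1e4)
print(s1["D"], s2["D"])  # both close to 50: same slow dynamics
```

Note that strand conservation holds in both versions, mirroring Peppercorn's no-net-production constraint.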

Molecular filters for noise reduction

In classical signal processing, a filter takes an input signal and produces an output signal with reduced noise. This paper investigates three classes of bimolecular chemical reaction networks (CRNs) that act as filters by producing a new output chemical that tracks the input chemical but with reduced noise. One critical difference between classical filters and CRN filters is that CRNs have intrinsic noise associated with the stochastic firing of reactions as well as noise in the input signal, whereas classical filters only have noise in the input signal.

The first class of CRNs analysed is the linear filters; the paper shows, through classical frequency-domain analysis, that linear filters have the same transfer function as a low-pass filter, which attenuates high-frequency oscillations while preserving low-frequency ones. They show that the output of a linear filter is limited by the Poisson level: a lower bound on the variance of the output signal equal to the mean of the output signal.
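A minimal sketch (my own, not the paper's) of the Poisson level: the simplest linear filter, a constant-production/first-order-degradation birth-death process, has a Poisson stationary distribution, so its stationary variance equals its mean. Parameter values are arbitrary.

```python
import random

def birth_death(lam, mu, t_end, seed=1):
    # Gillespie simulation of the simplest linear filter:
    # constant production (rate lam) and first-order degradation (rate mu*x).
    rng = random.Random(seed)
    t, x = 0.0, 0
    samples = []
    while t < t_end:
        a_birth, a_death = lam, mu * x
        total = a_birth + a_death
        dt = rng.expovariate(total)
        samples.append((x, dt))  # state x held for duration dt
        t += dt
        x += 1 if rng.random() * total < a_birth else -1
    T = sum(w for _, w in samples)
    mean = sum(x * w for x, w in samples) / T
    var = sum((x - mean) ** 2 * w for x, w in samples) / T
    return mean, var

mean, var = birth_death(lam=50.0, mu=1.0, t_end=500.0)
# Stationary distribution is Poisson(50): variance ~= mean (the Poisson level)
print(mean, var)
```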

They then go on to investigate the annihilation module, which includes complex-formation reactions, and show that it can reduce the noise of the output below the Poisson level. They then introduce the annihilation filter, which is similar to the annihilation module except that the mean of the output signal is proportional to the mean of the input signal.

Finally, they suggest that the transcription of genes and the translation of mRNA into proteins can be seen as two cascaded linear filters, and that certain microRNA pathways might act as annihilation filters.

Molecular filters for noise reduction

The meeting of worlds: theoretical and physical understanding of biological processes

Locked nucleic acid-based DNA circuits with ultra-low leakage

This work aims to reduce leak in DNA-based strand displacement circuits by incorporating LNAs into strands. LNAs are modified nucleotides that retain base-pairing properties but bind DNA (or RNA/other LNAs) with increased stability. Here, LNAs are placed at the extremities of gate complexes in strand displacement circuits, and the authors report significant leak reduction in all circuits tested, with a signal loss of only ~10% compared to control DNA-only circuits.

Locked nucleic acids based DNA circuits with ultra-low leakage | SpringerLink

Detailed Balance = Complex Balance + Cycle Balance: A Graph-Theoretic Proof for Reaction Networks and Markov Chains

This work introduces the idea of cycle balance as a condition that is easier to check than formal balance. A system is detailed balanced if the net flux along every edge of the process is zero. A process is complex balanced if the total flux out of any node equals the total flux into that node, and formally balanced if the net flux around any cycle is zero. Previous work showed that if a process is both formally balanced and complex balanced, then it is detailed balanced. This work now defines a process to be cycle balanced if, for every cycle, there is both an edge that is faster in the anticlockwise direction and an edge that is faster in the clockwise direction. The authors then show that a system which is complex balanced and cycle balanced is detailed balanced.
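The cycle conditions above are close in spirit to Kolmogorov's classical cycle criterion for Markov chains: for a stationary chain, detailed balance holds if and only if the product of transition rates around every cycle equals the product in the reverse direction. The sketch below (my own toy example, not from the paper) compares a detailed-balanced chain with a driven one.

```python
def kolmogorov_cycle_check(Q, cycle):
    # Kolmogorov's criterion: compare the product of rates traversing
    # the cycle forwards with the product traversing it backwards.
    fwd, bwd = 1.0, 1.0
    n = len(cycle)
    for i in range(n):
        a, b = cycle[i], cycle[(i + 1) % n]
        fwd *= Q[a][b]
        bwd *= Q[b][a]
    return fwd, bwd

# 3-state chain whose rate ratios are consistent: detailed balanced.
Q_db = {0: {1: 2.0, 2: 1.0}, 1: {0: 4.0, 2: 1.0}, 2: {0: 1.0, 1: 0.5}}
# Driven chain: a net clockwise flux around the cycle 0 -> 1 -> 2 -> 0.
Q_driven = {0: {1: 2.0, 2: 1.0}, 1: {0: 1.0, 2: 2.0}, 2: {0: 2.0, 1: 1.0}}

f1, b1 = kolmogorov_cycle_check(Q_db, [0, 1, 2])
f2, b2 = kolmogorov_cycle_check(Q_driven, [0, 1, 2])
print(f1, b1)  # equal products: consistent with detailed balance
print(f2, b2)  # unequal products: not detailed balanced
```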

Physical constraints in intracellular signaling: the cost of sending a bit

Bryant and Machta analyse a number of distinct communication channels used by biological systems. They consider the scaling of the energy cost with the separation of transmitter and receiver and the size of both transmitter and receiver, for a given frequency. In general, communication beyond a certain characteristic lengthscale is prohibitively hard. This is particularly true for chemical signals that rely on diffusion, since it is impossible to send coherent waves via this mechanism. However, the low cost of chemical signalling makes it ideal over short distances. Neuron-like ion channels work well over longer distances, but also suffer from the inability to transmit coherent waves. Acoustic communication, which exploits coherent wave propagation, is best over large distances. Below the characteristic cut-off distance, each mechanism has its own particular scaling of cost with the system parameters.

[2205.15356] Physical constraints in intracellular signaling: the cost of sending a bit

Branching processes with resetting as a model for cell division

The paper describes modelling cell division as a process involving branching and resetting. Their model is based on 1-dimensional Brownian motion in a potential V, with the additional possibility for particles to branch and give birth to more particles, whose positions are reset relative to the position of the original particles. The model is applied to three different cell division schemes:

  1. Sizer: Cell size determines branching probability (cell size is used as the Brownian space dimension).
  2. Timer: Cell age determines branching probability (cell age is used as the Brownian space dimension).
  3. Adder: Branching probability is dependent on the added volume since birth.

For all three processes, branching reduces entropy (as it tends to make more probable states even more probable), and resetting increases entropy. They then claim that an efficiency can be measured by considering the ratio of branching entropy to resetting entropy, and explicitly calculate this value to be 0.41 for the timer cell-division scheme in the limit of an infinite branching-rate parameter.
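A rough toy sketch of the sizer scheme (my own, much simpler than the authors' Brownian-motion-in-a-potential model): size grows as a drifted random walk and, above a threshold, division (branching plus resetting) fires at a fixed rate, resetting the tracked daughter to half the parent's size. All parameter values are invented.

```python
import random

def sizer_trajectory(steps=20000, dt=0.01, drift=1.0, sigma=0.3,
                     k_div=5.0, threshold=2.0, seed=2):
    # Toy "sizer": size grows as drifted Brownian motion; above a
    # threshold, division fires at rate k_div and the tracked daughter
    # is reset to half the parent's size.
    rng = random.Random(seed)
    x = 1.0
    divisions = 0
    sizes = []
    for _ in range(steps):
        x += drift * dt + sigma * rng.gauss(0.0, 1.0) * dt ** 0.5
        if x > threshold and rng.random() < k_div * dt:
            x /= 2.0  # resetting: daughter inherits half the size
            divisions += 1
        sizes.append(x)
    return sizes, divisions

sizes, divisions = sizer_trajectory()
print(divisions, max(sizes))  # many division events; size stays bounded
```

The size-dependent division keeps the trajectory confined to a stationary band, which is the qualitative signature of a sizer as opposed to a timer.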

An autonomously oscillating supramolecular self-replicator

This work describes the design of a network of thiols and disulphides which autonomously forms and destroys a supramolecular assembly under a constant flow of peroxide as fuel. Disulphides with an aromatic head and a long aliphatic tail form micellar structures which can encapsulate free thiols. After a certain point, the micelle ruptures, releasing the thiols back into the reaction medium. The combination of such molecular and supramolecular events produces sustained oscillations in the concentrations of the components.

An autonomously oscillating supramolecular self-replicator | Nature Chemistry

Thursday 5 May 2022

Exploiting and understanding cellular molecules: nucleic acids and proteins

Strategies for Constructing and Operating DNA Origami Linear Actuators

The authors discuss protocol optimisation for the fabrication of a DNA origami rotaxane. The objective is to find the protocol that produces the highest yield of working rail/slider systems to use as linear actuators on the nanometre scale. Combined, these sliders should allow the fabrication of materials with subnanometre precision, using the slider as a "printing head".

A tractable genotype–phenotype map modelling the self-assembly of protein quaternary structure

The polyomino model is introduced as a high-level model of the assembly of protein sub-domains into larger complexes. The paper introduces the fundamental features of the model, starting from the genotype, moving to the formation of individual assembly kits, and finally to the formation of complete structures from those kits. The paper investigates polyominoes within the wider context of genotype-phenotype (GP) maps, with regards to genotype redundancy, phenotype bias, component disconnectivity, shape-space covering, as well as phenotypic robustness and its relationship to evolvability, and finds that these properties are quite similar to those of the RNA folding GP map. From a GP map perspective, this raises the question of whether these traits are inherent to self-assembling systems. Eventually, the polyomino model could yield insights into artificial systems like DNA tiles.

Paper-based microfluidics for DNA diagnostics of malaria in low resource underserved rural communities

The researchers create a paper-based lateral flow device based on loop-mediated isothermal amplification (LAMP) of DNA. Although PCR-based amplification assays remain the gold-standard NAAT, the requirement for trained staff and external power has limited their application in areas with reduced resources. LAMP has recently emerged as an easy-to-use alternative to PCR, owing to greatly simplified hardware requirements.

The paper discusses the use of paper origami techniques to perform blood sample preparation (including capture of the DNA molecules of interest on magnetic beads), followed by LAMP in a small microfluidic chamber. A hand-pressed button initiates lateral flow of the amplified DNA, which travels along a small membrane where anti-FITC antibodies and immobilised streptavidin are present as test and control lines. Upon successful attachment of species-specific ligands to the anti-FITC antibodies, a positive signal is generated, thereby enabling detection of disease.

The beauty of nucleic acids and the scope of their application

Continuous Cell-Free Replication and Evolution of Artificial Genomic DNA in a Compartmentalized Gene Expression System

In this study, the authors coupled DNA replication with gene expression in a cell-free system, performing the experiments in water-in-oil droplets over serial dilution cycles. Circular DNA is replicated through rolling-circle replication followed by homologous recombination, catalysed by phi29 DNA polymerase and Cre recombinase, both expressed from the DNA itself. Isolated circular DNAs accumulated several common mutations and exhibited higher replication ability than the original DNA, owing to improved performance as a replication template, increased polymerase activity, and a reduced inhibitory effect of the recombinase on polymerisation.

Fuel-Driven Dynamic Combinatorial Libraries 

The authors analyse the fuel-driven oligomerisation of isophthalic acid. They determine that while oligomer formation is mainly driven by fuel activation, the relaxation back to equilibrium (isophthalic acid monomers) is not symmetrical: instead of hydrolysing, relaxation proceeds by "reshuffling" of the longest oligomers with shorter ones to produce average-length oligomers. They also demonstrate that oligomers longer than 3 units can produce a kind of feedback interaction, creating insoluble complexes that better resist relaxation to equilibrium. There are, of course, also leak reactions that produce an undesired side product at a constant rate. The paper presents an interesting view of a well-established far-from-equilibrium assembly reaction, the oligomerisation of isophthalic acid. However, the control over the oligomerisation process is not impressive; the concentration of oligomers decreases exponentially with length.

A Comparison of Genotype-Phenotype Maps for RNA and Proteins

This paper attempts to identify differences and similarities between the RNA and HP-lattice protein GP maps. To ensure an appropriate comparison, the RNA GP map is restricted to a two-letter alphabet. Similarities include the tendency for some simple phenotypes to be highly overrepresented in genotype space. One interesting difference is that whereas most sequences in the RNA GP map tend to fold to a unique structure, only a small subset of sequences in the HP GP map do so. The average size of genotype sets is much smaller in the HP GP map, and it takes more mutations from a given sequence to cover the whole phenotype space than in the RNA GP map.

Exploiting cellular machinery for novel applications

Four different mechanisms for switching cell polarity

Cell polarity (asymmetric concentration profiles within the cell) plays a role in migration, division, differentiation, development and signalling. The mechanisms by which polarity is created and maintained are understood, but the dynamics of polarity are less well studied. Here they study a model in which the concentration profiles of three interacting molecular species, a polarisation marker, an antagonist, and a recruiter, change in response to signals of varying strength and duration. The signalling species either promote or suppress the rate constant for one reaction within the simple reaction network. This leads to altered phase-space stability of the system in the presence or absence of a signal. Through phase-space stability analysis and simulation, the authors exhaustively identify four distinct ways polarity can switch in response to a signal, which could be tested in future experimental studies.

Recovery of Information Stored in Modified DNA with an Evolved Polymerase

DNA is used for digital information storage, but the potential information loss from degradation, and the associated reading errors, challenge its wide-scale implementation. To address this, the authors propose using degradation-resistant analogues of natural nucleic acids (xNAs), and they used directed evolution to create a polymerase capable of transforming 2'-O-methyl templates into double-stranded DNA, with a fully functional proofreading domain to correct mismatches on DNA, RNA and 2'-O-methyl templates. In addition, they implemented a downstream analysis strategy that accommodates deletions to enable the large-scale use of nucleic acids for information storage.

Stretching of a fractal polymer around a disc reveals KPZ-like statistics

This paper studies the directed polymer model around a curved surface. This has implications in biology, for example the wrapping of DNA into chromosomes, as well as other situations where polymers are wrapped around rods or similar objects. They use various scaling techniques to analyse the model around a surface with local radius of curvature R, where the two ends of the polymer are fixed a distance S apart. The key observations are that the typical distance the polymer strays from the surface, Δ, scales as R^(1/3) for small radius of curvature and as S^α for large radius, with a crossover radius that scales as S^z. This is the same behaviour as surface-roughness models, mapping Δ to the roughness, R to time and S to the interface size. Further, they note that in a certain limit the exponents tend exactly to the 1+1D KPZ exponents.

Cooperative Branch Migration: A Mechanism for Flexible Control of DNA Strand Displacement

They demonstrate that if a strand can sequester a displaced domain once it detaches, the reaction will proceed even if it was initially unfavourable (ΔG > 0). They apply this to increase the rate of strand displacement reactions that produce a bulge or a mismatch.

Wednesday 4 May 2022

What are the odds?

Exact face-landing probabilities for bouncing objects: Edge probability in the coin toss and the three-sided die problem

The paper revisits the classical physics problem of the probability that a thick coin lands on its side. They study the mechanics of a cylinder of given thickness and radius, launched with random initial angular and linear velocities. The cylinder is then allowed to bounce inelastically until it comes to rest either on one of its faces or on its edge. They then use the areas of phase space corresponding to each resting configuration to compute the respective probabilities as a function of the thickness-to-diameter ratio. They find, for example, that a £1 coin has a probability of landing on its edge of ~1/1000, in decent agreement with experimental and simulated data. Further, they calculate the thickness-to-diameter ratio that would give a 1/3 probability of landing on the edge to be ~0.831, which is much closer to experimental and numerical studies than previous theoretical suggestions.
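For contrast with the paper's dynamical treatment, the classical static (solid-angle) estimate of the edge probability can be checked in a few lines. Under that naive model (my addition, not the paper's method), p_edge = t/√(t² + d²), which gives p = 1/3 at a thickness-to-diameter ratio of only ~0.354, far from the dynamical value of ~0.831 quoted above.

```python
import math

def edge_probability_static(t_over_d):
    # Naive static estimate: project the cylinder onto the unit sphere
    # of orientations and take the band subtended by the edge, giving
    # p_edge = t / sqrt(t^2 + d^2).
    r = t_over_d
    return r / math.sqrt(r * r + 1.0)

# Thickness/diameter ratio giving p = 1/3 under the static model:
# t/sqrt(t^2 + d^2) = 1/3  =>  t/d = 1/(2*sqrt(2)) ~= 0.354
ratio = 1.0 / (2.0 * math.sqrt(2.0))
print(round(edge_probability_static(ratio), 3))  # 0.333
```

The gap between 0.354 and 0.831 is exactly the point of the paper: inelastic bouncing strongly disfavours edge landings, so the static phase-space argument badly underestimates the required thickness.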

Hamiltonian memory: An erasable classical bit

The authors consider a model of an information-carrying system in which the information is carried in the phase of a particle moving around a ring. They show that a (magnetic) Hamiltonian can be used to compress a uniform phase distribution to a highly-peaked one, apparently at the cost of no work input. It is unclear to me why this doesn't violate the second law - is this density in phase angle not exploitable as a non-equilibrium store of work? If not, why not?

A coarse-grained biophysical model of sequence evolution and the population size dependence

They present a coarse-grained model of sequence evolution to ask how the speciation rate differs with effective population size. They rely on a framework analogous to thermodynamics, where the probability of a phenotype is determined by a balance between its true fitness and the entropy of the phenotype. Using a co-evolving DNA-protein binding system as a framework, they show that, for smaller populations, the most likely phenotype is closer to inviability than for larger populations, due to the greater entropic contribution in the former; hence speciation is faster for smaller populations. This is consistent with experimental evidence, although theirs is a first attempt to explain the observation theoretically.

Monday 29 November 2021

DNA in self-assembly, chemical reaction networks and more

DNA as a universal substrate for chemical kinetics

This paper discusses the development of control circuitry within a chemical system to direct molecular events using strand displacement reactions. The authors show basic methods for constructing unimolecular and bimolecular reactions (along with a short kinetic analysis of each reaction system). These two reaction types can then be used to construct any complex chemical reaction network (CRN). They demonstrate this by developing DNA reactions that recreate a Lotka-Volterra chemical oscillator, a limit-cycle oscillator, a chaotic system, and a 2-bit pulse counter.
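As a sketch of the kind of target dynamics such DNA circuits emulate, here is a simple Euler integration of the Lotka-Volterra CRN (X → 2X, X + Y → 2Y, Y → ∅). The rate constants and initial conditions are arbitrary choices of mine, not taken from the paper.

```python
def lotka_volterra(x0=2.0, y0=1.0, k1=1.0, k2=1.0, k3=1.0,
                   dt=1e-4, t_end=20.0):
    # Forward-Euler integration of the mass-action ODEs for the CRN
    #   X -> 2X (k1),  X + Y -> 2Y (k2),  Y -> 0 (k3)
    x, y = x0, y0
    xs = []
    for _ in range(int(t_end / dt)):
        dx = k1 * x - k2 * x * y
        dy = k2 * x * y - k3 * y
        x += dx * dt
        y += dy * dt
        xs.append(x)
    return xs

xs = lotka_volterra()
print(min(xs), max(xs))  # x oscillates but stays strictly positive
```

In the paper's scheme, each of these formal reactions is compiled into a set of DNA strand displacement gates whose slow dynamics approximate the same mass-action kinetics.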

Undesired usage and the robust self-assembly of heterogeneous structures

This work introduces a formal description of the "principle of undesired usage". This principle states that the yield of an assembling structure is maximised not by ensuring perfect stoichiometry between its components, but by tuning the reagents' chemical potentials, e.g. concentrations, to avoid undesired structures. They demonstrate this principle across several types of assembly processes, with several different modelling techniques.

SAT-assembly: A new approach for designing self-assembling systems

This paper presents a method for identifying patchy-particle assembly components for a given structure. The method is founded on SAT (boolean satisfiability), a well-known problem in computer science: finding boolean values that satisfy a given set of boolean clauses over a fixed number of variables. The paper goes into great detail on the variables and clauses that encode patchy-particle assembly as a SAT problem. The method is demonstrated on a cubic diamond lattice, and the resulting assembly kit is tested using an oxDNA simulation, which confirms that the correct structure is indeed formed.

Local time of random walks on graphs

This paper looks at finding expressions for averages of functions of the local time of a given state in a discrete-state, discrete-time Markov process. The local time of a state is the number of times that state is visited in a given time window. The approach taken by the authors is inspired by path-integral techniques from quantum physics. The paper provides a method for finding the z-transform of the average of a given function of the local time; the z-transform acts like a generating function for these averages over a time window n. Specifically, the average of the function of local time up to time n is the coefficient of z^-n when the z-transform is expanded in powers of 1/z. This has the limitation that extracting the desired average from the z-transform can be a lot of work, but overall it is a nice, more interesting way of obtaining these local-time statistics.
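For small chains, the same averages can be checked directly by propagating the distribution with the transition matrix, without the z-transform machinery. A minimal sketch (my own toy example, not from the paper) for the mean local time, i.e. the expected number of visits:

```python
def average_local_time(P, start, target, n):
    # Expected number of visits to `target` within n steps of a
    # discrete-time Markov chain with transition matrix P, started at
    # `start`. Computed by summing the occupation probability of
    # `target` at each time step (the time-0 state counts as a visit).
    dist = [0.0] * len(P)
    dist[start] = 1.0
    visits = dist[target]
    for _ in range(n):
        dist = [sum(dist[i] * P[i][j] for i in range(len(P)))
                for j in range(len(P))]
        visits += dist[target]
    return visits

# Two-state chain: from either state, move to each state with prob 1/2.
P = [[0.5, 0.5],
     [0.5, 0.5]]
print(average_local_time(P, start=0, target=1, n=10))  # 5.0
```

Each of the 10 post-initial steps puts probability 1/2 on the target, so the expected local time is 5; the z-transform method packages exactly these coefficients into one function of z.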

Imaging RNA polymerase III transcription using a photostable RNA-fluorophore complex.

Quantitative measurement of transcription rates in live cells is important for revealing mechanisms of transcriptional regulation. RNA Pol III is particularly challenging, as this RNAP transcribes non-coding RNA molecules, so protein reporters cannot be used. To address this issue, the group developed Corn, an RNA fluorescent aptamer that binds a fluorophore resembling the one found in red fluorescent protein. With this new tool, the authors were able to study and image Corn-tagged Pol III transcript levels.

Dissipation bounds the amplification of transition rates far from equilibrium

Kuznets-Speck and Limmer seek to demonstrate an idea that has long been gnawing at people working on the physics of computation. A system with two metastable states is capable of acting as a bit. The lifetime of those metastable states determines how long the bit can reliably store information. Another related timescale is the time it takes to switch the bit, when such a switch is required. There is a general feeling that if you want both a long reliability time and a short switching time, this should be costly (in terms of the energy you have to put in). However, such a tradeoff has not been found, in general, using the tools of modern stochastic thermodynamics. The title of this manuscript suggests that the authors have been able to identify a hard tradeoff; however, the tradeoff only appears when certain conditions are met. The authors argue that these conditions are quite general, but it is still unclear whether there is a fundamental limit on designing a reliable bit that can be switched quickly at low thermodynamic cost.