Abstract

Automation is becoming ubiquitous in all laboratory activities, moving towards precisely defined and codified laboratory protocols. However, the integration between laboratory protocols and mathematical models is still lacking. Models describe physical processes, while protocols define the steps carried out during an experiment: neither covers the domain of the other, although they both attempt to characterize the same phenomena. We should ideally start from an integrated description of both the model and the steps carried out to test it, to concurrently analyze uncertainties in model parameters, equipment tolerances, and data collection. To this end, we present a language to model and optimize experimental biochemical protocols that facilitates such an integrated description, and that can be combined with experimental data. We provide probabilistic semantics for our language in terms of Gaussian processes (GPs) based on the linear noise approximation (LNA) that formally characterizes the uncertainties in the data collection, the underlying model, and the protocol operations. In a set of case studies, we illustrate how the resulting framework allows for automated analysis and optimization of experimental protocols, including Gibson assembly protocols.
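To illustrate how the LNA turns a kinetic model into Gaussian uncertainty estimates, here is a minimal sketch for a simple birth-death process. The process and its rate constants are illustrative assumptions, not taken from the paper; the LNA propagates a macroscopic mean alongside an ODE for the variance of fluctuations around it.

```python
# Sketch: linear noise approximation (LNA) for a birth-death process
# (production at rate k, degradation at rate g*X; hypothetical
# parameters, not the paper's model). The LNA tracks the macroscopic
# mean phi(t) and the variance sigma2(t) of fluctuations around it:
#   d phi / dt    = k - g * phi
#   d sigma2 / dt = -2 * g * sigma2 + k + g * phi
k, g = 10.0, 1.0          # hypothetical rate constants
phi, sigma2 = 0.0, 0.0    # initial mean and variance
dt = 1e-3                 # Euler step size
for _ in range(20000):    # integrate to t = 20, near steady state
    dphi = k - g * phi
    dsig = -2.0 * g * sigma2 + k + g * phi
    phi += dt * dphi
    sigma2 += dt * dsig
# At steady state the LNA gives phi = k/g and sigma2 = k/g,
# recovering the Poisson statistics of a birth-death process.
print(phi, sigma2)
```

At each time point the state is thus a Gaussian with mean `phi` and variance `sigma2`, which is what allows the framework to compose kinetic uncertainty with uncertainty from liquid-handling steps and measurements.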

Highlights

  • Automation is becoming ubiquitous in all laboratory activities: protocols are run under reproducible and auditable software control, data are collected by high-throughput machinery, experiments are automatically analyzed, and further experiments are selected to maximize knowledge acquisition

  • We consider a model of Gibson assembly described by Michaelis–Menten kinetics, in which two DNA double strands are joined into a single assembled product

  • We have introduced a probabilistic framework that rigorously describes the joint handling of liquid manipulation steps and chemical kinetics, throughout the execution of an experimental protocol, with particular attention to the propagation of uncertainty and the optimization of the protocol parameters
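The Michaelis–Menten kinetics mentioned in the highlights can be sketched as follows. This is a generic single-substrate Michaelis–Menten step with hypothetical constants, not the paper's actual Gibson assembly model or parameters.

```python
# Sketch: Michaelis-Menten kinetics for an enzymatic conversion step
# (illustrative rates; not the paper's Gibson assembly parameters).
#   dS/dt = -Vmax * S / (Km + S),   dP/dt = +Vmax * S / (Km + S)
Vmax, Km = 1.0, 0.5       # hypothetical kinetic constants
S, P = 2.0, 0.0           # initial substrate and product (a.u.)
dt = 1e-3                 # Euler step size
for _ in range(20000):    # integrate to t = 20
    v = Vmax * S / (Km + S)
    S -= dt * v
    P += dt * v
# Mass is conserved (S + P stays at the initial total) and the
# substrate is essentially exhausted by the end of the run.
print(S, P)
```

In the paper's setting, such kinetic steps are interleaved with protocol operations (mixing, dilution, incubation), so the state after each step carries both the model's and the protocol's uncertainty.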


Introduction

Automation is becoming ubiquitous in all laboratory activities: protocols are run under reproducible and auditable software control, data are collected by high-throughput machinery, experiments are automatically analyzed, and further experiments are selected to maximize knowledge acquisition. It is often hard to attribute the causes of experimental failures: did an experiment fail because of a misconception in the model, or because of a misstep in the protocol? To confront this problem, we need an approach that integrates and accounts for all the components, theoretical and practical, of a laboratory experiment. We should ideally start from an integrated description from which we can extract both the model of a phenomenon, for possibly automated mathematical analysis, and the steps carried out to test it, for automated execution by lab equipment. This is essential to enable automated model synthesis and falsification by concurrently taking into account uncertainties in model parameters, equipment tolerances, and data collection.

