A cloud laboratory is a heavily automated, centralized research laboratory where scientists can run an experiment from a computer in a remote location. [1] [2] [3] Cloud laboratories offer the execution of life science research experiments under a cloud computing service model, allowing researchers to retain full control over experimental design. [4] [5] Users create experimental protocols through a high-level API, and the experiments are then executed in the cloud laboratory without further involvement from the user. [1] [5]
Cloud labs reduce variability in experimental execution, as the code can be interrogated, analyzed, and executed repeatedly. [2] They democratize access to expensive laboratory equipment while standardizing experimental execution, which could help address the replication crisis. [4] [6] [7] For example, what might previously have been described in a paper as "mix the samples" is replaced by instructions for a specified machine to mix at a specified rpm for a specified time, with relevant factors such as the ambient temperature logged. [8] They also reduce costs by spreading capital expenses across many users, running experiments in parallel, and reducing instrument downtime. [7] Finally, they facilitate collaboration by making it easier to share protocols, data, and data processing methods through the cloud. [6]
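As a minimal illustration of how such explicit instructions might be expressed in code, the sketch below encodes a mixing step with a named instrument, speed, and duration. The class and field names are hypothetical and do not correspond to any particular cloud lab's API.

```python
# Hypothetical sketch: an ambiguous instruction like "mix the samples"
# expressed as an explicit, machine-readable protocol step. Names are
# illustrative only, not any real cloud-lab API.
from dataclasses import dataclass, field

@dataclass
class MixStep:
    instrument: str                 # the specific machine performing the step
    sample_id: str                  # which sample/container to operate on
    speed_rpm: int                  # exact mixing speed instead of "mix well"
    duration_s: int                 # exact mixing time in seconds
    log_ambient_temp: bool = True   # record ambient temperature alongside the data

@dataclass
class Protocol:
    name: str
    steps: list = field(default_factory=list)

protocol = Protocol(name="sample-prep-v1")
protocol.steps.append(
    MixStep(instrument="vortexer-03", sample_id="plate-A1",
            speed_rpm=1500, duration_s=30)
)
print(protocol)
```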
Cloud labs utilize common scientific techniques including DNA sequencing and genotyping, high-performance liquid chromatography (HPLC), protein extraction, plate reading, upstream bioprocessing, and western blotting. [3] [9] [10] [11] Users begin by signing up and logging in to the web-based software interface. [5] Researchers submit their protocols via a dedicated web application or through an API, and when the order arrives at the laboratory, human operators set up the experiment and transfer plates from machine to machine. Data is automatically uploaded to the cloud via an API, where users can access and analyze it and review the controls, machine settings, and reagents used. [10] Multiple experiments can be run in parallel, 24 hours a day. [9] [12] [13]
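The submit-and-retrieve workflow described above could, for example, look like the following sketch, which assumes a hypothetical REST API; the base URL, endpoints, payload fields, and token are placeholders rather than any real cloud-lab service.

```python
# Hedged sketch of submitting a protocol and retrieving results,
# assuming a hypothetical REST API (placeholder URLs and fields).
import time
import requests

API = "https://cloudlab.example.com/api/v1"      # placeholder base URL
HEADERS = {"Authorization": "Bearer <token>"}    # placeholder credential

# Submit a protocol definition for execution in the remote facility.
run = requests.post(f"{API}/runs",
                    json={"protocol": "sample-prep-v1", "plates": ["plate-A1"]},
                    headers=HEADERS).json()

# Poll until the lab's operators and instruments finish the run,
# then download the automatically uploaded results for local analysis.
while True:
    status = requests.get(f"{API}/runs/{run['id']}", headers=HEADERS).json()
    if status["state"] in ("completed", "failed"):
        break
    time.sleep(60)

if status["state"] == "completed":
    data = requests.get(f"{API}/runs/{run['id']}/data", headers=HEADERS).json()
    print(data["controls"], data["machine_settings"], data["reagents"])
```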
A true cloud lab is defined by five criteria: [14] [15]
High-throughput experimentation (HTE) increases throughput by scaling up the number of experiments that can be run in parallel using a common sample form factor and technique. [16] [17] When space or materials are limited, minor factors must be assigned to progressively smaller fractions to increase the number of replicates. [18] Cloud labs, on the other hand, do not fundamentally scale up a single experiment but rather increase the number of types of experiments that can be run in parallel. [19] For example, with a cloud lab, a scientist could simultaneously attempt dozens of different purification methods, each of which uses a completely different set of equipment. [15]
HTE work cells can sometimes be accessed remotely to trigger a run on a library or to monitor a run digitally. However, this remote monitoring or triggering does not reduce the development work that must take place in advance of a run. [16] With HTE, scientists must often group samples into libraries that use the same or very similar container form factors so that the work cell can more easily route and address each sample in an integrated manner. [16] Scientists therefore need to standardize sample form factors and handle sample preparation outside the work cell. Cloud labs can work with samples in hundreds or even thousands of unique containers, providing additional flexibility relative to traditional labs (even those using HTE) and allowing a larger number of samples to be processed. [15]
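The sketch below illustrates this contrast: many distinct experiment types submitted side by side rather than one assay scaled across identical plates. The `submit_run` helper is a hypothetical stand-in for a cloud lab's submission call, not a real API.

```python
# Rough sketch: dispatching many *different* experiment types in parallel.
# `submit_run` is a placeholder for the hypothetical submission call
# sketched earlier, not any real cloud-lab API.
from concurrent.futures import ThreadPoolExecutor

def submit_run(protocol_name: str) -> str:
    # Placeholder: in practice this would POST the protocol to the cloud lab
    # and return a run identifier.
    return f"run-id-for-{protocol_name}"

# Dozens of distinct purification methods, each handled by different
# equipment in the facility, can be submitted side by side.
purification_protocols = [
    "ion-exchange-chromatography", "size-exclusion-chromatography",
    "affinity-chromatography", "ammonium-sulfate-precipitation",
]

with ThreadPoolExecutor() as pool:
    run_ids = list(pool.map(submit_run, purification_protocols))
print(run_ids)
```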
Cloud labs are intended to replace traditional, hands-on lab work by offering scientists the capability to conduct the same type of work they would typically perform in a traditional lab, unrestricted by time and laboratory space. [20] [21]
Cloud laboratories were built on advances made in laboratory automation in the 1990s. In the early 1990s, the modularity project of the Consortium of Automated Analytical Laboratory Systems worked to define standards by which biotechnology manufacturers could produce products that integrate into automated systems. [22] In 1996, the National Committee for Clinical Laboratory Standards (now the Clinical and Laboratory Standards Institute) proposed laboratory automation standards aimed at enabling consumers of laboratory technology to purchase hardware and software from different vendors and connect them seamlessly. [23] The committee launched five subcommittees in 1997 and released standardization protocols to guide product development through the early 2000s. [24] [25]
These developments in interoperability led to early examples of lab automation using cloud infrastructure, such as the Robot Scientist "Adam" in 2009, which encapsulated and connected all the laboratory equipment necessary to perform microbial batch experiments. [26]
In 2010, D. J. Kleinbaum and Brian Frezza founded antiviral developer Emerald Therapeutics. To simplify laboratory testing, the group wrote centralized management software for their collection of scientific instruments and a database to store all metadata and results. [27] [3]
In 2012, Transcriptic was founded as a robotic cloud laboratory for on-demand scientific research, remotely performing select tasks such as DNA cloning. [28]
In 2014, Emerald Therapeutics spun out the Emerald Cloud Lab to fully replace the need for a traditional lab environment, enabling scientists from around the world to perform all necessary activities, from experimental design to data acquisition and analysis. [29]
Carnegie Mellon University's Mellon College of Science is building the world's first academic cloud laboratory on its campus. [30] The 20,000-square-foot laboratory is scheduled to be completed in 2023 and will offer access to CMU researchers and eventually to other schools and life-sciences startups in Pittsburgh. [31] [3]
Easy access to sophisticated labs poses a potential biosecurity or bioterrorism threat. Filippa Lentzos, an expert in biological risk and biosecurity, said "there are some pretty crazy people out there ... Barriers are coming down if you want to deliberately do something harmful". Cloud labs say that they review all scheduled experiments and can flag or reject any that appear illegal or dangerous, and that detailed record-keeping makes monitoring what is done easier than in a traditional laboratory. [8]
A laboratory is a facility that provides controlled conditions in which scientific or technological research, experiments, and measurement may be performed. Laboratories are found in a variety of settings such as schools, universities, privately owned research institutions, corporate research and testing facilities, government regulatory and forensic investigation centers, physicians' offices, clinics, hospitals, regional and national referral centers, and even occasionally personal residences.
Laboratory robotics is the act of using robots in biology, chemistry or engineering labs. For example, pharmaceutical companies employ robots to move biological or chemical samples around to synthesize novel chemical entities or to test the pharmaceutical value of existing chemical matter. Advanced laboratory robotics can be used to completely automate the process of science, as in the Robot Scientist project.
High-throughput screening (HTS) is a method for scientific discovery especially used in drug discovery and relevant to the fields of biology, materials science and chemistry. Using robotics, data processing/control software, liquid handling devices, and sensitive detectors, high-throughput screening allows a researcher to quickly conduct millions of chemical, genetic, or pharmacological tests. Through this process one can quickly recognize active compounds, antibodies, or genes that modulate a particular biomolecular pathway. The results of these experiments provide starting points for drug design and for understanding the interaction or role of a particular biochemical process.
A liquid handling robot is used to automate workflows in life science laboratories. It is a robot that dispenses a selected quantity of reagent, samples or other liquid to a designated container.
Laboratory automation is a multi-disciplinary strategy to research, develop, optimize and capitalize on technologies in the laboratory that enable new and improved processes. Laboratory automation professionals are academic, commercial and government researchers, scientists and engineers who conduct research and develop new technologies to increase productivity, elevate experimental data quality, reduce lab process cycle times, or enable experimentation that otherwise would be impossible.
Biolab is a single-rack multi-user science payload designed for use in the Columbus laboratory of the International Space Station. Biolab supports biological research on small plants, small invertebrates, microorganisms, animal cells, and tissue cultures. It includes an incubator equipped with centrifuges in which the preceding experimental subjects can be subjected to controlled levels of acceleration.
A medical laboratory or clinical laboratory is a laboratory where tests are carried out on clinical specimens to obtain information about the health of a patient to aid in diagnosis, treatment, and prevention of disease. Clinical medical laboratories are an example of applied science, as opposed to research laboratories that focus on basic science, such as those found in some academic institutions.
Open-notebook science is the practice of making the entire primary record of a research project publicly available online as it is recorded. This involves placing the personal, or laboratory, notebook of the researcher online along with all raw and processed data, and any associated material, as this material is generated. The approach may be summed up by the slogan 'no insider information'. It is the logical extreme of transparent approaches to research and explicitly includes the making available of failed, less significant, and otherwise unpublished experiments; so called 'dark data'. The practice of open notebook science, although not the norm in the academic community, has gained significant recent attention in the research and general media as part of a general trend towards more open approaches in research practice and publishing. Open notebook science can therefore be described as part of a wider open science movement that includes the advocacy and adoption of open access publication, open data, crowdsourcing data, and citizen science. It is inspired in part by the success of open-source software and draws on many of its ideas.
Robotics is an interdisciplinary field that involves the design, construction, operation, and use of robots.
Robot Scientist is a laboratory robot created and developed by a group of scientists including Ross King, Kenneth Whelan, Ffion Jones, Philip Reiser, Christopher Bryant, Stephen Muggleton, Douglas Kell, Emma Byrne and Steve Oliver.
LabKey Server is a software suite available for scientists to integrate, analyze, and share biomedical research data. The platform provides a secure data repository that allows web-based querying, reporting, and collaborating across a range of data sources. Specific scientific applications and workflows can be added on top of the basic platform and leverage a data processing pipeline.
Eterna is a browser-based "game with a purpose", developed by scientists at Carnegie Mellon University and Stanford University, that engages users to solve puzzles related to the folding of RNA molecules. The project is supported by the Bill and Melinda Gates Foundation, Stanford University, and the National Institutes of Health. Prior funders include the National Science Foundation.
Ross Donald King is a Professor of Machine Intelligence at Chalmers University of Technology.
James J. Kuffner Jr. is an American roboticist and chief executive officer (CEO) of Woven by Toyota. Dr. Kuffner is also Chief Digital Officer and a member of the Board of Directors of Toyota Motor Corporation. Kuffner continues to serve as an Adjunct Associate Professor at the Robotics Institute at Carnegie Mellon University and as Executive Advisor to Woven by Toyota. Kuffner earned a Ph.D. from the Stanford University Dept. of Computer Science Robotics Laboratory in 1999.
Cloud robotics is a field of robotics that attempts to invoke cloud technologies such as cloud computing, cloud storage, and other Internet technologies centered on the benefits of converged infrastructure and shared services for robotics. When connected to the cloud, robots can benefit from the powerful computation, storage, and communication resources of modern data centers in the cloud, which can process and share information from various robots or agents. Humans can also delegate tasks to robots remotely through networks. Cloud computing technologies enable robot systems to be endowed with powerful capabilities while reducing costs through cloud technologies. Thus, it is possible to build lightweight, low-cost, smarter robots with an intelligent "brain" in the cloud. The "brain" consists of data centers, knowledge bases, task planners, deep learning, information processing, environment models, communication support, etc.
Automated synthesis or automatic synthesis is a set of techniques that use robotic equipment to perform chemical synthesis in an automated way. Automating processes allows for higher efficiency and product quality, although automation technology can be cost-prohibitive and there are concerns regarding overdependence and job displacement. Chemical processes were automated throughout the 19th and 20th centuries, with major developments happening over the past thirty years as technology advanced. Tasks that are performed may include: synthesis under a variety of different conditions, sample preparation, purification, and extraction. Applications of automated synthesis are found at research and industrial scales in a wide variety of fields including polymers, personal care, and radiosynthesis.
Anne E. Carpenter is an American scientist in the field of image analysis for cell biology and artificial intelligence for drug discovery. She is the co-creator of CellProfiler, open-source software for high-throughput biological image analysis, and a co-inventor of the Cell Painting assay, a method for image-based profiling. She is an Institute Scientist and Senior Director of the Imaging Platform at the Broad Institute.
Vandana "Vandi" Verma is a space roboticist and chief engineer at NASA's Jet Propulsion Laboratory, known for driving the Mars rovers, notably Curiosity and Perseverance, using software including PLEXIL programming technology that she co-wrote and developed.
Emerald Cloud Lab (ECL) is a privately owned biotech startup. The company focuses on advancing laboratory virtualization for chemistry and biotechnology by building the first fully functional cloud lab, allowing scientists to conduct all of their wet lab research without being in a physical laboratory.