While exploring ways of making nanotechnology affordable and easier to use, we occasionally come across a particularly innovative company. That's the case with the Emerald Cloud Lab (ECL). I got a chance to talk with ECL's Head of Business Development and Strategy, Toby Blackburn.
Toby, tell us a little about you and your career.
I got my undergraduate degree in Chemical Engineering at North Carolina State University and started my career at Biogen in cell culture, working on scaling processes from bench scale to manufacturing scale. I spent a lot of time in the lab during those years, but also a lot of time thinking about how to streamline and automate operations, and began to notice the impact that path dependency and corporate structures had on science. I eventually went back to school and received my MBA from Duke, moving into an Analytical Development function where I managed a large CRO budget and a team of analytical scientists. These transitions gave me the tools and opportunity to start implementing some of the ideas about how the execution of lab work could be better. After implementing significant improvements in lab execution, it was clear that there was an opportunity to further improve, but this would require a complete reimagining of the lab itself.
What drew you to ECL?
When I first visited our facility in South San Francisco, it was immediately clear that DJ Kleinbaum and Brian Frezza, ECL’s co-founders, had built something that addressed a lot of the laboratory execution problems I was seeing in industry. They had clearly rethought this lab from the ground up, considering all of the tools and technology available today, and came to the conclusion that a lab could operate completely remotely, and that there were inherent benefits in doing so. I jumped into the details, analyzing the capabilities of the ECL while considering the needs of a large biotech, and could not find any critical flaw in how the ECL could meaningfully improve enterprise level research and development.
On its website, ECL states that its mission is to “empower scientists to transcend the laboratory.” What does that mean in practical terms? Do you imagine a future where most lab work is done remotely?
I see a significant gap between people’s perception of what a scientist does and the reality. The common perception is that scientists spend their time thinking and designing experiments that result in very clear data. As most people who have spent time in the lab know, this simply isn’t the case. You spend a lot of time on the logistics of your experiments — ordering your materials and reagents, mixing stock solutions, scheduling time on an instrument, troubleshooting, finding samples, labeling tubes, servicing the instruments, sourcing new instrumentation, etc. — before you even get to your data. The wonderful byproduct of working in the ECL is that there is a clear separation between experimental analysis and design on one hand and experimental execution on the other, returning all the time spent on lab logistics to scientists and enabling them to manage multiple experiments and projects simultaneously.
Suppose I want to run an experiment using ECL. From experimental design through data acquisition and analysis, walk us through a typical engagement. How much is automated and how much requires human hands?
You start with training, a guided tutorial conducted by our Scientific Developers, who are PhD-level scientists whom we have taught how to program. The training sessions, which take a couple of days to complete, give you the basic skills needed to access and operate any of the 150+ instruments in the ECL.
From here, you’re ready to design your first experiment. Our experiment builder tool makes this easier, letting you point and click through sample selection and set parameters as if you were standing in front of the instrument. This bridges the gap between your scientific expertise and Symbolic Lab Language (SLL), the programmable, scriptable language that directly drives the execution of experiments. Once your experiment is set up, you press go and the ECL takes over the execution.
Our entire lab is managed by software called ECL Engine. Engine is best thought of as a lab traffic controller, managing all of the resources required to run your experiment and their physical flow through the lab. As far as automation, the ECL can handle readily automatable experiments, like using liquid handlers for a spectroscopy assay, as well as tasks that are traditionally difficult to automate, like pipetting or moving 2 L bottles around the lab, which are handled by human operators. We implement automation where it makes sense, and, in cases where there are redundant automated and manual functions, the scientist has complete control.
Once your experiment is completed, the data and metadata are collected in ECL Constellation. Constellation is a network of linked database objects that structures the data you generate into a highly organized knowledge graph, growing automatically over time as you run experiments. You can answer any questions you have about your experiments in seconds by surfing through your knowledge graph with a few clicks or keystrokes, or conduct searches across the full history of experiments run on the system — including your own data, any data shared within your organization, and any data published on the system. All of this data lives in the cloud, so you’ll never worry about sifting through loose files, emails, or thumb drives — your data is accessible from any computer with a secure login.
ECL Command Center provides over 4,500 powerful functions for data visualization, analysis, and simulation. The software also allows your experiments, data, analysis, results, and even scientific figures to be exported, shared, or published on the web. These tools can be accessed through a point-and-click interface or the commands can be directly entered into your lab notebook. This makes it easy to repeat or scale any analysis with a single command and to automate report generation through higher-level scripting.
What types of experiments and equipment is ECL currently equipped to run? What are you planning to add in the near future?
Our list of capabilities is rapidly growing as we bring on new instruments, so the most up-to-date list can be found here.
What are the benefits and drawbacks of using ECL as opposed to owning your own lab equipment?
From a purely lab execution standpoint, ignoring the enhanced value of the data network generated by ECL, some of the main benefits are flexibility, uptime, and integrated equipment maintenance. The diversity of equipment can provide scientists with options to work around an unplanned limitation of a particular technique. Our lab runs 24/7/365, so your experiments continue moving forward even when you’re not working. Lastly, our instruments run routine controls to ensure everything is working correctly, and equipment maintenance and troubleshooting are entirely handled by our team of experts, ensuring confidence in the output of your data without having to manage any of the logistics.
Because of COVID, much of the business world was forced to go remote, but that’s obviously difficult for most lab scientists. Have you seen an uptick in ECL usage since the pandemic began?
As a result of COVID, companies are certainly re-evaluating their assumptions about how to manage business continuity for lab-based employees, which is an obvious value proposition of the ECL, and this re-evaluation is opening the door for people to see the additional value that the ECL can provide. For example, as we talk with enterprise customers, they are also interested in the ability of the ECL to be a central source of truth for their scientists who are spread across corporate locations, time zones, and countries. Startups are asking themselves whether they really need to build a lab, or even have offices, to get their companies going.
ECL is primarily focused on research and industry. Will there come a point when high schools and middle schools will be able to run labs remotely on equipment they can’t afford today?
I think that as ECL grows, and is able to take advantage of economies of scale, we will start to see opportunities to provide access to world class scientific instrumentation to anyone.
We have recently completed a pilot class at a large university, where students learned to operate Command Center and ran all of their experiments in the ECL, so that vision may not be far off.
For the most part, humans are still doing the experimental design. Given how fast machine learning is progressing, is it realistic to expect that within the next 20-ish years, we could have something like robotic process automation (RPA) running a lab experiment from start to finish?
We have people working on the system today who are doing just that. Beyond fully running an experiment remotely, Command Center also contains the tools to automate your data analysis, script multiple experiments together, and program decision nodes about which experiment to run next, based on prior results. A straightforward example is using the ECL to script column screening for novel compounds — you might not know the right column for the starting compound, but you could programmatically step through a sequence of separations to identify which column chemistry and conditions work best.
This ability allows scientists to move up a layer of abstraction and manage entire workflows rather than managing each individual experiment. I expect that as the knowledge graph continues to grow, scientists will increasingly find ways to impart their decision making models into their experimental protocols, and continue to focus on the higher level work to direct where the science needs to go.
Last question: If you had to hire a nanobot to do a job for you, what job would you hire it to do?
Without a doubt, toothbrushing.
Toby, thank you so much for taking the time to talk with us today! If you'd like to learn more about the ECL, go to their website at www.emeraldcloudlab.com.