Lab connectivity still isn't plug-and-play. This session addresses the real challenges automation specialists face every day and the practical solutions that work.
- Instrument Idiosyncrasies & Integration Roadblocks – Understand why physical-to-digital integration remains complex, from inconsistent protocols to metadata challenges, and learn how integrators can solve these problems.
- Automation Principles for Manual Labs – Explore how modularity, standardization, and reproducibility don't just improve efficiency—they unlock entirely new experimental possibilities.
- Creating AI-Ready Lab Infrastructure – Learn how orchestrating hybrid experiments and capturing rich metadata builds the reliable foundation AI-enabled discovery requires.
Join us for an exclusive on-demand webinar featuring:
- Russell Green, Principal Product Marketing Manager, HighRes
- Crystal McKinnon, Principal Product Manager, Cellario OS, HighRes
Moderated by Gina Anzalone
Laboratory automation has come a long way. Liquid handlers, integrated robotic systems, and complex analytical platforms have transformed how science gets done. Yet despite all this hardware, many labs still struggle with a fundamental problem: getting everything to work together. That's where orchestration comes in — and why it may be the most important concept in lab automation today.
What Is Orchestration, Really?
Orchestration is one of those terms that means different things to different people. At its core, it's the seamless coordination of instruments, data, people, and processes into a unified system, enabling whole experimental workflows to run efficiently, reliably, and traceably.
A simpler way to put it: orchestration is about bringing order to complexity. Like a conductor drawing a full orchestra into harmony, a good orchestration layer takes all of your disparate lab components and makes them perform as one.
Orchestration didn't appear overnight. It grew from three distinct needs:
- Managing large automated facilities — where multiple work cells needed a "super scheduler" sitting above individual device schedulers to coordinate activity across an entire floor.
- Running long-term complex experiments — particularly in cell culture and iPSC development, where small bursts of work alternate with long wait times, creating traceability and parallelization nightmares.
- Connecting the lab to the digital ecosystem — providing a single integration layer and API to shuttle data between instruments, ELNs, LIMS, and other informatics platforms (a hub pattern sketched in the code below).
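To make that third need concrete, here is a minimal Python sketch of the hub pattern. The class and method names are illustrative assumptions, not HighRes's actual API; the point is that instruments publish results once, and any number of informatics systems subscribe without the instruments ever knowing they exist.

```python
# A minimal sketch of a "single integration layer": one hub that speaks
# to device drivers on one side and informatics systems (ELN/LIMS) on
# the other. All names here are hypothetical, not a real vendor API.

from dataclasses import dataclass
from typing import Callable


@dataclass
class Result:
    sample_id: str
    instrument: str
    payload: dict


class OrchestratorHub:
    """Routes instrument results to every subscribed informatics system."""

    def __init__(self) -> None:
        self._subscribers: list[Callable[[Result], None]] = []

    def subscribe(self, handler: Callable[[Result], None]) -> None:
        # An ELN or LIMS connector registers a callback once;
        # instruments never need to know it exists.
        self._subscribers.append(handler)

    def publish(self, result: Result) -> None:
        for handler in self._subscribers:
            handler(result)


hub = OrchestratorHub()
hub.subscribe(lambda r: print(f"LIMS logged {r.sample_id} from {r.instrument}"))
hub.publish(Result("PLATE-0042", "dispenser", {"volume_ul": 50}))
```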
Why Automation Vendors Are Best Placed to Build Orchestration Tools
It might seem like orchestration is a software or IT problem. But the argument made in this session is compelling: automation vendors are uniquely positioned to solve it, precisely because they already live at the intersection of the physical lab and the digital world.
Decades of experience integrating instruments — from simple shakers to complex imaging systems — gives automation vendors something no pure software company can replicate: a deep understanding of how labs actually work. HighRes alone has a library of over 500 device drivers, plus tools that allow customers and automation scientists to build their own.
This foundation matters because orchestration only works if it's built on strong physical-to-digital connectors. Trying to build the orchestration layer first and figure out device connectivity later is, in the words of the session's presenter, a path to failure.
The Two Biggest Challenges
1. Physical-to-Digital Connectivity
Getting physical instruments to communicate with a digital orchestration layer is genuinely hard, for several reasons:
- No universal standards. The industry has tried, so far with little success, to establish shared integration standards; most vendors end up defining their own.
- Huge device diversity. A shaker and a high-content imaging system require fundamentally different levels of integration.
- Integration readiness varies wildly. Many instruments were never designed to be integrated, making creative problem-solving a constant necessity.
- Siloed knowledge. Written SOPs rarely capture what lab scientists actually do. The gap between what's documented and what's practiced becomes a blocker when trying to automate a workflow.
2. Changing How We Think About Experiments
Even if the technology is ready, labs won't get the most out of orchestration without changing their experimental mindset. This is where the concept of an automation-first approach becomes critical — and it's as much about culture and change management as it is about tools.
An automation-first approach means:
- Designing experiments with automation in mind from the start — asking "how would a robot run this?" before locking in a protocol.
- Standardizing and modularizing workflows — breaking protocols into discrete, reusable steps. A cell-based assay, for example, naturally divides into a cell preparation module and an assay execution module. Keeping these distinct makes automation more robust and protocols easier to adapt (see the sketch after this list).
- Treating data as a first-class product — not an afterthought. Since the whole point of automation is to generate better data faster, the data flow should be designed upfront, including metadata capture and contextual annotation.
- Planning for scale and reliability — sometimes that means automating the most critical subset of a workflow in duplicate rather than building one monolithic automated system. Redundancy and resilience beat complexity.
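As a concrete illustration of the modularization and data-as-product points above, here is a short Python sketch. The step names, parameters, and `WorkflowStep` type are hypothetical, not any vendor's schema; what matters is that the two modules stay separate and that metadata falls out of execution automatically rather than being reconstructed afterwards.

```python
# A hedged sketch of an "automation-first" protocol: the cell-based
# assay from the list above, split into reusable modules, with each
# step emitting metadata as it runs. All names are illustrative.

from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass(frozen=True)
class WorkflowStep:
    module: str   # e.g. "cell_prep" or "assay_exec"
    action: str   # a discrete, reusable unit of work
    params: dict  # everything a robot (or a human) needs to run it


CELL_PREP = [
    WorkflowStep("cell_prep", "thaw_cells", {"target_temp_c": 37}),
    WorkflowStep("cell_prep", "seed_plate", {"cells_per_well": 5000}),
]

ASSAY_EXEC = [
    WorkflowStep("assay_exec", "add_compound", {"volume_ul": 10}),
    WorkflowStep("assay_exec", "read_plate", {"mode": "luminescence"}),
]


def run(steps: list[WorkflowStep]) -> list[dict]:
    records = []
    for step in steps:
        # Metadata is captured as a first-class product of execution,
        # not an afterthought pulled from notebooks later.
        records.append({
            "module": step.module,
            "action": step.action,
            "params": step.params,
            "completed_at": datetime.now(timezone.utc).isoformat(),
        })
    return records


# Keeping the modules distinct means the same cell_prep steps can feed
# a different assay tomorrow without touching the assay module.
audit_trail = run(CELL_PREP) + run(ASSAY_EXEC)
```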
Where Orchestration Makes a Real Difference: The DMTA Cycle
For those working in drug discovery, the Design-Make-Test-Analyze (DMTA) cycle is a perfect case study in why orchestration matters.
Even in labs that already have automation for individual steps, the handoffs between those steps are often manual: emails, spreadsheets, file transfers, informal coordination across multiple platforms. The result? Error-prone sample tracking, parallelization that is effectively impossible, and severely limited throughput.
With a true orchestration layer, that picture changes dramatically. Handoffs become guided and verified. Sample tracking becomes automatic. Data flows continuously and is automatically enriched with metadata. And perhaps most importantly, parallelization becomes possible, which is where the real throughput gains come from.
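A toy Python sketch of the parallelization point, assuming independent compound batches and stand-in step names; a real orchestrator uses a proper scheduler rather than a thread pool, but the shape of the gain is the same.

```python
# Toy illustration of why machine-driven handoffs unlock parallel DMTA
# cycles: once nothing waits on a human to forward a spreadsheet,
# independent batches can move through Test -> Analyze concurrently.

from concurrent.futures import ThreadPoolExecutor


def test_and_analyze(batch: str) -> str:
    # Stand-in for an orchestrated Test -> Analyze handoff.
    return f"{batch}: analyzed"


batches = ["BATCH-A", "BATCH-B", "BATCH-C"]

with ThreadPoolExecutor() as pool:
    # With manual handoffs these would queue behind a single coordinator;
    # an orchestrator can dispatch each as soon as its samples are ready.
    for outcome in pool.map(test_and_analyze, batches):
        print(outcome)
```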
The same principle applies to compound management, genomics pipelines, and any environment where fleets of instruments need to be dynamically scheduled and their outputs traced end-to-end.
What This Looks Like in Practice
A simple but illustrative example: a semi-automated workflow in which a LIMS places an order and a user scans barcodes on the sample plates. A dispenser then runs automatically under orchestrator control; the user carries the plates to a manually operated centrifuge, following step-by-step setup instructions from the orchestrator; and the completed run is automatically logged back to the LIMS.
This hybrid workflow — part automated, part manual, fully guided and traced — is exactly the kind of environment orchestration is designed for. It's not just about replacing humans with robots. It's about making the whole process smarter, whether hands are involved or not.
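Here is a hedged sketch of how such a hybrid workflow might be expressed in Python, with invented step names and prompts rather than any real Cellario schema. The key idea is that manual and automated steps share one definition, so both land in the same trace.

```python
# A sketch of a hybrid workflow: the orchestrator either drives a device
# or guides a person, and every step is traced the same way. The step
# types and instructions here are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Step:
    name: str
    automated: bool
    instruction: str | None = None  # shown to the operator for manual steps


HYBRID_WORKFLOW = [
    Step("receive_lims_order", automated=True),
    Step("scan_plate_barcodes", automated=False,
         instruction="Scan the barcode on each sample plate at the intake station."),
    Step("run_dispenser", automated=True),
    Step("centrifuge_plates", automated=False,
         instruction="Load and balance the plates, then spin per protocol."),
    Step("log_run_to_lims", automated=True),
]


def execute(workflow: list[Step]) -> None:
    for step in workflow:
        if step.automated:
            print(f"[orchestrator] running {step.name}")
        else:
            # Manual steps are guided and confirmed, so they end up in
            # the same audit trail as the automated ones.
            print(f"[operator] {step.instruction}")
            input("Press Enter to confirm the step is complete...")


execute(HYBRID_WORKFLOW)
```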
Integration With ELNs, LIMS, and Multi-Site Environments
One of the most common questions in labs considering orchestration is: How hard is it to connect to our existing informatics stack?
The answer, increasingly, is: not very. Modern orchestration platforms are designed to be the central hub for data flowing to and from ELNs, LIMS, and analytics platforms. Partnerships with vendors like Benchling, for example, mean scientists can kick off orders directly from their ELN, have them routed through the orchestrator to the appropriate instruments, and get results automatically parsed back into their data analysis pipeline — with dramatically fewer manual clicks.
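For a sense of the shape of that round trip, here is a minimal Python sketch using only the standard library. The host, endpoint paths, and payload fields are placeholders, not Benchling's or HighRes's actual APIs; the point is the loop itself: an order goes in from the ELN, and parsed results come back out.

```python
# A hypothetical sketch of the ELN round trip: order in, results back.
# Endpoints and field names are invented stand-ins for illustration.

import json
from urllib import request

ORCHESTRATOR_URL = "https://orchestrator.example.com"  # placeholder host


def submit_order(order: dict) -> str:
    """Push an ELN-originated order into the orchestrator's queue."""
    req = request.Request(
        f"{ORCHESTRATOR_URL}/api/orders",
        data=json.dumps(order).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["order_id"]


def fetch_results(order_id: str) -> dict:
    """Pull parsed results back for the analysis pipeline."""
    url = f"{ORCHESTRATOR_URL}/api/orders/{order_id}/results"
    with request.urlopen(url) as resp:
        return json.load(resp)
```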
Multi-site deployments are also supported, with the flexibility to run separate instances per site or consolidate into a single orchestration layer depending on operational needs.
The Road Ahead: AI, MCP, and Beyond
It wouldn't be a 2025/2026 conversation without touching on AI, and on emerging standards such as the Model Context Protocol (MCP) that give AI agents a common way to call tools like an orchestration layer. Orchestration is, in many ways, the prerequisite for meaningful AI integration in the lab. Training machine learning models requires rich, well-annotated experimental data, and that kind of data is only consistently generated at scale through a fully orchestrated approach.
The vision for the future is a lab where scientists submit requests in plain language, specifying what they want to achieve rather than how to do it. The orchestration layer handles resource allocation, scheduling, and execution, returning contextualized results back to the requester. It's not fully there yet, but it's closer than many might think.
The Bottom Line
Orchestration isn't just about big integrated robotic systems. It's a mindset, a toolkit, and an organizational approach that can unlock the full value of whatever automation a lab already has — while laying the groundwork for the AI-driven science of the near future.
Interested in learning more? Book a demo!