If you tuned into this week’s Monday Live session, you were treated to a masterclass in navigating one of the smart buildings industry’s most exciting, yet often misunderstood, concepts: the Digital Twin.
The conversation, sparked by contributions from Doug, Zara, Kimon, and others, cut through the Hollywood hype to address a critical, practical question: If digital twins are so powerful, why aren’t they everywhere?
The answer, it turns out, isn’t a lack of technology, but a challenge of perception, process, and a clear starting point.
The “Hollywood” Problem: Nebulous Definitions and Sky-High Expectations
As the panel noted, a significant barrier to adoption is the “nebulous” definition of a digital twin. Many imagine a perfect, real-time 3D replica of a building—a “Hollywood version” that seems complex and costly to achieve. For more on this, see https://www.automatedbuildings.com/2025/03/and-the-oscar-goes-to-hollywood-bim-digital-twins/
However, Zara from NIPs reframed this brilliantly, highlighting that a digital twin isn’t a single, monolithic entity. It’s a spectrum, ranging from:
- a Digital Shadow, with one-way data flow for monitoring energy or equipment, to
- a complete Cyber-Physical System for closed-loop control and automation.
You don’t need a billion-dollar model to start. The minimum viable digital twin can be as simple as a door with a sensor. The key is starting with a specific use case that delivers immediate value.
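To make that concrete, the door-with-a-sensor twin fits in a few lines of Python. This is a minimal sketch (all names and the “open count” use case are illustrative, not from the discussion):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional, Tuple

@dataclass
class DoorShadow:
    """Digital shadow of one door: mirrors the latest contact-sensor state."""
    door_id: str
    is_open: bool = False
    last_updated: Optional[datetime] = None
    history: List[Tuple[datetime, bool]] = field(default_factory=list)

    def ingest(self, reading: bool) -> None:
        """Update the shadow from a raw sensor reading (physical to digital)."""
        self.is_open = reading
        self.last_updated = datetime.now(timezone.utc)
        self.history.append((self.last_updated, reading))

    def open_count(self) -> int:
        """One concrete use case: how many times has the door opened?"""
        return sum(1 for _, state in self.history if state)

# Usage: feed the shadow raw readings, then query the twin, not the device.
door = DoorShadow("lobby-east-door")
for reading in (True, False, True):
    door.ingest(reading)
```

Nothing here requires a 3D model or a real-time platform; the value comes from answering one specific question from mirrored data.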
The “Shift Left” Opportunity: Finding ROI in the Design Phase
A powerful insight from the discussion was the concept of “shifting left.” The ROI of a digital twin shouldn’t be calculated solely for the operational phase of a building.
By starting as a “Design Twin” during the architectural and engineering stages, the digital model begins paying for itself immediately. It enables simulation, optimization, and clash detection long before construction begins, proving its value and building a foundation that transitions seamlessly into operations. This dispels the perception of the twin as a massive, lump-sum cost bolted on at the end of a project.
The Path to Scale: Division 25 and the Power of Open Standards
So, where do we anchor this capability? The consensus pointed to a clear solution: MasterFormat Division 25 – Integrated Automation.
Division 25 is the natural home for the digital twin’s “common information model.” Instead of each project reinventing the wheel, we can define a foundational, open standard within Division 25 that all other building systems—from HVAC (Division 23) to Electrical (Division 26)—must reference and comply with.
This is where the work of organizations like the Linux Foundation becomes critical. Open, vendor-neutral standards are the only way to break down proprietary silos and create the interoperable, plug-and-play ecosystem that makes digital twins scalable and affordable. As the panel stressed, the onus for mapping to this standard model should be pushed onto OEMs and system providers, making their products inherently “twin-ready.”
The Owner’s Role: It All Comes Down to the Spec
The entire conversation culminated in a clear call to action: The owner must demand it.
The technology exists. The standards are being developed. The missing link is often the owner’s requirement in the BIM Execution Plan and project specifications. As one panelist put it, if the data isn’t delivered in the right format, “you don’t get paid.” By making digital twin deliverables a contractual requirement, owners can finally unlock the data needed to make smarter decisions across their portfolio.
Why Stacking Standards Won’t Create Your Digital Twin
In the race to build smarter, our industry has accumulated a tower of standards: IFC for geometry, COBie for handover, OmniClass for classification, and MasterFormat for procurement. We followed the rules, yet we’re left with a painful reality:
- We have many standards.
- But no unified, interoperable framework.
- So every project becomes a bespoke integration challenge.
This isn’t just frustrating—it actively undermines the systemic interoperability, composability, and lifecycle continuity required for capability-based Digital Twins at scale.
The Bottleneck in Plain Sight
The fundamental issue is that our legacy standards were created for different purposes, at different times, by different bodies. They are not natively interoperable. Mapping between them is a manual, tool-dependent process. The result? You get structured data, but not shared meaning. You get documents, but not live, composable capabilities.
For a true Digital Twin, you need machines to understand what components do, not just what they are.
The Atomic Solution: Capability-Based Modeling
The path forward is to simplify. Instead of modeling complex, monolithic systems, we must define atomic, capability-based components—sensors, actuators, and controllers—with core, reusable attributes.
- A Sensor capability model defines `value`, `unit`, and `accuracy`.
- An Actuator capability model defines a `setValue` command.
- A Controller capability model defines logic and relationships.
Specialized equipment, like a CO₂ sensor or a VAV box controller, simply extends these core capabilities. This creates a “Lego-like” system in which interoperability is based on what a component can do, not on its proprietary type.
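A minimal Python sketch of these atomic capabilities (the class names, ventilation logic, and thresholds are illustrative assumptions, not part of any published standard):

```python
from dataclasses import dataclass

@dataclass
class Sensor:
    """Atomic Sensor capability: value, unit, accuracy."""
    value: float = 0.0
    unit: str = ""
    accuracy: float = 0.0  # tolerance, expressed in the sensor's unit

@dataclass
class Actuator:
    """Atomic Actuator capability: a single setValue command."""
    value: float = 0.0

    def set_value(self, value: float) -> None:
        self.value = value

@dataclass
class CO2Sensor(Sensor):
    """Specialized equipment merely extends the core capability."""
    unit: str = "ppm"
    accuracy: float = 50.0

@dataclass
class Damper(Actuator):
    """A damper is just an Actuator under a domain-specific name."""

class VentilationController:
    """Atomic Controller capability: logic and relationships.

    It is wired to capabilities, so it works with any Sensor/Actuator."""
    def __init__(self, sensor: Sensor, actuator: Actuator, setpoint: float) -> None:
        self.sensor = sensor
        self.actuator = actuator
        self.setpoint = setpoint

    def step(self) -> None:
        # Open the damper fully when the reading exceeds the setpoint.
        self.actuator.set_value(100.0 if self.sensor.value > self.setpoint else 0.0)

# Usage: compose the "Lego bricks" into demand-controlled ventilation.
co2 = CO2Sensor(value=1200.0)
damper = Damper()
controller = VentilationController(co2, damper, setpoint=800.0)
controller.step()
```

Because the controller depends only on the Sensor and Actuator capabilities, swapping the CO₂ sensor for any other sensor changes nothing in the control logic—interoperability based on what a component can do, not what it is.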
The following chart illustrates how this approach creates a scalable, interoperable foundation, in contrast to the fragmented status quo.
A Realistic Path Forward: Ownership and Execution
This new approach demands a shift in responsibility.
- Who Defines the Models? A federated, industry-wide effort is ideal, but projects can start today by adopting open, semantic ontologies like Brick Schema or Microsoft’s DTDL (Digital Twins Definition Language). These provide the “grammar” for defining capabilities.
- Who Executes? The Architectural and Engineering (A/E) teams must be responsible for defining the structure of the Digital Twin—the core capability models and relationships. They create the template.
- Who Provides the Data? Subcontractors and manufacturers are then tasked with populating these structured templates with real-world data (serial numbers, setpoints, sensor IDs). They fill in the blanks.
Where to Specify: The Division 25 Nexus
The logical home for these requirements is MasterFormat Division 25 – Integrated Automation. This division is designed for system integration and is the natural place to specify:
- The common information and capability model.
- The digital twin platform interface.
- Data handover protocols.
Crucially, every technical division (23-HVAC, 26-Electrical, etc.) must then reference and comply with Division 25’s digital twin specifications, ensuring all delivered systems are inherently “twin-ready.”
The Bottom Line
We cannot specify, procure, or integrate our way to a Digital Twin using a stack of fragmented documents. The future lies in shifting from document-based specs to capability-based, ontology-driven models built around a semantic integration layer.
By starting atomic, defining clear ownership, and anchoring specifications in Division 25, we can finally escape the system integration treadmill and build infrastructure that is as modular, intelligent, and adaptive as the systems it contains.
Want to be part of the conversation? Join us every Monday at MondayLive.org to help shape the future of smarter buildings.