The transition from historical building computer simulation to the modern building digital twin represents a fundamental shift in how we understand, manage, and predict the performance of built environments. It’s a move from static analysis to a dynamic, living model.
The Past: Building Computer Simulation
In the past, computer simulation was a revolutionary tool, but it had significant limitations:
- Static Analysis: Simulations were primarily conducted before construction (design stage) or periodically offline (e.g., once a year for energy modeling). They relied on fixed inputs like climate data, design specifications, and assumed operating schedules.
- Purpose: The main goal was to predict performance metrics like energy consumption, thermal loads, or daylighting under ideal or average conditions. It was great for comparing different design choices (e.g., window types or insulation levels).
- Data Source: Simulations used synthetic or aggregated data. They were not directly connected to the building’s physical operations or its occupants’ real-time behavior.
- One-Way Process: Data flowed from the designer into the model. The model produced a result, but it did not update itself based on how the actual building performed later. It was a theoretical snapshot.
- Technology: These often ran on high-powered workstations or mainframes, requiring specialized software and technical expertise (as seen in the early days of energy analysis).
The Present: The Building Digital Twin
The Digital Twin takes the conceptual foundation of simulation and combines it with connectivity, real-time data, and AI, creating a continuously evolving virtual replica:
- Dynamic, Real-Time Connection: The core distinction is the bi-directional link between the physical asset and the virtual model. The twin is continuously fed real-time operational data from the building’s sensors, BMS, metering systems, and IoT devices.
- Purpose: The goal is no longer just prediction; it’s optimization, prediction, and prescription. A Digital Twin can:
  - Diagnose performance issues (Fault Detection and Diagnostics, FDD) as they happen.
  - Predict future performance (e.g., equipment failure or energy peaks).
  - Run simulations (like “what-if” scenarios) using current, real-world conditions to prescribe the best operational strategy.
- Data Source: The twin uses live, semantic-rich data that includes asset information (BIM), operational data (BMS/SCADA), and contextual data (weather, occupancy schedules, energy prices).
- Continuous Feedback Loop: The Digital Twin creates a closed loop. The physical building informs the model, the model optimizes control strategies, and those strategies are fed back to the physical building’s automation system.
- Technology: Twins are heavily reliant on the Cloud, AI/Machine Learning, and Open/Semantic Standards (such as Haystack and Brick) to process massive amounts of data and provide the context for intelligent analysis.
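To make the semantic-data idea concrete, here is a minimal Python sketch of pairing a raw BMS point with Haystack-style tags so a twin can query by meaning rather than by point name. The tag names (`point`, `sensor`, `air`, `temp`, `discharge`) follow common Project Haystack conventions, but the point IDs, values, and references are hypothetical.

```python
# Minimal sketch: a raw BMS point enriched with Haystack-style semantic tags.
# Tag vocabulary follows common Project Haystack conventions; the point ID,
# equipment references, and values are hypothetical.

raw_point = {"id": "AHU1_SAT", "value": 55.2, "unit": "°F"}

semantic_point = {
    "id": raw_point["id"],
    "dis": "AHU-1 Supply Air Temp",                     # human-readable name
    "tags": {"point", "sensor", "air", "temp", "discharge"},
    "equipRef": "AHU-1",                                # link to equipment model
    "siteRef": "HQ-Building",                           # link to site model
    "curVal": raw_point["value"],
    "unit": raw_point["unit"],
}

def find_points(points, *required_tags):
    """Return points carrying all of the requested semantic tags."""
    return [p for p in points if set(required_tags) <= p["tags"]]

# The twin can now ask for "every air temperature sensor" without knowing
# any vendor-specific point names.
matches = find_points([semantic_point], "air", "temp")
print([p["dis"] for p in matches])
```

The design point is that the raw value is unchanged; the tags add the context (what the point measures, what equipment it belongs to) that cloud analytics need.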
In short, the transition is from “modeling the theory” to “mirroring the reality.” The old simulation provided answers about a conceptual design; the Digital Twin provides solutions for a living, breathing operational building.
Doug Migliori, MBA
Digital Twin Consulting / Decision Intelligence / Event Driven Systems / AI-powered Airports
Ken,
I appreciate your comparison between the static simulations of the past and the dynamic, real-time simulations now enabled by digital twins. The shift toward continuous optimization — where controller actions are informed by live data and evolving conditions — is well captured. One of the digital twin’s greatest strengths lies in its ability to maintain a persistent state history of its physical counterpart and interoperate with other twins to infer future states. This predictive capability enables prescriptive decision-making based on simulated outcomes across multiple scenarios. In that context, I’d suggest reordering the phrasing in your paper to “Predict, Optimize, Prescribe” to better reflect the logical progression of twin-enabled intelligence.
Additionally, I’d offer a perspective on the Sequence of Operations (SOO). While SOO has historically served as a foundational control framework, it’s rooted in static architecture and can become a limiting constraint in adaptive environments. Metadata-driven systems offer a more flexible alternative — enabling equipment to self-optimize based on contextual overlays, dynamic constraints, and real-time feedback.
Lastly, I’d clarify that the data format of a digital twin isn’t semantic tags per se, but rather a structured system model — such as DTDL, SysML, or a metadata schema — which may include mappings to semantic identifiers like Haystack or Brick. These tags are valuable for interoperability, but the underlying model is what governs behavior, synchronization, and orchestration.
Pages 12 and 13 of this IIC whitepaper propose a “dual delivery” model for OEMs – delivering a DT with every equipment instance (asset). I believe this is the model that should be incorporated into Div 25.

This is a critical question at the heart of the transition to smart, cloud-centric buildings, and the answer is that the Sequence of Operations (SOO) should reside in both places, but in different forms and for different purposes.
The key distinction lies in execution versus optimization and verification.
Where the Sequence Should Reside
1. The DDC Controller (Execution Layer)
The actual, executable control logic must reside in the DDC controller for operational reasons.
- Purpose: Real-Time Execution. The controller is responsible for executing the SOO immediately (milliseconds to seconds) to maintain comfort, safety, and energy efficiency. It controls the damper, valve, or fan speed directly.
- Reliability & Resilience: Control must function even if the network or cloud connection is lost. The DDC controller provides the necessary resilience to ensure the building continues to operate, heat, cool, and ventilate, regardless of external connectivity.
- Safety Criticality: The controller manages safety interlocks and critical limits. Relying on the cloud for these immediate, safety-critical functions introduces unacceptable latency and risk.
In short, the DDC controller holds the executable code that runs the building day-to-day.
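The execution-layer points above can be sketched in a few lines. This is a toy proportional heating-valve loop with a hard safety cutoff, written to run entirely locally; the setpoint, gain, and high-limit values are illustrative assumptions, not from any controls standard.

```python
# Minimal sketch of a DDC-style execution cycle: a proportional heating-valve
# command with a local safety interlock. It runs with or without a cloud
# connection. All numbers (gain, limits) are illustrative only.

HIGH_LIMIT_F = 120.0   # discharge-air safety cutoff (illustrative)

def control_step(zone_temp_f, setpoint_f, discharge_temp_f, gain=10.0):
    """One control cycle: returns a heating-valve command in percent (0-100)."""
    if discharge_temp_f >= HIGH_LIMIT_F:
        return 0.0                        # safety interlock: close the valve
    error = setpoint_f - zone_temp_f      # positive error -> call for heat
    return max(0.0, min(100.0, gain * error))

# Executed every cycle on the controller itself, milliseconds to seconds.
print(control_step(70.0, 72.0, 95.0))    # normal call for heat
print(control_step(70.0, 72.0, 125.0))   # interlock trips, valve closes
```

Note that both the control law and the interlock live on the controller; nothing in the loop depends on network connectivity.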
2. The Digital Twin (Intelligence and Verification Layer)
The high-level, semantic-rich definition and optimal representation of the SOO should reside in the Digital Twin.
- Purpose: Verification and Optimization. The Twin houses the semantic model of the SOO—the intent of the control strategy. It compares the real-time data streaming from the DDC controller against this optimal intent.
- Fault Detection & Diagnostics (FDD): The Twin uses the ideal SOO as a baseline. If the DDC is executing a command (e.g., “setpoint 72°F”) but the real-time energy use is excessive or the zone temperature is unstable, the Twin flags a Fault.
- Design-Time & Commissioning: The Twin can host a virtual version of the controller and simulate the SOO before deployment, catching errors and validating the logic (virtual commissioning).
- AI Optimization: AI algorithms in the cloud (which the Twin represents) can run millions of scenarios, find a more optimal SOO (e.g., better start/stop times, different pressure setpoints), and then push a verified, updated SOO back down to the DDC controller.
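The FDD role described above can be illustrated with a short sketch: the twin holds the intended operating envelope and flags live telemetry that drifts outside it. The drift and power thresholds, point names, and sample values here are hypothetical assumptions, not drawn from any FDD standard.

```python
# Minimal FDD sketch: compare one live telemetry sample against the twin's
# intended SOO baseline and flag deviations. Thresholds and point names are
# hypothetical, chosen only to illustrate the baseline-comparison idea.

def check_against_intent(sample, setpoint_f, max_drift_f=2.0, max_kw=8.0):
    """Return a list of fault descriptions for one telemetry sample."""
    faults = []
    drift = abs(sample["zone_temp_f"] - setpoint_f)
    if drift > max_drift_f:
        faults.append(f"zone temp {drift:.1f}°F from setpoint")
    if sample["power_kw"] > max_kw:
        faults.append(f"energy use {sample['power_kw']} kW exceeds baseline")
    return faults

# A zone that is both unstable and energy-hungry trips two faults.
sample = {"zone_temp_f": 75.5, "power_kw": 9.2}
print(check_against_intent(sample, setpoint_f=72.0))
```

In a real deployment the baseline would come from the twin's semantic SOO model and run continuously over streaming data; the comparison logic, however, is exactly this shape.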
The Transition and the Future
The shift isn’t about moving the code; it’s about elevating the definition of control.
| Feature | DDC Controller (Execution) | Digital Twin (Intelligence) |
| --- | --- | --- |
| Data Format | Proprietary code or low-level protocols (BACnet data points, register values) | Semantic tags (Haystack, Brick), high-level definitions |
| Role | Action (the “Doer”) | Verification, Optimization, & Prescription (the “Thinker”) |
| Latency | Milliseconds (real-time control) | Seconds/minutes/hours (analysis and adjustment) |
| Resilience | Operates without a cloud connection | Requires a cloud connection for analysis |
Conclusion:
The sequence of operations, as executable control logic, remains fundamentally the responsibility of the DDC controller.
However, the definition, maintenance, verification, and optimization of that sequence become core functions of the Digital Twin. The Digital Twin ensures the local control is not just working, but working optimally and according to its intended design, using the controller as its intelligent field execution arm.