Turning Big Data into Building Intelligence
Dynamic Machine Learning Comes to Commercial Building Controls
Dr. Igor Mezic,
Chief Scientific and Technical Advisor, Co-Founder
John D. Morris,
Marketing and Sales
Building automation systems (BAS) have long held the promise of networked equipment, continuous self-reporting, and optimized operation. Their very name implies a substantial degree of autonomous intelligent action. Yet, as many owners have been shocked to learn, their big-ticket BAS is far from a brain for the building. Rather, it is more akin to a very twitchy nervous system.
The recent buzz around the Internet of Things and the value now being
derived from big data analytics has sparked great interest in applying
new technologies for smart buildings. The opportunities are staggering:
detailed feedback of space utilization and productivity, tracking field
equipment performance, predictive maintenance, smart demand response,
and, of course, the prospect of substantial energy savings. Here we
discuss what has limited widespread adoption of big data analytics for
buildings in the past, and Ecorithm’s solution for addressing those
issues by incorporating dynamic machine learning and domain expertise
into its True Analytics™ Platform.
Taking Promise to Reality
In some sense, buildings are an original prototype for the Internet of
Things: an intranet of many disparate devices capable of two-way
communication within a Building Automation System. In fact, it seems
like the technology should have matured to the point that the vision of
intelligent buildings should have already become reality. Direct
Digital Controls came into being in the Old Geology Building at the
University of Melbourne in 1981/1982. With the advent of LonWorks in
1989, LonMark in 1994, and BACnet in 1995, the field bus was born in
building controls and achieved nearly 100% adoption over a five-year
period. Yet, in the 20 years since, building systems have largely remained siloed, and equipment data has remained uncollected and unanalyzed.
In the meantime, aircraft autopilot systems have become fully capable of landing a 747 in a 30 mph crosswind, smartphones have become ubiquitous, giving users portable, instant access to seemingly limitless amounts of information, and self-driving cars have a significantly better driving record than we do. (Sadly, personal jet pack technology still appears to be a long way off, and drones are rapidly replacing those jet pack functions.)
So, if buildings have been generating data for so many years, what has slowed the conversion from prototypical intranet to intelligent Internet? The answer is that, until now, analysis software has not been mature enough to make sense of that data. Because of changes in weather, occupancy, use, and configuration, the optimal settings for a building almost assuredly differ from day to day and from building to building, which has made it difficult to apply software tools to automated building analysis. These constantly changing variables represent the ultimate challenge for building analysis, demanding intensive engineering effort to implement and use analytical tools. The upfront investment, inflexibility, and the heavy human effort needed to sort through false alarms have largely curtailed widespread adoption of analytics technologies until now.
The Unique Challenge of Buildings
Buildings are exceptionally diverse and complex systems. Where planes, phones, and cars are discrete systems with comparatively little variety, buildings are highly dynamic environments, and no two are alike in structure or operation. Additionally, with the goal of improving profitability and the quality of occupant experience, integration of new mechanical technologies, such as air-side economizers, has dramatically increased the complexity of building operations. As the second law of thermodynamics would suggest, this increased complexity has made it increasingly difficult to keep these systems in order. Lawrence Berkeley National Laboratory (see Figure 1) has verified that, in the absence of monitoring-based commissioning, building performance quickly regresses.
In an attempt to resolve this issue, a number of approaches have been developed over the last several years to provide engineers with dashboards and analytical tools for monitoring-based commissioning. The first generation of building analytics tools focused on rules/alarm-based approaches, in which a warning is issued whenever a building system strays beyond the parameters programmed by the software's installers. Since the software can't 'learn' without human engineers manually tuning it, it doesn't get more effective over time. As a result, the software performs only as well as the programmer in the field and their knowledge of that individual building.
This inflexibility creates two huge limitations: a long, expensive startup and many false alarms. Since every building is different, each rule set must be customized to the specific configuration and sequence of operations of an individual building, a process that may take weeks or months before any value is derived from the installation. Additionally, because rules-based systems simply flag issues that are 'out of range', they can generate multiple alarms whenever an issue arises with a central plant or mid-point device, resulting in 'alarm fatigue'. This influx of alarms actually detracts from the effectiveness of the engineering staff (as they chase false positives) or leads the staff to tune out the alarms altogether.
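The out-of-range flagging behind this alarm fatigue can be sketched in a few lines. The point names and thresholds below are hypothetical, not drawn from any real BAS; the sketch only illustrates how one upstream fault cascades into many alarms:

```python
from dataclasses import dataclass

@dataclass
class Rule:
    point: str    # BAS point name (hypothetical, e.g. "AHU1_SAT")
    low: float    # lower bound of the programmed 'normal' range
    high: float   # upper bound

def check(rules, readings):
    """Flag every reading that falls outside its programmed range."""
    alarms = []
    for rule in rules:
        value = readings.get(rule.point)
        if value is not None and not (rule.low <= value <= rule.high):
            alarms.append((rule.point, value))
    return alarms

# A single chiller fault pushes downstream supply-air temperatures out of
# range too, so one root cause produces a flood of separate alarms.
rules = [Rule("CHW_SUPPLY_TEMP", 42, 46),
         Rule("AHU1_SAT", 52, 58),
         Rule("AHU2_SAT", 52, 58)]
readings = {"CHW_SUPPLY_TEMP": 55.0, "AHU1_SAT": 63.0, "AHU2_SAT": 64.5}
print(check(rules, readings))  # three alarms for one underlying fault
```

Nothing here connects the three alarms to the single root cause; that correlation is exactly the work left to the human engineer.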
To better understand the limitations of rules-based approaches, consider a very simple system: a table.
Software is exceptionally good at following rules and scrolling through an immense amount of data to either find entries that break those rules or to find an exact match to a query. The challenge comes in knowing exactly when and how to apply those rules. For example, look around the house and notice all the different types of tables there are: dining room tables, end tables, coffee tables, etc., of all shapes, sizes, and compositions. Now think about how you would explain to a computer how to identify each of these tables and detect faults, such as a broken leg, for each type. What is simple and intuitive for humans is surprisingly difficult for software. The process requires a learned understanding of what constitutes a table and the elements that are required for that particular table to function. The computer must be told about the wide variety of shapes and sizes of tables, and how to differentiate a functioning table from things that resemble broken tables. Now, consider how annoying it would be to have the software send an alarm each time it encounters a new, table-like object – such as a pedestal table, a three-legged stool or a wooden plank – and then consider the vast array of devices in a building and the multitude of conditions they can be operating under. Now, you can begin to picture the arduous process of setting up a building, and the resulting alarm fatigue that has inevitably followed.
In an effort to address these limitations, some second-generation building analytics technologies are starting to incorporate machine learning. The field of machine learning has made great strides on the static table example: as present-day machine learning software encounters more types of tables, it is capable of developing additional rules to define what is in fact a table and what faults it may have. Unfortunately, a building's dynamic systems pose a much greater challenge. What if the definition of a faulty table changed on a daily basis? The number of rules must multiply exponentially to keep up. And, since rules are learned after the fact, this static machine learning is still more likely to create false alarms than to provide correct diagnostics.
So, if it’s so difficult for software to analyze dynamic systems like buildings, even with access to massive amounts of data, how are humans able to manage buildings at all?
The Secrets of the Building Whisperer
There is a notion of the "building whisperer": an engineer with so much skill and experience with a building that they can sense problems and their underlying root causes just from walking the halls and knowing precisely where to look in the BAS data. Does the secret of the building whisperer lie in a preternatural ability to somehow sense and process vast amounts of data? In fact, the answer is the opposite. The building whisperer simply has a finely tuned human talent for recognizing and connecting complex patterns. Years of experience learning the complex interrelationships of the building systems enable them to disregard the massive amount of irrelevant data and quickly identify the important information, pinpointing the most likely underlying cause.
The question becomes how software can be applied to act as an extension of the building whisperer, adept at quickly identifying and prioritizing critical issues, rather than as an alarm system that simply points to out-of-range values and requires human analysis to decide whether action is required.
Ecorithm: True Analytics Platform, Dynamic Machine Learning with Built-in Domain Expertise
While highly complex, the behavior of building equipment is far from random.
The building reacts to the 24-hour weather cycle, the weekly occupancy
cycle, and the seasonal cycles. The responses to these cycles have
distinct physical signatures, and are therefore recognizable. However,
it becomes complicated when all of these cycles are overlaid,
ever-changing (as with weather and occupancy), and interacting with one
another. As a result, there isn’t an exact pattern that is easy to
detect, making it difficult to differentiate between a healthy and
unhealthy cycle using traditional big data analysis techniques.
Ecorithm has developed its analytics platform based on the Big Data Dynamics
branch of machine learning, where the dynamics of component
interactions can actually be used to our advantage. Where the building
whisperer relies on expertise and experience to sniff out anomalies,
the True Analytics Platform uses dynamic pattern recognition to convert
the otherwise overwhelming output of overlapping cycles (Figure 2) into
readily machine-readable spectral patterns similar to a bar code
(Figure 3). A layer of built-in domain expertise then enables automated
root cause analysis based on the combination of these dynamic patterns
throughout the building, and prioritizes the resulting faults and
optimization opportunities by persistence and significance in energy
savings, comfort, and maintenance value.
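Ecorithm's actual algorithms are not public, so the following is only a generic illustration of the underlying idea: decomposing a sensor signal into its dominant cycles with a naive discrete Fourier transform, yielding a compact spectral signature from overlapping daily and sub-daily rhythms:

```python
import cmath
import math

def spectral_signature(signal, top_k=3):
    """Naive DFT: return the top_k dominant cycle periods (in samples)."""
    n = len(signal)
    mags = []
    for k in range(1, n // 2):          # skip k=0 (the DC component)
        coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        mags.append((abs(coeff), k))
    mags.sort(reverse=True)             # strongest frequencies first
    return [n / k for _, k in mags[:top_k]]

# Hourly samples over one week: a strong daily (24 h) cycle plus a weaker
# half-day cycle, loosely mimicking weather and occupancy forcing.
hours = range(24 * 7)
signal = [10 * math.sin(2 * math.pi * t / 24)
          + 3 * math.sin(2 * math.pi * t / 12) for t in hours]
print(spectral_signature(signal, top_k=2))  # [24.0, 12.0]
```

A healthy zone shows the expected peaks at the daily and occupancy periods; a fault disturbs that signature, which is far easier for software to compare than the raw overlapping time series.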
There are several key benefits to this approach. From a scalability standpoint, the built-in dynamic pattern recognition and domain expertise means that the startup time required to tailor the software to each building is measured in hours and days, rather than the weeks or months for rules-based approaches. Perhaps most critically, the Platform reconciles the “demand side” of the system with the “supply side” of the system. This means that faults and opportunities for optimization of the central plant are balanced against, and informed by, the existing faults within individual zones. This has the dual impact of enabling both system level root-cause analysis and system level optimization. Finally, the automated analysis and prioritization of results eliminates alarm fatigue and provides for easy customization and interactive feedback.
The ongoing analysis and diagnostics of buildings is a complex challenge. While dynamic machine learning, spectral pattern recognition, and system-level analysis sound complex, in combination they provide an elegant solution to the very intricate challenge of building analytics. Grounded in the same principles that enable "building whisperers" to rapidly diagnose issues, the True Analytics Platform extends the vision of great engineers, enables extreme scalability for building and portfolio analysis, and opens the door to truly intelligent buildings.