Buildings connected through open protocols to the powerful internet cloud and its web services are redefining the building automation industry; the reach and visibility of the industry have never been greater, nor has change been so rapid. Our clouded future includes new virtual connections between buildings and the communities they are part of, with both physical and social interactions. One example is the digitally displayed energy/environmental dashboard that informs everyone of a building's impact in real time: its energy use, plus the percentage generated from renewable sources. And connections to the smart grid make buildings a physical part of their energy supply infrastructure.
The ability to operate buildings efficiently from anywhere via the internet cloud allows the building automation industry to be better managed and to appear greatly simplified. Web services, or software as a service (SaaS) as it is sometimes called, coupled with powerful browser presentation, are changing how we present ourselves to and interact with clients.
Building Information Model (BIM) software allows the power of visual relational databases to improve decisions throughout the building design. And new visualization and simulation tools reveal the effects of decisions made prior to the commitment of funds. In similar fashion, cloud computing provides a collaborative process that leverages web-based BIM capabilities and traditional document management to improve coordination.
The Data Cloud for our industry has become real. As we see applications and services moved “off-site”, you can imagine the opportunities for managing real estate, reducing energy and providing value-added applications for buildings.
We must unhinge our minds and find new pivot points from which to build our future. We must embrace the power of the cloud while increasing our comfort level in using the solutions within.
Cloud computing is the term used to describe the use of interconnected business applications over the internet. The application and infrastructure do not reside on the end user's premises; instead, the end user accesses the application on demand via a web browser. This means users can concentrate on using the application for its purpose, avoiding capital expenditure and the overhead of installation, networking and maintenance.
Cloud computing has the potential to be the next major driver of business innovation across all industries, giving organizations more dynamic, resilient and cost-effective IT systems. Yet, as with any new technology, it is difficult to distinguish the reality from the hype and form a clear strategy to capitalize on it.
The building automation industry has generally been a slow adopter of new technology. Over a decade ago, the majority of building automation systems (BAS) were proprietary, regarded as just another tool required for the operation of a building, and more or less non-existent to the outside world. However, with the emergence of open system protocols such as BACnet and LonWorks and the worldwide emphasis on energy management and sustainability, the rate of adoption of new technology by building automation vendors has increased dramatically, particularly in the use of web technology and open system architectures to integrate and converge with IT networks, creating new features in a more cost-effective and time-efficient manner.
As the cloud computing wave hits the market, it is up to the IT executives of the building management system vendors to adopt the technology into BAS with a clear strategy.
To illustrate the cloud computing model in action in the BAS market, let's examine a case study of a global hotel chain's requirements for improving guest comfort and reducing energy consumption and carbon emissions. The requirements have the following characteristics:
• Globally situated hotel with 100+ branches
• Executives require summary data of energy, water and carbon consumption
• Facility managers at each location require detailed usage data for comparison
• Proactive action must be taken to meet key performance indicators (KPIs)
• The hotel has enterprise resource planning (ERP) software managing its financials
• Costs associated with energy must be allocated to each branch
A conventional IT environment has control networks installed at each branch, and at each location a server runs the software and database for monitoring and control. This requires 100+ servers, a considerable capital investment. For the reporting requirements, specialized software needs to be written to merge the databases and link with the ERP system to allocate costs. It is difficult to proactively manage the KPIs, as the system struggles to obtain the required data at the right time.
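To make the capital-cost contrast concrete, here is a back-of-the-envelope sketch in Python. The branch count matches the case study, but every dollar figure is a hypothetical placeholder, not vendor pricing:

```python
# Back-of-the-envelope capital-cost comparison for the case study: one
# server and software licence per branch versus a small central cluster
# serving every branch. All dollar figures are hypothetical placeholders.

BRANCHES = 100

def conventional_cost(server_cost=5_000, licence_cost=2_000):
    """A server plus software licence installed at every branch."""
    return BRANCHES * (server_cost + licence_cost)

def cloud_cost(central_servers=2, server_cost=5_000, licence_cost=2_000):
    """A small redundant central cluster hosting the BAS for all branches."""
    return central_servers * (server_cost + licence_cost)

# With these placeholder figures the conventional model costs 50x more
# up front: $700,000 versus $14,000.
```

Even with generous assumptions for the central cluster, the per-branch server model dominates the capital budget before any maintenance or power costs are counted.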
In a cloud computing environment, however, control networks are installed at each branch but the software and database reside at just one location, saving real estate and power consumption. The executives and facility managers access the required data via a web browser from their own locations. The dynamics of the cloud:
• The BAS interfaces with utility services to obtain real-time data for power, water and gas consumption for each building, for comparison against actual data obtained at each site.
• The BAS interfaces with the ERP system’s general ledger to allocate costs for each branch’s usage of energy.
• The BAS interfaces with social networks such as Twitter and Facebook to inform the hotel's staff and customers about energy usage and carbon emissions, encouraging further savings.
• The BAS interfaces with business intelligence software, which generates custom reports required for executive, management and user levels.
The data obtained from the BAS via the control networks at each branch is linked to the intelligence software via web services to generate the reports. As more data is gathered, the intelligence software performs data mining to proactively report the actions required to meet the key performance indicators. The facility managers can compare different branches' performance and set benchmarks to improve on KPIs. The users access the end result of all this integration using a simple web browser.
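The aggregation step above can be sketched as follows. The endpoint URL, JSON field names and KPI target are all hypothetical illustrations, not a real hotel or vendor API:

```python
# Sketch of the reporting loop: each branch's BAS exposes a web-service
# endpoint returning JSON usage data; a central service computes an
# energy-intensity figure per branch and flags KPI misses. The endpoint
# URL, field names and KPI target below are hypothetical.
import json
from urllib.request import urlopen

KPI_KWH_PER_ROOM_NIGHT = 30.0  # hypothetical target: kWh per occupied room-night

def fetch_branch_usage(branch_id):
    """Pull one branch's daily consumption from its (hypothetical) BAS API."""
    url = f"https://bas.example-hotel.com/branches/{branch_id}/usage"
    with urlopen(url) as resp:
        return json.load(resp)  # e.g. {"kwh": 4200.0, "room_nights": 120}

def branch_kpi(usage):
    """Return (intensity, meets_kpi) for one branch's usage record."""
    intensity = usage["kwh"] / usage["room_nights"]
    return intensity, intensity <= KPI_KWH_PER_ROOM_NIGHT

def rank_branches(usages):
    """Rank branches best-first by energy intensity; usages maps id -> record."""
    scored = {bid: branch_kpi(u) for bid, u in usages.items()}
    return sorted(scored.items(), key=lambda kv: kv[1][0])
```

The ranking output is exactly the benchmarking view the facility managers need: the best performers set the bar, and KPI misses surface automatically.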
Connectivity of everything is a growing reality, and with each new connection comes new opportunities and new perspectives. Just as low-cost powerful connectivity is changing and actually simplifying our personal lives with internet extensions (i.e. “apps”) to our handheld devices, building automation is caught up in the same connectivity growth.
The process of collecting and mining data is the heart of automated continuous commissioning. ACC uses access to the existing building automation system and data from traceable external sources (such as NOAA weather data) for this new class of analysis. The data is then used to create performance models of each piece of equipment to track actual (versus design) operation.
New techniques have emerged to create models that persistently predict actual performance within a two percent margin of error. By leveraging these models, facility managers have a powerful means to diagnose and control system anomalies.
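The idea can be sketched in a few lines: a per-equipment model predicts expected power draw from operating conditions, and readings outside the model's ~2% error band are flagged. The linear model and its coefficients below are purely illustrative, not a real chiller model:

```python
# Minimal sketch of ACC anomaly flagging: a performance model predicts
# expected power draw, and readings that deviate from the prediction by
# more than the ~2% error band are flagged. The linear model and its
# coefficients are illustrative placeholders, not a validated model.

ERROR_MARGIN = 0.02  # models persistently predict actual performance within ~2%

def predicted_kw(outdoor_temp_c, load_fraction):
    """Illustrative chiller model: expected kW from weather and load."""
    return 50.0 + 1.5 * outdoor_temp_c + 120.0 * load_fraction

def is_anomaly(actual_kw, outdoor_temp_c, load_fraction, margin=ERROR_MARGIN):
    """Flag a reading whose deviation from the model exceeds the error band."""
    expected = predicted_kw(outdoor_temp_c, load_fraction)
    return abs(actual_kw - expected) / expected > margin
```

In practice the model would be fitted from BAS trend logs and weather data (such as the NOAA feeds mentioned above) rather than hard-coded, but the anomaly test itself stays this simple.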
The level of granularity provided by an ACC system can identify anomalies that can be generally categorized into three basic groups: Control, Maintenance, and System Performance Degradation.
Persistent monitoring and diagnostics of system operations directly impacts sustainable energy efficiency in commercial buildings. Examples include everything from detecting heating and air conditioning programming errors to identifying out-of-adjustment settings on control systems, improperly balanced parallel chillers that cause unwanted surges, high head pressure on rooftop unit compressors, oscillating controls that cause unnecessary heating and cooling run times, and incorrect refrigerant charge.
In today’s complex buildings, even small problems can have big impacts on building performance. Lighting, heating, ventilating and air conditioning systems need continuous performance tracking to ensure optimal energy efficiency. Yet, a formal process for data gathering and analysis is not commonplace in the nation’s building stock. Plus, there’s often a disconnect between the energy modeling done in isolated, one-time recommissioning or energy audit projects, and what happens in day-to-day operations.
What’s needed is a systematic approach to tracking energy utilization that helps detect problems early, before they lead to tenant comfort complaints, high energy costs, or unexpected equipment failure. That’s why new robust energy monitoring technologies and Monitoring-based Commissioning (MBCx) techniques are now at the forefront in building energy management.
MBCx has the potential to keep buildings running at peak efficiency by addressing the "performance drift" which occurs when building systems fall out of calibration or fail altogether. A sensor network gathers discrete data measurements and, with analysis capabilities, identifies trends, detects leaks and alerts building engineers to hidden problems that waste energy.
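A minimal sketch of such trend detection, assuming a series of daily whole-building readings from the sensor network; the alert threshold is illustrative:

```python
# Sketch of drift detection for MBCx: fit a least-squares trend line to a
# sequence of daily energy readings and alert when consumption is creeping
# upward faster than a chosen threshold. Threshold and data are illustrative.

def trend_slope(readings):
    """Least-squares slope of readings vs. day index (units per day)."""
    n = len(readings)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(readings) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, readings))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

def drift_alert(readings, max_slope=0.5):
    """Alert if daily consumption is trending upward faster than max_slope."""
    return trend_slope(readings) > max_slope
```

A production MBCx tool would normalize for weather and occupancy before trending, but the principle is the same: a persistent upward slope in normalized consumption is the signature of performance drift.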
A recent Lawrence Berkeley National Laboratory study revealed that MBCx is "a highly cost-effective means of obtaining significant energy savings across a variety of building types". The program combined persistent monitoring with standard retro-commissioning (RCx) practices with the aim of providing substantial, persistent energy savings. There were three primary streams of energy savings from the MBCx project:
1. Savings from the persistence and optimization of RCx savings, thanks to early identification of deficiencies through metering and trending. Several studies have shown that RCx savings can degrade without an explicit effort to monitor them.
2. Savings from measures identified through metering and trending during the initial commissioning effort (measures unlikely to be found from RCx alone). Examples of such measures include: poor control of chilled water distribution to air handlers; unnecessary chiller operation due to disabled chiller lockout; poor VAV zone control due to inoperative actuators on dampers and valves.
3. Continually identified new measures. By virtue of the continuous nature of the monitoring, MBCx can identify new problems that emerge after the initial retro-commissioning investigation stage, such as equipment cycling and excessive simultaneous heating and cooling.
Via extensive discrete measurements, MBCx can provide insight into how a building is actually functioning, and whether equipment is starting to fall out of spec. With an integrated picture of all key building components, operational deficiencies that would normally go undetected can be identified. More important, new procedures can be put in place to maintain a high-performance building.
With improved insight into a client's ongoing operations, commissioning providers can recommend set point adjustments, design new alarms, and make sure energy savings last over the long haul. The bottom line? MBCx technology and techniques can ensure that energy efficiency gains do not degrade over time – a win for clients and for vendor-client relationships.
The operations center is where technicians, engineers and management monitor, manage and troubleshoot issues. The operations center monitors building performance, systems configurations, policy implementation, scheduling, report generation and documentation. At the heart of an operations center are the "human factors". This may sound like some mushy soft science, but there is a well-recognized scientific discipline called human factors engineering. It is utilized to address the environmental design of an operations center, ergonomics, re-engineering of operational processes and the human interface to the technology.
There is a tendency to focus on the technology in the operations center rather than the human factors (who isn't wowed by a video wall of high-definition plasma displays?). However, the focus on the bells and whistles misses the underlying premise that technology is simply an enabler and should be used to change the behavior and operations of the people using it.
We always start with the premise that improved management of buildings requires improved monitoring of the building and building systems (i.e. gathering data through sensors, meters, surveys and other means). This exponentially increases the volume of data available to building and facility managers. However, additional data does not necessarily provide "actionable information" that will result in improved operational performance.
The continuing question is how to convert data into meaningful information that is contextual and actionable. The operations center is an environment where meaningful information can be extracted and presented to produce a high level of situational awareness, align related work processes, minimize workload and errors, enhance task performance, and provide information and reporting tools required to manage the building’s operations.
Ken Sinclair is Editor/Owner of www.AutomatedBuildings.com. This article also includes contributions from Nirosha Munasinghe of Open General and Peter Sharer of Agilewaves.