Automation Architecture Change & Bifurcation

All of computing, including building and industrial automation, is experiencing an acceleration of sweeping and fundamental change. The application of revolutionary new concepts and technologies, including the Internet of Things (IoT), sensor advances, intelligent sensors, embedded controllers, no-code software, and cloud computing, is accelerating advances in system architecture.

  • Organizational Competitiveness Opportunities
  • Highly Responsive System Architectures
  • High-performance, low-cost wired/wireless data networking
  • Analytics & Artificial Intelligence Commoditization
  • Game Changing Embedded AI Chips
  • My Perspective: Industrial & Building Automation Experience
  • Automation professionals need to be change agents

PERSPECTIVE: This article topically shares information from my private client consulting briefings. If you are interested in a briefing tailored to your needs, please contact me.

This is part of a continuum I have been participating in as a design and application engineer since the 1970s, beginning with computer-based machine tool controls (Computer Numerical Control, CNC) and continuing through Building Automation Systems (BAS), Direct Digital Control (DDC), Programmable Logic Controllers (PLC), and Distributed Process Control Systems (DCS). Fundamental change drivers:

  • High-volume use of technology in commercial and personal applications, commoditizing hardware and software, driving down costs, increasing capabilities, and improving reliability and quality.
  • Consumer products meeting harsh environmental requirements (e.g., IP67-rated cell phones).
  • Increased computer power at lower cost.
  • Embedded systems on a chip (SoC)
  • Embedded AI chips
  • No-code programming
  • Analytics & Artificial Intelligence Commoditization
  • Large & refined open source software tools and applications
  • Wired/wireless data networking with higher performance at lower cost

Architectural Shift

The latest automation innovations have enabled users to create more responsive system architectures, driving increased reliability and performance along with lower software maintenance costs. Each new innovation provides even more building blocks to further evolve system architectures and spawn nearly exponential innovation in industrial automation.

Organizational Competitiveness Opportunity

Organizations that do not research and take advantage of the appropriate disruptive automation, control, and sensor innovations are likely to become stagnant and see themselves leapfrogged by more advanced competitors. Opportunistic companies leverage disruptive innovations, positioning themselves to be leaders. There are numerous historical examples of organizations that prospered by leveraging innovative thinking and technology.

Existing Systems

Installed systems will certainly be kept in place as long as they remain productive, financially viable investments, just as mainframe computers and minicomputers remained in place while they still made sense. Yet the newer innovations can definitely extend and improve these systems as add-ons that increase functionality and value.

Change Resistance

Complacent users and many established suppliers tend to view new technology as disruptive and unattractive. Established suppliers resist change for a range of reasons, particularly to protect profits and existing investments, until they are compelled to adopt new technologies and introduce their own versions. A case in point is the introduction of DDC (Direct Digital Control) by innovative companies: major suppliers resisted, producing a great deal of propaganda describing why DDC made no sense at all, until they introduced their own products. Major vendors responded only when a significant number of users began adopting DDC because of its superiority over existing controls. Observation: this same DDC pattern occurred in the process control industry.

Shortened Automation & Control Lifecycle Curves

The influx of new technology over time significantly shortens the lifecycle curves of existing systems, as newer systems improve performance and carry a lower ongoing cost of ownership (think version upgrades and maintenance). When this tipping point is reached, it accelerates the adoption of new technology and shortens the lifecycle of existing installed solutions.
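The tipping-point economics above can be sketched with a simple cumulative-cost comparison. All dollar figures below are hypothetical assumptions chosen for illustration, not data from any vendor or installation:

```python
# Illustrative only: compare cumulative cost of ownership for a legacy
# system vs. a newer replacement and find the tipping-point year.
# Every figure here is a made-up assumption for the sketch.

def cumulative_cost(upfront, annual, years):
    """Total cost of ownership after a given number of years."""
    return upfront + annual * years

def tipping_point(legacy_annual, new_upfront, new_annual, horizon=20):
    """First year in which the new system's cumulative cost drops below
    the legacy system's (the legacy system has no new upfront cost)."""
    for year in range(1, horizon + 1):
        if cumulative_cost(new_upfront, new_annual, year) < cumulative_cost(0, legacy_annual, year):
            return year
    return None

# Hypothetical figures: legacy upkeep $50k/yr; a new system costs $120k
# up front but only $10k/yr in upgrades and maintenance.
print(tipping_point(50_000, 120_000, 10_000))  # -> 4
```

Once the crossover year arrives, every additional year of keeping the legacy system widens the gap, which is why adoption accelerates after the tipping point rather than proceeding linearly.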

Again, the computing industry provides a telling example. The old model of enterprise computing required programmers to write code for reports and analysis based on users' requirements, a process that was labor-intensive and took far too long to produce results. End users who performed analysis on PCs using spreadsheets were able to significantly lower the cost of accomplishing these tasks while getting immediately actionable results. The impact of this decreased cost, in both time and money, signaled the end of large data processing computers and departments. You can even find a prominent example in your own home: how many people no longer use cameras, given the convenience of today's smartphone camera technology and instantaneous sharing?

Big Data, Analytics, & Artificial Intelligence

Big data and analytics have tremendous value, already proven in business and industrial applications, for improving operations. The most obvious applications include predictive maintenance, optimization, dynamic resource conservation, and improved sustainability. Many of these functions can be accomplished using open-source software, creating greater value for existing and new building control systems.
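A minimal sketch of the kind of analytics behind predictive maintenance is a rolling z-score test on a sensor trace: flag any reading that deviates sharply from its recent baseline. The readings, window size, and threshold below are illustrative assumptions, not values from any real system:

```python
# Minimal sketch of sensor-data anomaly detection of the kind used for
# predictive maintenance. Readings and thresholds are made-up assumptions.
from statistics import mean, stdev

def find_anomalies(readings, window=10, z_threshold=3.0):
    """Flag indices whose reading deviates more than z_threshold standard
    deviations from the trailing window's mean."""
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_threshold:
            anomalies.append(i)
    return anomalies

# A hypothetical bearing-temperature trace with one spike (index 14).
temps = [70.1, 70.3, 69.9, 70.2, 70.0, 70.4, 69.8, 70.1, 70.2, 70.0,
         70.3, 69.9, 70.1, 70.2, 84.5, 70.0, 70.1]
print(find_anomalies(temps))  # -> [14]
```

Production systems use far richer models, but even this open-source-stdlib version illustrates why the value is so accessible: the math fits in a dozen lines.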

Architectural Bifurcation

Architectures are bifurcating, with more real-time control and optimization accomplished in sensors and controllers at the edge, while enterprise/cloud computing is leveraged for business management, macro control, and optimization.

Intelligent Sensors

Smart sensors with embedded processors are being used to perform real-time analysis and make decisions, changing operating parameters and interacting directly with operations and business systems.
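The decision logic such a sensor might run locally can be sketched as: smooth the raw signal, act on it in real time, and push only exceptions upstream. The class, limits, and "throttle" action below are hypothetical, invented purely to illustrate the pattern:

```python
# Sketch of intelligent-sensor logic: local smoothing, local real-time
# action, and exception-only reporting upstream. All names and limits
# here are illustrative assumptions.
from collections import deque

class SmartPressureSensor:
    def __init__(self, high_limit=5.0, window=5):
        self.high_limit = high_limit
        self.readings = deque(maxlen=window)   # rolling window of raw samples
        self.alerts = []                       # stands in for messages to ops/business systems

    def sample(self, raw_value):
        """Process one raw reading; return the local control decision."""
        self.readings.append(raw_value)
        smoothed = sum(self.readings) / len(self.readings)
        if smoothed > self.high_limit:
            # Local real-time action plus an upstream exception report.
            self.alerts.append(f"high pressure: {smoothed:.2f} bar")
            return "throttle"                  # e.g. command a local valve
        return "normal"

sensor = SmartPressureSensor()
decisions = [sensor.sample(v) for v in [4.0, 4.5, 5.5, 6.0, 6.5]]
print(decisions[-1])  # -> throttle
```

The key architectural point is that the control decision never leaves the sensor; the business system sees only the alert, not the raw sample stream.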

MEMS Game Changer

A significant game changer is the low cost and high performance of MEMS (Micro-Electro-Mechanical Systems) chips that accurately sense pressure, humidity, temperature, acceleration, inertial measurement, and other properties. As a design engineer at Johnson Controls I analyzed MEMS chips, but prices at that time prohibited their use in products. Since MEMS chips were incorporated into cell phones, volumes have skyrocketed and prices have dropped dramatically, making them practical for a wide range of sensing applications.

A great example is Amazon Monitron, an off-the-shelf, end-to-end machine monitoring system that can be applied without analytics or machine learning knowledge by anyone, including maintenance people, engineers, and managers. Monitron uses MEMS sensors and a machine learning service to detect anomalies and predict when equipment requires maintenance. The IoT sensors capture vibration and temperature data and send it to a machine learning cloud service that detects abnormal equipment patterns and delivers results in minutes, with no machine learning or cloud experience required. A maintenance person uses a mobile phone app to receive alerts of abnormal conditions across different machines and to check machine health. The Monitron Starter Kit to wirelessly monitor five pieces of equipment is $715 USD, with additional sensors available in 5-packs for $575 USD.

Edge AI Game Changer

There has been quite a buzz lately about AI chips, and I have been following this area for a number of years. In 2020 I began reviewing AI chips used in embedded systems, which are also available on modules with M.2 and mPCIe connectors found in many computers, including embedded industrial PCs, adding high-performance AI processing without degrading other applications running on the computer. Specifications of the Hailo-8™ processor, for example, include 26 tera-operations per second (TOPS) with a high power efficiency of 3 TOPS/watt, supporting industrial environment requirements. Modules conforming to the M.2 and mPCIe connector/interface standards are available at less than $100 US in single quantities.

This is analogous to early PC coprocessor add-ons used to achieve high-performance floating-point calculation, and to video display coprocessors used to achieve high-resolution, high-performance graphics. For example, the original IBM PC included a socket for the Intel 8087 floating-point coprocessor (FPU), a popular option for people using the PC for computer-aided design or mathematics-intensive calculations. Eventually these functions were embedded directly into off-the-shelf PCs.

These chips are being used in high-volume products such as video assembly-line inspection and physical security, driving costs down. Currently chips are available from the following vendors…

This architecture accomplishes something that has been discussed in computing for years but was impractical in the past given the cost and functionality of chips. Applications are typically created in the cloud using software tools such as open-source TensorFlow and ONNX, resulting in an application that drives the dynamic configuration of chip resources to optimize performance. These chips contain many sets of processing elements, each with control, memory, and compute, which are configured and allocated to execute the various layers of neural networks, optimizing the user's application. If you want a deep dive, see an article I wrote in 2020.
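The allocation step described above can be sketched as a load-balancing problem: assign each network layer to one of a fixed pool of processing elements. The layer names, op counts, and greedy strategy below are all assumptions for illustration; the real compilers for these chips are far more sophisticated:

```python
# Toy sketch of allocating neural-network layers to a chip's processing
# elements. Layer names and per-layer compute costs are hypothetical.

def allocate(layers, num_elements):
    """Greedy longest-processing-time assignment: place each layer,
    largest first, on the currently least-loaded processing element."""
    loads = [0] * num_elements
    assignment = {}
    for name, ops in sorted(layers.items(), key=lambda kv: -kv[1]):
        pe = loads.index(min(loads))      # least-loaded element
        assignment[name] = pe
        loads[pe] += ops
    return assignment, loads

# Hypothetical per-layer compute costs in millions of operations.
layers = {"conv1": 90, "conv2": 120, "conv3": 60, "fc1": 30, "fc2": 10}
assignment, loads = allocate(layers, num_elements=2)
print(loads)  # -> [160, 150]
```

Balancing the per-element loads is what lets all layers of the network run concurrently at the chip's rated throughput, which is the essence of the dynamic configuration the text describes.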

Communications to Everything

Pervasive wired/wireless communications enable edge devices and smart sensors to send and receive data, interacting at any level in the architecture, including real-time control, operations, business systems, and cloud-based optimization applications. This contrasts with the traditional multilevel hierarchical computing model, which requires field data to pass through multiple computers and layers of middleware software before reaching the enterprise, cloud, and remote experts, creating complicated, brittle architectures and significant increases in cost, risk, ongoing configuration control, and lifecycle investment.

The new distributed model puts computing at the point of use, streamlining the system architecture and significantly increasing system responsiveness at lower lifecycle cost.
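The latency argument can be sketched with back-of-the-envelope arithmetic: end-to-end delay is the sum of per-hop delays, so removing intermediate layers removes both their latency and their failure points. The hop names and millisecond figures below are invented for illustration:

```python
# Back-of-the-envelope comparison of a hierarchical data path vs. a
# direct edge-to-cloud path. All hop latencies are hypothetical.

hierarchical_hops_ms = {
    "field device -> PLC": 10,
    "PLC -> supervisory server": 50,
    "server -> middleware": 100,
    "middleware -> enterprise/cloud": 200,
}
direct_hops_ms = {
    "smart sensor -> enterprise/cloud": 120,
}

hier_total = sum(hierarchical_hops_ms.values())
direct_total = sum(direct_hops_ms.values())
print(hier_total, direct_total)  # -> 360 120
```

Beyond raw latency, each removed layer is also one less piece of middleware to configure, patch, and keep under configuration control over the system's life.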

Change Agents

This is NOT a no-brainer. Automation professionals need to be change agents: do the research and understand the new technologies to determine whether they can be effectively used to improve results in their organizations.