Daymon Thompson

Bandwidth Bottlenecks Limit Insights — Edge Analytics, Part 1

Industrial PC advances and cloud bandwidth considerations make the case for analyzing machine performance on controllers before sending data to the cloud

Beckhoff TwinCAT Analytics and IoT

The debate between cloud and edge computing strategies remains a point of contention for many controls engineers. However, most agree that smart factories in an Industrie 4.0 context must efficiently collect, visualize and analyze data from machines and production lines to enhance equipment performance and production processes. Using advanced analytics algorithms, companies can sift through this mass of information, or big data, to identify areas for improvement.


To some, edge computing devices may seem to add an unnecessary step when all data can simply be handled in the cloud. Microsoft Azure, Amazon Web Services (AWS) and other cloud platforms offer virtually limitless space for this purpose. Moreover, TLS-encrypted MQTT transport and the security mechanisms built into the OPC UA specification help keep data secure in transit. When it comes to analytics and straightforward data management, however, edge computing offers important advantages for closely monitoring equipment health and maximizing uptime in production.


Because of the massive amount of data that modern machines can produce, bandwidth can severely limit cloud computing or push costs to unacceptable levels. New analytics software solutions for Beckhoff PC-based controllers allow controls engineers to run advanced algorithms locally, along with data pre-processing and compression. The key advance is the concept of processing data at the edge first, which enables individual machines and lines to identify inefficiencies on their own and make improvements before using the cloud for further analysis across the enterprise.
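The pre-processing idea described above can be illustrated with a minimal sketch: instead of streaming every raw sample to the cloud, the edge device collapses each acquisition window into a compact summary (count, min, max, mean) and sends only that. The function name, window size and summary fields here are illustrative assumptions, not part of any Beckhoff API.

```python
import json
import statistics

def summarize_window(samples):
    """Reduce a window of raw sensor samples to a compact summary.

    Sending only count/min/max/mean per window instead of every raw
    sample shrinks the cloud-bound payload by roughly the window size.
    (Field choice is an illustrative assumption.)
    """
    return {
        "count": len(samples),
        "min": min(samples),
        "max": max(samples),
        "mean": statistics.fmean(samples),
    }

# Example: 1,000 raw samples (one second at 1 kHz) collapse to one summary.
raw = [20.0 + 0.01 * i for i in range(1000)]
summary = summarize_window(raw)

raw_bytes = len(json.dumps(raw).encode())
summary_bytes = len(json.dumps(summary).encode())
print(f"raw payload: {raw_bytes} B, summary: {summary_bytes} B")
```

Whether a windowed summary is acceptable depends on the application; condition monitoring often tolerates aggregation, while detailed vibration analysis may need the raw stream or a smarter compression scheme.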


Bandwidth burdens when streaming machine data


Beckhoff edge computing technologies: The massive amount of production data going to the cloud requires significant bandwidth.

Depending on your service plan, running all analytics in the cloud can be expensive in terms of storage, but the more difficult proposition is getting your data there in the first place. Managing bandwidth can create a serious issue for factories, since the average internet connection speed across the globe is 7.2 Mbps, according to the most recent connectivity report from Akamai.


When even one machine streams data to the cloud, let alone several, little to no bandwidth remains for the rest of the operation. Two use cases published in a 2017 article by Kloepfer, Koch, Bartel and Friedmann illustrate this point. In the first, structural dynamics monitoring of wind turbines using 50 sensors at a 100 hertz sampling rate required 2.8 Mbps of bandwidth to stream all data to the cloud as standard JavaScript Object Notation (JSON). The second case, condition monitoring of assets in intralogistics, used 20 sensors at a 1,000 hertz sampling rate and required 11.4 Mbps using JSON. This is a relevant test, since JSON is a common format for sending data to the cloud or across the web.
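The figures above follow from simple arithmetic: bandwidth scales with sensor count, sampling rate and the size of each JSON-encoded sample. The sketch below assumes roughly 70 bytes per sample (timestamp, sensor ID, value and JSON framing), an assumption chosen because it approximately reproduces the cited numbers; real payload sizes vary with field names and encoding.

```python
def json_stream_mbps(sensors, sample_rate_hz, bytes_per_sample=70):
    """Rough bandwidth estimate for streaming uncompressed JSON telemetry.

    bytes_per_sample is an assumption (~70 B covers a timestamp, sensor
    ID, value and JSON framing); multiply samples/second by bits/sample.
    """
    return sensors * sample_rate_hz * bytes_per_sample * 8 / 1e6

# Wind-turbine case: 50 sensors at 100 Hz (article cites 2.8 Mbps)
print(round(json_stream_mbps(50, 100), 1))
# Intralogistics case: 20 sensors at 1,000 Hz (article cites 11.4 Mbps)
print(round(json_stream_mbps(20, 1000), 1))
```

The estimate makes the scaling obvious: doubling either the sensor count or the sampling rate doubles the required bandwidth, which is why raw streaming overwhelms an average connection so quickly.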


Without compression or pre-processing mechanisms, an average 7.2 Mbps internet connection cannot stream data from three or more large machines, or from a full logistics operation that requires advanced measurement, condition monitoring and traceability of production. A factory must either provision a much larger connection than normal, use multiple connections, or leverage advanced analytics at the edge.


Want to learn more about applying edge computing technologies to your machines and systems? Contact your local Beckhoff sales engineer today.



Daymon Thompson of Beckhoff Automation USA

Daymon Thompson is the Automation Product Manager for Beckhoff Automation LLC.


A version of this article previously appeared in Control Engineering.
