The whitepaper 'Edge Computing in Industrial Environment' gave an overview of Edge computing and its different forms. Here we discuss in more depth how Edge is deployed in an industrial environment, considering scalability, reliability, latency, and data/bandwidth, and look at the supporting IEEE standards.

One of the most important challenges for any industrial smart factory is how to manage the massive and growing volume of sensor data while still providing the services that today's control systems offer. As we discussed in the section "What is Edge computing?" of the whitepaper, the answer to squaring this circle is to embrace edge computing as part of an Industry 4.0 strategy. Edge computing is a cloud-based intermediate layer that connects the central cloud and the edges, providing specialised services using hardware and software.

Deploying Edge

The key components of Edge computing are:

- Cloud: a public or private cloud that holds a repository for container-based workloads and also hosts and runs applications.
- Edge devices: equipment, together with sensors, having limited compute resources.
- Edge node: any edge device, edge server, or edge gateway on which edge computing can be performed.
- Edge server: typically used to run enterprise application workloads and shared services.
- Edge gateway: able to host enterprise applications and provide network services.

Edge devices are physical hardware situated at the network's edge that have enough memory, processing power, and computing resources to gather, analyse, and act on data in near real time with only minimal support from other parts of the network. Different edge devices offer different levels of processing, and they can also filter data, ensuring that only significant changes are sent to the cloud for further analysis. Edge is also more fault tolerant, since corrections can be made immediately on site: even in scenarios with weak signal strength, the edge can support its devices independently.

Containerization is one of the most common ways to make programs compatible with cloud usage. The program is packed together with all the operating system libraries it requires using technologies from suppliers such as Docker, and the complete container is transferred from server to server as needs change. This move is often carried out with the help of Kubernetes and similar tools, which monitor hardware availability and other factors to determine when and where containers should run. The centralised cloud computing structure is increasingly inefficient for processing and analysing the enormous volumes of data gathered from IoT devices, because that data must travel over networks of restricted capacity. Edge computing offloads computing duties from the centralised cloud to the edge near the IoT devices, where preprocessing greatly reduces the amount of data transported.
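The preprocessing step can be as simple as aggregation. A hedged sketch of the idea (function and field names are our own illustration, not a vendor API): instead of streaming every raw sample, the edge node collapses a window of readings into a compact summary before uploading it.

```python
from statistics import mean

def summarise_window(samples: list[int]) -> dict:
    """Reduce a window of raw sensor samples to a compact summary
    that is sent to the cloud instead of the raw stream."""
    return {
        "count": len(samples),
        "min": min(samples),
        "max": max(samples),
        "mean": round(mean(samples), 3),
    }

# 1,000 raw samples collapse into a four-field summary for upload.
raw = [20 + (i % 10) for i in range(1000)]
summary = summarise_window(raw)
print(summary)
```

Here a thousand samples become four numbers; the cloud still sees the trend, but the constrained network carries a tiny fraction of the original payload.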

As shown in the figure below, the IoT Connect® Platform enables the system to quickly connect, collect, and generate valuable insights from data across the enterprise. The IoT Connect platform consists of various components such as tools, technologies, SDKs, APIs, and protocols. It provides a matrix of devices, sensors, gateways, actuators, and other modules. These devices collect various types of data at different intervals that can then be monitored, filtered, and processed in real time to provide you with actionable insights. One can also create new revenue streams and service models by quickly deploying solutions that scale across production environments.

Figure: IoT Connect ecosystem

IEEE Standards for Edge Computing

A few issues remain unsolved to date, as they aren't adequately stated and Edge computing isn't yet fully defined. There are also challenges to be faced when deploying Edge computing, such as managing the entire system, defining and creating Fogs and Edges, providing workflow, and providing Edge computing (compute, storage, and networking) services.

The IEEE International Conference on Edge Computing (EDGE) aims to establish itself as a premier international forum for researchers and industry practitioners to discuss the most recent fundamental advances in the state of the art and practice of Edge computing, identify emerging research topics, and define the future of Edge computing. Edge computing itself is responsible for localised resource sharing and cloud connectivity. Below are a few of the IEEE standards relating to Edge.

IEEE P1935

Standard for Edge/Fog Manageability and Orchestration

This standard defines and harmonises edge/fog computing manageability, management, and orchestration. It covers the functions and services of ambient, self-aware resources' capacity, manageability, management, and orchestration. It also specifies an ambient resource's membership and behaviour within an Edge/Fog community or neighbourhood, as well as the structure and purpose of the Edge/Fog community.

IEEE P2805.1

Self-Management Protocols for Edge Computing Node

This standard specifies the self-management protocols for edge computing nodes. These requirements include management protocols for self-organisation, self-configuration, self-recovery, and self-discovery between multiple edge computing nodes.
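To give a feel for self-discovery (a simplified in-process toy, not the protocol P2805.1 actually specifies), nodes can announce themselves on a shared channel and build a peer list from the announcements they hear:

```python
class EdgeNode:
    """Toy model of self-discovery: each node announces itself on a
    shared channel and records every peer it hears about."""

    def __init__(self, node_id: str, channel: list):
        self.node_id = node_id
        self.channel = channel
        self.peers: set[str] = set()

    def announce(self) -> None:
        # Broadcast our identity so other nodes can find us.
        self.channel.append(self.node_id)

    def discover(self) -> None:
        # Learn every announced node except ourselves.
        self.peers = {n for n in self.channel if n != self.node_id}

channel: list[str] = []
nodes = [EdgeNode(f"node-{i}", channel) for i in range(3)]
for n in nodes:
    n.announce()
for n in nodes:
    n.discover()
print(sorted(nodes[0].peers))  # ['node-1', 'node-2']
```

In a real deployment the shared list would be replaced by network broadcast or a registry service; the point is that each node assembles its own view of the neighbourhood without central configuration.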

IEEE P2805.2

Data acquisition, Filtering and buffering Protocols for Edge Computing Node

This standard specifies the protocols used by the edge computing node for buffering, filtering, and pre-processing data collected from controllers, including programmable logic controllers (PLCs), computer numerical controllers (CNCs), and industrial robots. Data acquired from field devices with different interfaces must be automatically stored, filtered, and processed so it can be consumed by industrial clouds and other edge computing nodes.
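A minimal sketch of the buffer-then-filter pattern this standard addresses (the class, capacity, and plausibility range are illustrative assumptions of ours): controller readings are buffered locally in bounded memory, and obvious sensor glitches are filtered out before upload.

```python
from collections import deque

class AcquisitionBuffer:
    """Bounded buffer for controller readings; the oldest data is
    dropped when full, so the edge node never exhausts memory."""

    def __init__(self, capacity: int):
        self.buffer: deque = deque(maxlen=capacity)

    def acquire(self, value: float) -> None:
        self.buffer.append(value)

    def drain(self, lo: float, hi: float) -> list[float]:
        # Filter out implausible readings (glitches) before upload.
        valid = [v for v in self.buffer if lo <= v <= hi]
        self.buffer.clear()
        return valid

buf = AcquisitionBuffer(capacity=5)
for v in [12.1, 999.0, 12.3, -50.0, 12.2]:   # two glitches among real data
    buf.acquire(v)
cleaned = buf.drain(lo=0.0, hi=100.0)
print(cleaned)  # [12.1, 12.3, 12.2]
```

The out-of-range spikes never leave the node; only plausible process values are handed onward to the industrial cloud.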

IEEE P2805.3

Cloud-Edge Collaboration Protocols for Machine Learning

This standard specifies the collaboration protocols for enabling machine learning on the edge computing node with support from industrial clouds. It provides an implementation reference for machine learning on lower-powered, cheaper embedded devices.
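To illustrate the collaboration pattern (a toy sketch under our own assumptions, not the protocol defined by P2805.3): the cloud fits a trivial anomaly "model" from historical data, and the edge node applies it locally, so inference needs no round trip to the cloud.

```python
from statistics import mean, stdev

def cloud_train(history: list[float]) -> dict:
    """Cloud side: fit a trivial anomaly model (mean +/- 3 sigma)."""
    mu, sigma = mean(history), stdev(history)
    return {"lo": mu - 3 * sigma, "hi": mu + 3 * sigma}

def edge_infer(model: dict, value: float) -> bool:
    """Edge side: flag anomalies locally with the downloaded model."""
    return not (model["lo"] <= value <= model["hi"])

# The heavyweight training runs in the cloud; the tiny model is pushed
# down to the embedded device, which then decides on its own.
model = cloud_train([10.0, 10.2, 9.8, 10.1, 9.9])
print(edge_infer(model, 10.05), edge_infer(model, 25.0))
```

The split matters on low-powered hardware: training, which is expensive, stays in the cloud, while the edge device only evaluates two comparisons per reading.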

IEEE P2961

Guide for an Architectural Framework and Application for Collaborative Edge Computing

This guide defines the architectural framework and application guidelines for collaborative edge computing, including a framework that allows a computing task to be decomposed and distributed across edge and cloud nodes. It also provides a blueprint for data usage, model learning, and computing collaboration in edge computing environments.
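A hedged sketch of such decomposition (our own illustration, not taken from the guide): each edge node reduces its local data shard to a small partial result, and the cloud only combines the partials, so raw data never has to leave the edge.

```python
def edge_partial(shard: list[int]) -> tuple[int, int]:
    """Runs on each edge node: reduce a local shard to (sum, count)."""
    return sum(shard), len(shard)

def cloud_combine(partials: list[tuple[int, int]]) -> float:
    """Runs in the cloud: merge partial results into a global mean."""
    total = sum(s for s, _ in partials)
    count = sum(c for _, c in partials)
    return total / count

shards = [[1, 2, 3], [4, 5], [6, 7, 8, 9]]    # data held on three edge nodes
partials = [edge_partial(s) for s in shards]  # computed at the edge
global_mean = cloud_combine(partials)         # combined in the cloud
print(global_mean)  # 5.0 -- raw data never left the edge
```

Only two numbers per node cross the network, yet the cloud recovers the exact global statistic, which is the essence of computing collaboration across edge and cloud.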

Stay connected with our other blogs, where we discuss the different forms of Edge computing (Thick, Thin, and Micro-edge solutions), the integration of Edge computing in PACs and PLCs, and Edge computing security risks and solutions.
