Introduction to Edge Computing: What are the Benefits?
Cloud? Fog? Edge? The truth is that we like to give catchy names to different ideas or philosophies, in this case, to different ways of using the computing resources at our disposal, and, incidentally, to help marketing stimulate our role as consumers.
The underlying idea of the Cloud paradigm is to run your applications and store your data on someone else's equipment. The connectivity, speed, and bandwidth we have today make this paradigm work: we get the same functionality as local servers, but with much lower initial investment and the flexibility to scale up and down without acquiring additional equipment.
The Cloud has been a success: the number of online solutions we use daily, now in a "pay as you go" format (SaaS), is vastly higher than it was 15 years ago.
The idea of "Fog" is, roughly speaking, to bring the flexibility of the Cloud into the company: to have, internally and without depending on third parties, a computing system that lets us deploy internal servers and services in a flexible and scalable way. Virtualization technologies and the falling cost of computing equipment have undoubtedly played an important role in making this possible.
And the "Edge"? As we will see throughout this post, Edge Computing is the philosophy that best fits OT (Operational Technology) environments, given their idiosyncrasies and their need for low latency and real-time response, especially in the lower layers.
What is Edge Computing
One of the clearest definitions of Edge Computing can be found on Wikipedia, which describes it as a type of computing that offers fast, low-latency responses to requests. In other words, everything that is not in the cloud, and in particular every application that requires real-time data processing, would be Edge.
Clearly, OT Environments are on the Edge.
Therefore, the vast majority of the industrial applications we know (SCADA, HMI, PLC control logic, etc.) would be "Edge".
But then, what is new about the Edge? Why so much noise about "Edge Computing" if it turns out to be the same thing industry has been doing for 40 years?

This is where the explosion of intelligent, connectable sensors coming onto the market comes into play, specifically IoT/IIoT. The Internet of Things has arrived and we are on our way to the "Internet of Everything": a world where billions of objects will carry a multitude of sensors to detect, measure, and assess the state of their environment, all interconnected in networks (public or private).
Faced with this situation, we have a multitude of devices that collect data but do nothing with it. The Edge Computing philosophy aims to add a computing layer, close to the sensors, that processes this data and transforms it into useful, actionable information, with minimal latency and in real time. This is where Edge Computing systems will shine at their brightest. Thanks to the miniaturization of processing and storage technology, new opportunities are opening up to relocate computing and make the most of this philosophy.
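As a minimal sketch of this idea (all names, values, and thresholds here are hypothetical, not taken from any specific product), an edge node might reduce raw sensor readings to compact summaries locally, reacting to alarms immediately and sending only the summary upstream:

```python
# Illustrative sketch: an edge node processes raw sensor readings locally
# and only forwards a compact summary, instead of streaming every sample
# to a central server. All names and thresholds are hypothetical.
from statistics import mean

def summarize_window(samples, threshold=75.0):
    """Reduce a window of raw readings to a compact summary.

    Only the summary (and an alarm flag) would be sent upstream;
    the raw samples stay at the edge.
    """
    return {
        "avg": round(mean(samples), 2),
        "max": max(samples),
        "min": min(samples),
        "alarm": max(samples) > threshold,  # act locally, in real time
    }

# Example: one window of temperature readings captured at the edge
window = [70.1, 70.4, 71.0, 78.3, 70.2]
summary = summarize_window(window)
# The edge node can react to summary["alarm"] immediately, with no
# round trip to the cloud, and transmit only the summary upstream.
```

The key design point is that the latency-critical decision (the alarm) never leaves the edge; the central servers receive only aggregated information.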
Main Benefits of Edge Computing
Security:
One of the main concerns with the IoT is that the growing number of connected devices increases the exposure to possible attacks. The Edge Computing philosophy distributes processing and storage capacity throughout the network, which makes it possible to architect fully closed, modular solutions and eliminates attack vectors that are harder to mitigate in Cloud/Fog Computing.
Because information is processed locally at the edge instead of being transmitted to central servers, there is less chance of this data being compromised or altered.
Distributing computing and storage power across different "mini data centers" also makes DoS attacks less likely to succeed.
Performance:
Speed and low latency in continuous communications enable real-time responses to the stimuli detected by the sensors.

Scalability:
Edge Computing offers cheaper, more flexible scalability, since computing capacity can be expanded by combining IoT devices with local "mini data centers".
Autonomy:
Processing data close to where it is produced removes external factors that could affect its exploitation, such as communication links or the correct functioning of central servers.
Paradigmatic Example of the Use of Edge Computing
The paradigmatic example of the use of Edge Computing is that of autonomous cars.
We can think of the car as a "data center on wheels": a multitude of sensors collecting data about the environment, the road, and the state of the vehicle itself, plus processing equipment that decides what action to take: accelerate, brake, turn, switch on the lights, activate the windshield wipers, etc.
According to Intel, an autonomous car can generate about 4 TB of data per day and may require bandwidths of 20-40 Mbps for the transmission of images from the cameras it may include.
All these calculations must be executed in real time and within minimal deadlines to avoid accidents, so it is clear that Edge Computing is the only viable paradigm for the car's most critical systems.
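A rough back-of-the-envelope check of the figures above (a hypothetical calculation, assuming decimal terabytes) also shows why streaming all of that data to the cloud is not realistic: 4 TB per day corresponds to a sustained uplink of roughly 370 Mbps, so the processing has to happen inside the vehicle:

```python
# Rough sanity check of the 4 TB/day figure (assuming 1 TB = 10**12 bytes).
tb_per_day = 4
bytes_per_day = tb_per_day * 10**12
seconds_per_day = 24 * 60 * 60  # 86,400 s

# Average bandwidth needed to continuously stream all of it to the cloud:
bits_per_second = bytes_per_day * 8 / seconds_per_day
mbps = bits_per_second / 10**6
print(f"Sustained uplink needed: {mbps:.0f} Mbps")  # roughly 370 Mbps
```

That is an order of magnitude above the 20-40 Mbps quoted for camera transmission alone, even before considering latency.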
Mounting a Fault-Tolerant “Edge” System
There is one important factor to highlight: for local data processing to work correctly, that processing must be fault tolerant and the high availability of the systems must be guaranteed.
In the previous example, the consequences of a failure in the hardware that supports decision-making need no explanation: accidents.
The computing equipment deployed at the Edge must meet strict performance requirements while offering constant reliability. Stratus ztC Edge equipment can be quickly incorporated at the Edge in an easy and robust way, adding a fault-tolerant virtualization layer that ensures the process layer keeps working uninterrupted even in the face of physical hardware failures.

Its main characteristics are:
- Built-in virtualization: run virtual machines in high availability or fault tolerance mode, guaranteeing uninterrupted execution.
- Automated protection: the solution consists of redundant nodes with continuous, fully automatic data synchronization.
- Automatic recovery: in the event of a failure, execution continues on the redundant node, without anyone's intervention.
- Robust hot-swappable nodes: ztC Edge nodes are designed to work in demanding industrial environments.
- System monitoring and administration: ztC Edge nodes include monitoring and control services to detect any anomaly that may occur.
Now you know quite a lot about Edge Computing. If you need more information or want to resolve any doubts, contact us!





