Craig Beddis, CEO and co-founder of Hadean, explains why digital twins need the edge to really thrive
The centralization of digital twin technology poses challenges that the edge can mitigate.
Digital twins have evolved into all-digital replicas of anything from a single object to an entire industrial process. They incorporate and synthesize multiple layers of information so that they react in the same way as their physical counterparts. However, creating digital twins at the highest level of detail requires reflecting the volatile nature of their properties. Supply chains can be disrupted by changing resource availability; factory processes can be affected by pressure and temperature. Our physical world is never static, and how things change is an essential part of their makeup. Because these factors are time-sensitive, representing such entities in the utmost detail requires simulating their constant flow of change.
Cloud infrastructure has provided an excellent platform on which disparate data sources can be unified and synthesized. But the centralized nature of the cloud is a double-edged sword: sending data to a remote location for analysis can introduce latency. When there is lag, time-sensitive data, entities, or processes run the risk of being represented incorrectly. Crucial business moments can be lost, batches of products ruined, and energy wasted, leading to high costs. The benefits of cloud infrastructure are too great to give up, but a solution to its latency problem is essential if digital twins are to reach their enormous potential. This is where edge computing becomes the next crucial piece of the puzzle.
Gartner recently described this relationship: “The centralized cloud was the latest turn toward centralized environments. It focused on economies of scale, agility, manageability, governance and platform service delivery. However, centralizing services into a cloud environment or other types of core infrastructure omits some capabilities that are quite desirable. Edge deployments offer additional capabilities that complement those provided by core infrastructure.”
Edge servers process data closer to its source, reducing the latency issues that occur when sending data to the cloud. This topology eliminates the delays created by physical distance. Processing is handled mainly by edge data centers: servers distributed far more densely around the world than cloud data centers, which are often placed in remote locations to keep costs down. Local devices can connect to nearby edge servers and send and receive data without having to communicate with cloud servers located further away.
Suppose, for example, that you have a sensor on a valve in a factory. The sensor collects data on pressure and temperature, which it can then send for analysis. High latency becomes a problem if the valve needs to be adjusted. If the pressure is too high and the valve does not adjust in time, an explosion could occur. Safety issues like this are key, but the same applies to process optimization: adjusting the valve in response to real-time data could reduce power consumption during downtime. By connecting to edge servers, the valve can be adjusted much faster.
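To make this concrete, here is a minimal sketch of an edge-side control loop for the valve scenario. The sensor interface, actuator call, threshold, and polling rate are hypothetical placeholders rather than a real vendor API; the point is simply that the decision is made locally, without a cloud round trip.

```python
# Minimal sketch of an edge-side control loop (all names and values are illustrative).
import time

PRESSURE_LIMIT_KPA = 850.0   # assumed safety threshold
POLL_INTERVAL_S = 0.05       # a tight loop is feasible because processing stays local


def read_sensor() -> dict:
    """Stand-in for reading pressure/temperature from the valve sensor."""
    return {"pressure_kpa": 820.0, "temperature_c": 71.5}


def adjust_valve(opening: float) -> None:
    """Stand-in for the actuator command; opening is a 0.0-1.0 fraction."""
    print(f"valve opening set to {opening:.2f}")


def control_loop() -> None:
    while True:
        reading = read_sensor()
        if reading["pressure_kpa"] > PRESSURE_LIMIT_KPA:
            # React locally in milliseconds instead of waiting on a cloud response.
            adjust_valve(1.0)  # open fully to relieve pressure
        time.sleep(POLL_INTERVAL_S)


if __name__ == "__main__":
    control_loop()
```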
While it’s a great opportunity, building edge networks capable of supporting digital twins comes with its own set of challenges. First, the sheer number of IoT devices generates a great deal of noise: managing all the data they provide can be overwhelming. Much of the information varies in relevance and value, so any edge infrastructure must provide a system that can interpret it effectively. A network relevance solution could address this by using spatial data structures to efficiently query which information is relevant to each client. The entities returned by these queries are sent to the client at a rate determined by metrics reflecting each entity’s importance, so less important entities are sent less frequently, reducing bandwidth.
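A rough sketch of what such relevance filtering might look like, assuming a simple grid-based spatial index and a 0–1 importance score; the cell size, update rates, and field names are illustrative assumptions, not a description of any particular implementation.

```python
# Grid-based relevance filtering sketch: entities are indexed by cell, a client's
# query returns only nearby entities, and each entity's update rate scales with
# an importance score. All names and numbers are placeholders.
from collections import defaultdict

CELL_SIZE = 100.0  # metres per grid cell (assumed)


class RelevanceGrid:
    def __init__(self):
        self.cells = defaultdict(list)  # (cx, cy) -> list of entities

    def insert(self, entity):
        cx, cy = int(entity["x"] // CELL_SIZE), int(entity["y"] // CELL_SIZE)
        self.cells[(cx, cy)].append(entity)

    def query(self, x, y, radius_cells=1):
        """Return entities in the cells surrounding a client's position."""
        cx, cy = int(x // CELL_SIZE), int(y // CELL_SIZE)
        nearby = []
        for dx in range(-radius_cells, radius_cells + 1):
            for dy in range(-radius_cells, radius_cells + 1):
                nearby.extend(self.cells.get((cx + dx, cy + dy), []))
        return nearby


def update_interval(entity, base_hz=30.0):
    """Less important entities are sent less often, saving bandwidth."""
    importance = max(entity.get("importance", 0.1), 0.01)  # assumed 0-1 score
    return 1.0 / (base_hz * importance)
```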
IoT devices that make use of edge computing have become a crucial part of creating realistic digital twins. The simulation infrastructure, however, must include a sophisticated network model to connect them all. Without one, systems may be prone to crashing once they exceed a limited number of active connections. An asynchronous architecture can deal with this problem effectively: it handles the actions of thousands of devices, while distributed load balancing ensures that the network always has enough CPU to handle large flows. This eliminates the need for a single thread per device and manages the control events sent by each device, forwarding them to the simulation; the events are reconstructed into a complete world state and stored in a data structure.
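Purely as an illustration of that pattern, the following asyncio sketch shows a single event loop serving many device connections rather than one thread per device, with each control event folded into a shared world-state structure. The wire format, port, and field names are assumptions, not the actual protocol described above.

```python
# One event loop handles many device connections; each control event updates a
# shared world state. Wire format (one JSON event per line) is an assumption.
import asyncio
import json

world_state = {}  # device_id -> latest known state


async def handle_device(reader: asyncio.StreamReader,
                        writer: asyncio.StreamWriter) -> None:
    try:
        async for line in reader:  # iterate incoming lines without a dedicated thread
            event = json.loads(line)
            # Forward the control event into the simulation's world state.
            world_state[event["device_id"]] = event["state"]
    finally:
        writer.close()
        await writer.wait_closed()


async def main() -> None:
    server = await asyncio.start_server(handle_device, "0.0.0.0", 9000)
    async with server:
        await server.serve_forever()


if __name__ == "__main__":
    asyncio.run(main())
```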
With so much data being shared, it is also essential that edge networks cope effectively with rising demand. A distributed load balancing system would ensure that the network always has enough CPU to handle large flows without crashing. In addition, the system must handle logging, analysis and debugging of processes across the network. A network viewer, for example, can provide a wealth of information about the connections between nodes: not only latency and bandwidth, but also detailed statistics such as lost packets, window sizes, or the time elapsed since data was last sent or received.
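A minimal sketch of the per-connection record such a network viewer might maintain, based only on the metrics listed above; the field names and example values are illustrative.

```python
# Per-connection statistics a network viewer could track (illustrative fields).
import time
from dataclasses import dataclass, field


@dataclass
class ConnectionStats:
    peer: str
    latency_ms: float = 0.0
    bandwidth_kbps: float = 0.0
    packets_lost: int = 0
    window_size: int = 0
    last_sent: float = field(default_factory=time.time)
    last_received: float = field(default_factory=time.time)

    def idle_seconds(self) -> float:
        """Time elapsed since anything was last sent or received."""
        return time.time() - max(self.last_sent, self.last_received)


# Usage: a viewer could poll these records and flag stale or lossy links.
stats = ConnectionStats(peer="edge-node-7", latency_ms=12.4, packets_lost=3)
print(stats.idle_seconds())
```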
The importance of the edge in practical terms cannot be overstated. Weisong Shi, a professor of computer science at Wayne State University, spoke about the benefits of edge computing: “By enabling edge computing, crucial data can be transmitted from the ambulance to the hospital in real time, saving time and equipping emergency department staff with the knowledge they need to save lives.”
Creating today’s complex and dynamic digital twins requires reflecting the volatile nature of our modern systems. Cloud computing offers access to a high level of processing power, and edge computing is ready to fill the gaps.