As Internet of Things (IoT) devices proliferate and incorporate more processing power, large amounts of data are generated on the edge of computer networks. IDC predicts that by 2025 there will be 55.7 billion connected devices worldwide, 75% of which will be connected to an IoT platform.
Traditionally, data produced by IoT devices was relayed to the cloud for processing, with instructions then sent back to edge devices. This configuration is inefficient, however, as the round trip introduces latency and consumes bandwidth.
Edge computing filters and processes data closer to the source, sending only relevant data to the cloud. This minimizes the bandwidth and cloud storage costs associated with data derived from IoT devices, notes Walid Yehia, senior director of Presales for MERAT, Dell Technologies.
In addition, many industrial IoT applications require critical real-time responses from sensors, so network outages cannot be risked. This is especially true in remote locations, where network connectivity is not always available. “Edge computing capabilities are becoming a critical component of IoT platforms and deployments. IoT spans many industries and generates vast amounts of data from connected devices, and the presence of the edge in IoT is beginning to create new opportunities and business value through reduced costs and real-time decision-making,” says Yehia.
While supporting ubiquitous access, cloud data centers are only slightly more distributed than local data centers. The edge, in contrast, allows organizations to deliver applications closer to users.
“In many ways, the edge is just the next step out in an expanding universe of distributed applications, with advantages and disadvantages aligned with those of multi-cloud strategies,” says Lori MacVittie, principal technical evangelist, F5 Office of the CTO.
“Data analytics represents a foundational edge computing use case, providing the insights needed for digital transformation initiatives,” MacVittie adds.
AI has traditionally resided in data centers, where there is enough computing power for processor-intensive cognitive tasks. This works well when immediacy is not paramount; the problem is that a growing number of applications require instantaneous, or near-instantaneous, responses to the information they produce.
“Moving the data-collection front end of the application to the edge and applying artificial intelligence at that same point allows AI systems to use inference (the way AI uses observation and context to reach a logical conclusion) to make decisions faster,” says Joe Baguley, vice president and CTO of VMware EMEA.
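The on-device inference Baguley describes can be sketched as a small pre-trained model evaluated locally, so the decision never waits on a cloud round trip. The hand-coded linear scorer below merely stands in for a real trained model; the weights, threshold, and action labels are illustrative assumptions, not anything from VMware.

```python
# Hedged sketch of inference at the edge: a tiny "model" (a hand-coded linear
# scorer standing in for real trained parameters) runs on-device, so an action
# is chosen locally in microseconds rather than after a cloud round trip.
# Weights, threshold, and labels are illustrative assumptions.

WEIGHTS = [0.8, -0.5, 0.3]   # stand-in for trained model parameters
BIAS = -0.2
THRESHOLD = 0.5

def infer_on_device(features):
    """Score a feature vector locally and return an immediate decision."""
    score = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
    return "brake" if score > THRESHOLD else "continue"

# e.g. a vehicle sensor frame scored on-device: 0.83 > 0.5, so "brake"
print(infer_on_device([1.2, 0.1, 0.4]))
```

The point of the sketch is the placement, not the model: because scoring happens where the data is collected, latency is bounded by local compute rather than by network conditions.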
Widespread 5G deployment and edge computing deployments will go hand in hand, as each drives and benefits the other. With almost ten times the speed of 4G, 5G is set to unlock potential across many industries. “With its capacity and bandwidth to support billions of connected devices, new applications for sensors and connected devices will emerge that will increase the demand for edge devices that can process, analyze, and transmit data in real time,” says Yehia.
Similarly, edge computing is essential to help 5G reach its full potential by solving the latency problem. “Fast network performance is a necessity for 5G when connecting numerous devices, especially for AI applications such as smart cities or autonomous vehicles that require feedback in milliseconds,” he notes.
Gartner predicts that about 75% of enterprise-generated data will be created and processed outside of a traditional centralized data center or cloud by 2025. “As we continue to see more advanced deployments, the combination of the two technologies [edge and 5G] will be a game changer,” says Yehia.
With the decentralization of IT, the shift of workloads from the cloud to the edge exposes a larger surface to cyber threats. “For edge computing, every device can be seen as an entry point. This creates the need to protect data at the edge, with a plan that includes maintaining business and service continuity even if one or more edge sites are compromised,” warns Yehia.
Measures need to be put in place beyond the network and endpoint security that organizations can rely on from vendors. Designs, standards, processes, and best practices aimed at minimizing the risk of data loss should be incorporated from the outset, he recommends. “In addition, protecting data at the edge can lead to the creation of an independent network fabric for data protection operations, including backup, restore, archive, and snapshot. With security done well, edge computing can be more rewarding than risky.”
Edge computing can also mitigate some of the security vulnerabilities inherent in cloud infrastructures. With the public cloud, security is entrusted to the provider and organizations do not have much control over how their data is managed, as the cloud is shared with other users, Yehia notes.
“The issue of privacy and compliance with sensitive data (especially in the financial and healthcare fields) is best addressed at the edge, as organizations have more control over their data, access, and security by filtering data at the source,” he says. “In addition, because data is processed in situ with edge computing, this minimizes exposure to distributed denial-of-service (DDoS) attacks and other vulnerabilities, such as network and power outages,” adds Yehia.
USE CASES
Edge computing has the potential to transform the healthcare, retail, transportation and logistics, gaming, and surveillance industries. These sectors are increasingly adopting artificial intelligence and machine learning applications while generating enormous volumes of data through devices and sensors, necessitating real-time feedback and insight. “The growing case for bringing processing closer to the data source is especially strong in several industries because of the nature and volume of data created. Examples include industrial sensors, autonomous vehicles, augmented reality/virtual reality use cases, connected healthcare devices, intelligent logistics, real-time surveillance, etc.,” says Yehia.
While edge solutions can be leveraged to address some of the limitations of cloud computing, the debate should not be framed as edge versus cloud, but as how the two should work together. Both fall under the broader umbrella of adopting a hybrid approach that best suits business needs, Yehia says.
“The question has more to do with which IT workloads need to be placed where, and why. If we assume that all workloads are destined for the cloud, then the edge can appear to be a competitor. But it was never like that; cloud and edge deployments should be seen as complementary rather than opposing.”
The edge will not replace cloud-based applications; it will sit alongside them as a necessary complement, allowing organizations to get the most out of their applications and data, agrees VMware’s Baguley.
IT environments are increasingly decentralized, and organizations need to look ahead to identify their unique needs and develop a robust hybrid approach that includes a mix of cloud, edge, and core.