
20/09/2018

The IoT is pushing data centers to the edge

Author:
Mikael Berggren, Director of Business Development, Data Center Solutions, ICTBG, Delta Electronics (Netherlands)
 



Off in the distance, the IT world has spotted a looming iceberg of data. The rapid growth of connected devices we call the Internet of Things (IoT) has set us on a collision course. It’s time for evasive action, and the edge may be our best chance to steer clear of an impending data overload.

Today alone, the world will generate an additional 2.5 exabytes of data. That’s a lot, but it’s nothing compared to where we’re headed. Forecasts predict that, by 2025, vehicles will transfer 10 exabytes of data to and from the cloud each month. And that’s just one application among many, not to mention the ones we haven’t thought up yet.

So how can we avoid exposing data centers to the largely hidden iceberg of data that is the Internet of Things? Consider that connections to the cloud will not always be fast enough, and that even if they were, the backend capacity to deal with IoT-scale data simply isn’t there. It quickly becomes clear that, in the future, we’ll be forced to pick and choose which data actually make it all the way to the cloud. As Helder Antunes at Cisco has remarked, “In most scenarios, the presumption that everything will be in the cloud with a strong and stable fat pipe between the cloud and the edge device – that’s just not realistic.”


It all started with CDNs

If we are willing to expand the definition of edge computing a bit beyond the technologies currently making headlines, it’s really nothing new. Content delivery networks (CDNs) such as Akamai’s have been around since the 90s, for instance. A CDN is a way of distributing static content across multiple sites to get that information closer to where it is actually being used. This has proven to be a very efficient model for delivering things like streaming video. Edge computing builds on that approach. But instead of the individual nodes near the edge of the network merely storing and serving content, they also receive and process data.
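To make the distinction concrete, here is a minimal sketch in Python, with entirely hypothetical names, of an edge node that behaves like a CDN cache for content requests but also processes incoming telemetry locally before anything is sent upstream. It illustrates the general idea rather than any particular vendor’s implementation.

```python
# Minimal sketch (hypothetical names): a node that caches and serves content
# like a CDN, but also processes raw telemetry locally before forwarding a
# compact summary to the cloud.
from statistics import mean


class EdgeNode:
    def __init__(self, origin_fetch, cloud_upload):
        self.cache = {}                    # CDN-like behaviour: store and serve content
        self.origin_fetch = origin_fetch   # callable that fetches content from the origin
        self.cloud_upload = cloud_upload   # callable that sends processed data upstream

    def serve(self, url):
        """Serve cached content; fall back to the origin on a cache miss."""
        if url not in self.cache:
            self.cache[url] = self.origin_fetch(url)
        return self.cache[url]

    def ingest(self, sensor_id, readings):
        """Edge-computing behaviour: process raw readings locally and
        forward only a small summary instead of the full data stream."""
        summary = {
            "sensor": sensor_id,
            "count": len(readings),
            "avg": mean(readings),
            "max": max(readings),
        }
        self.cloud_upload(summary)
        return summary
```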

Once cloud computing arrived, placing such intelligent nodes between users and the cloud data center proved to be an important advance. This combination makes it possible to pair the heightened data protection and lower latency of on-premise solutions with the flexibility and accessibility of the cloud.


What’s changed

A wave of new applications is placing fresh demands on the way we move and process data. Autonomous vehicles, smart homes, and smart manufacturing all involve connected devices that generate useful data, and tons of it. But raw data from sensors alone won’t help you avoid a traffic jam, perform preventive maintenance on factory equipment, or get a text message that your daughter has arrived safely home. All of these things require some type of processing, be it real-time analytics, machine learning, or some other form of artificial intelligence.
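As an illustration of the kind of processing involved, the sketch below runs a simple rolling statistical check on vibration data at the edge and raises a preventive-maintenance event only when a reading looks anomalous. The window size, threshold, and event names are assumptions made for the example, not a production algorithm.

```python
# Hypothetical sketch: rolling-window anomaly check on vibration samples,
# run at the edge so that only rare maintenance events travel to the cloud.
from collections import deque

WINDOW = 100        # number of recent samples to keep (assumed)
THRESHOLD = 2.5     # flag a sample this many standard deviations above the mean (assumed)


class VibrationMonitor:
    def __init__(self):
        self.samples = deque(maxlen=WINDOW)

    def add_sample(self, value):
        """Return a maintenance event dict if the sample looks anomalous, else None."""
        self.samples.append(value)
        if len(self.samples) < WINDOW:
            return None                                  # not enough history yet
        m = sum(self.samples) / WINDOW
        var = sum((s - m) ** 2 for s in self.samples) / WINDOW
        std = var ** 0.5
        if std > 0 and (value - m) / std > THRESHOLD:
            return {"event": "maintenance_check", "value": value, "mean": m}
        return None                                      # nothing worth reporting upstream
```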

Right now, most of that processing happens in big data centers. But as demand for computing resources balloons, that is no longer a viable model. These use cases will generate more data and require faster response times than the cloud can handle.

Currently, only 10% of all data are processed outside the cloud or data center. Gartner predicts that by 2022, a whopping 50% of all data will be processed somewhere else (read: on the edge). Another development that will help define the edge is the ongoing transition to 5G mobile networks in many countries, including the US. As carriers upgrade their networks, they are deploying, or actively considering, edge micro data centers in close proximity to their new 5G base stations.


Lots of rough edges

There is a raft of new technologies inserting themselves into the space between the device and the cloud, all of them jockeying for position. You’ve got Microsoft’s Azure IoT Edge, AWS Greengrass, AWS Lambda, and the Akraino Edge Stack that Intel is getting involved with, just to name a few.


Horizontal or vertical?

Among these up-and-coming edge technologies, there are vertical solutions that cover everything from IoT devices and edge servers all the way to the cloud, and horizontal approaches that aim to add edge computing functionality to a broad range of devices, for example by running apps inside containers or on a hypervisor across the various devices within a given infrastructure.
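To illustrate the horizontal idea, here is a deliberately tiny, self-contained service of the sort that might be packaged into a container and run unchanged on many different kinds of devices. The endpoint, port, and filtering rule are all assumptions made for the example; this is not any vendor’s API.

```python
# Illustrative only: a minimal edge service using just the standard library,
# small enough to run in a container on heterogeneous hardware.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class EdgeHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        # Local processing step (assumed rule): drop readings below a noise floor
        # so only meaningful values would ever be forwarded upstream.
        readings = [r for r in payload.get("readings", []) if r > 0.05]
        body = json.dumps({"kept": len(readings)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), EdgeHandler).serve_forever()
```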


Risks and rewards

Placing storage and computing resources closer to the source of the data drastically reduces both latency and the amount of cloud bandwidth required. But what about security? The answer varies greatly across applications. Some experts believe that by limiting the amount of data travelling across the open internet, edge computing enhances security. For companies with sensitive data that must not leave the premises, this is a big benefit. On the other hand, countless additional IoT devices and a separate layer of infrastructure, including edge servers, give malicious actors a larger attack surface: each additional endpoint could potentially be compromised and provide a pathway into core networks.
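To put a rough number on the bandwidth argument, here is a back-of-the-envelope calculation with assumed figures (sample size, sampling rate, and sensor count are invented for illustration) comparing streaming raw readings to the cloud with uploading edge-side summaries.

```python
# Back-of-the-envelope sketch (all figures assumed) of how local aggregation
# at the edge shrinks the traffic that has to cross the open internet.
SAMPLE_BYTES = 64          # assumed size of one raw sensor reading
SAMPLES_PER_SEC = 100      # assumed sampling rate per sensor
SENSORS = 500              # assumed sensors behind one edge site
SUMMARY_BYTES = 256        # assumed size of one aggregated summary
SUMMARIES_PER_MIN = 1      # one summary per sensor per minute

raw_per_hour = SAMPLE_BYTES * SAMPLES_PER_SEC * SENSORS * 3600
edge_per_hour = SUMMARY_BYTES * SUMMARIES_PER_MIN * SENSORS * 60

print(f"raw to cloud : {raw_per_hour / 1e9:.1f} GB/h")
print(f"after edge   : {edge_per_hour / 1e6:.1f} MB/h")
print(f"reduction    : {raw_per_hour / edge_per_hour:.0f}x")
```

Under these assumptions, processing at the edge cuts upstream traffic by roughly three orders of magnitude, which is the kind of reduction that makes IoT-scale deployments tractable.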


Who will have the edge?

Another issue with moving infrastructure from the cloud to the edge is the question of ownership, operation, and maintenance. Companies have taken advantage of cloud computing in part because of its potential to drastically lighten their administrative and maintenance burdens. If computing infrastructure comes back home to the edge, who will own and operate it?

These questions are difficult to answer now, because there is little clarity about which technologies will prevail in the end. One thing is certain: whoever ends up in charge of the edge data center of the future, and however large or small that facility turns out to be, efficiency will be a crucial factor. And it’s not easy to beat large data centers for computing efficiency, or for reliability for that matter. For these reasons, I believe there is no one-stop solution that will work for every client’s edge computing use case. Every edge facility will need to be customized for its application. But whatever the size, tomorrow’s data center infrastructure, especially power and cooling, will matter more than ever as data gets pushed to the edge.
 
