Why edge computing hasn’t taken off – yet

Companies are leveraging the edge for its ability to bridge the gap between data and devices.
20 April 2020

Data are invaluable assets of organizations. Source: Pexels

  • The public cloud services market is expected to hit US$266.4 billion this year
  • Edge computing is flexing its capabilities as a new addition
  • Architectural inefficiency prevents the edge's full potential from being realized

“At this point, cloud adoption is mainstream,” noted Gartner research VP Sid Nag on the forecast that public cloud services revenue will grow 17 percent this year, adding close to another US$30 billion over 2019.

“Building, implementing, and maturing cloud strategies will continue to be a top priority for years to come,” Nag said. 

As cloud-based applications take off, the technology continues to advance, and edge computing is regarded as a key area of increased spend and growth in the coming years. 

If cloud computing is all about scalability – distributing resources through flexible infrastructure at multiple locations – edge computing is all about proximity, bringing compute and data storage closer to the users and devices that need them. It has emerged as a solution to one of the core challenges of cloud: last-mile latency. 

By serving as a bridge between computing resources and the users or devices that require immediate access, edge computing is designed to minimize network latency and turbocharge the overall experience. The information these devices collect doesn’t have to travel nearly as far as it would under traditional cloud architecture. 

It can also be more secure, since there is less data in transit at risk of being intercepted. 

Edge computing covers a wide scope of technologies, from remote sensor networks and distributed data storage to augmented reality – it will also be crucial to the infrastructure of autonomous vehicles and smart city technology, among a host of other use cases across industries.

Challenges pushed to the edge 

While edge computing’s main appeal is data transfer speed, the underlying infrastructure is still at a nascent stage and may prevent the edge’s full potential from being unleashed.

Edge servers are predicted to replace some elements of cloud or data center-based delivery; however, this is not expected to happen at full scale, owing to the immature network path from hosts to servers. In other words, edge technologies introduce additional complexity that requires remodeling the cloud ecosystem and segments of internet infrastructure to accommodate their role.

Architecture remains a hindrance to the mass adoption of edge computing: whether infrastructure platforms are ready to match the demands and performance of the edge is a real concern for full-scale deployment.

Experts and insiders say that acknowledging the infrastructure complexity of edge computing is itself a step forward toward its mass adoption. 

Moreover, that awareness helps prevent an ‘architectural lock-in’ that would deter data center networks from expanding and evolving to accommodate next-generation technologies. 

Even so, these architectural hurdles are not preventing tech-driven companies from adopting and experimenting with the immense capabilities of the edge. In January, retail giant Walmart laid out plans to strategically transform its 3,500 supercenters into edge computing hubs. 

The approach could see Walmart renting out its facilities and services, enabling connected services and infrastructure across the US in the years to come. “The supercenter footprint and positioning gives us a great opportunity to expand services […]” stated Walmart chief Doug McMillon. 

In short, no amount of simulation and laboratory testing can identify all the challenges of operationalizing the edge at scale – like many technologies, it will take some leaps of faith by keen adopters to uncover and address where the real pain points lie.