A new trend is emerging that could disrupt cloud computing, or radically reframe how it operates at the margin.
Welcome to edge computing.
In the cloud, processing, storage and data analysis happens in huge centralised data centres. The advantages are difficult to refute. Economies of scale lower costs, R&D spend shifts from the end user to the vendors, and many important tasks — such as keeping IT environments robust and secure — are automated to a degree that the average IT manager can only dream about.
But there is a problem. With the advent of new devices and the ever-extending mesh of the Internet of Things, processing is increasingly required at the periphery — literally at the edge.
This shift is starting to expose some of the limitations of cloud computing, which range from network latency to the speed of light.
In an article somewhat antagonistically headlined “The Cloud will eat the web”, Thomas Bittman, Vice President and Distinguished Analyst at Gartner Research, says edge computing is driven by several overlapping trends:
- Cloud computing — centralising IT for massive economies of scale and agile provisioning, volatility and growth;
- The Internet of Things (IoT) — things are becoming connected and sending reams of data;
- Machine learning — taking all of that data and improving processing and predictions;
- Augmented and Mixed Reality (along with Virtual Reality) — people can interact with other people and things both in physical and virtual worlds; and
- Digital Business and the Digital World — connections between things and people are pushing us to more and more real-time interactions and decisions.
Cloud computing made digital transformation possible — but not necessarily in the most obvious way. Instead, it made enterprise-grade computing available to entrepreneurs at peppercorn rates, and fueled a huge rush in innovation.
Edge computing, driven by the need to push services to the point of consumption in the most efficient manner, looks set to unleash its own wave of change.
Marketers need to start imagining the implications of a world where processing — and, more importantly, data — lives at the point of consumption.
Of course, not everyone agrees that edge computing and cloud computing are in conflict. Writing in Which-50 in March, The Register’s Australian editor Simon Sharwood argues that, far from replacing the cloud, edge computing might provide it with some interesting extensions.
Edge computing will save on cloud costs by eliminating the need to shift heavy content types like video and images over the Internet and back to the cloud. It also allows for the pre-processing of data, which is especially important where instant data analysis is required — often in machines that are themselves not stationary, such as predictive analytics on a jet. And, writes Sharwood, it will add to the efficiency of content replication.
“To understand why, consider that today the likes of Netflix work with companies called Content Distribution Networks (CDNs) to pre-position — to put lots of copies of video all around the world so that when a new series debuts and the world starts binge-watching, you and I only need to reach a nearby server instead of schlepping across the world on a submarine cable.”
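The pre-processing idea Sharwood describes can be made concrete with a minimal sketch. Rather than streaming every raw sensor reading across the network, an edge node aggregates locally and ships only a compact summary plus any anomalous values. All names here are illustrative assumptions, not a real edge-computing API:

```python
# Minimal sketch of edge pre-processing (illustrative names, not a real API):
# instead of sending every raw sensor reading to the cloud, an edge node
# reduces the stream to a small summary and only the unusual values.

def summarise_readings(readings, threshold):
    """Collapse raw readings into a summary worth uploading."""
    anomalies = [r for r in readings if r > threshold]
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings) if readings else 0.0,
        "anomalies": anomalies,  # only out-of-range values travel upstream
    }

# A batch of raw readings collapses to one small dict before upload.
payload = summarise_readings([20.1, 20.3, 98.6, 20.2], threshold=50.0)
```

The bandwidth saving is the whole point: an hour of per-second readings becomes a single small payload, and only the anomaly — the data that actually needs instant attention — crosses the network in full.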
Most IT managers have yet to start imagining this shift to the margins, caught up as they are in the day-to-day operations of the business. But marketers might like to ask themselves a simple question: do your customers come into your data centre to buy your products, or are they living day-to-day on the edge?
Read more in ADMA's Future of Marketing: Technology Edition