You don’t have to look very far to see pronouncements that cloud computing is giving way to the next big thing in computing: the edge. Venture capital firms proclaim that cloud computing is at its end and that computing will “move to the edge.”
Edge computing is enough of a thing that Gartner places it as entering the “Peak of Inflated Expectations”:
When Gartner puts a technology at the Peak of Inflated Expectations, you know it’s uber-hot!
Of course, there’s no doubt that we are entering an age of smart devices — physical things that will operate better or be more useful when they are supplemented with computing power. My favorite example of this trend: the Withings scale.
The Withings scale tells you your weight, and even your body fat percentage. And naturally it has an app that tracks these metrics over time. But you can also connect it to your social media accounts so it will share your numbers with the world. Sounds terrible, right? But Penn Jillette (yes, that Penn) credits the Withings scale with saving his life — and specifically the encouragement he got from his fans and friends while trying to lose weight.
The Withings scale points out an interesting aspect of IoT: it’s impossible to predict how smart devices will be used. Sure, some will be oil pumps way out in the North Sea, throwing off hundreds of gigabytes of data each day. But some will sit on your bathroom floor and transfer less than a kilobyte of data every morning. So what is the role of edge devices in the future of computing? Will IoT mean the death of the cloud?
Unfortunately, there’s not a lot of real guidance out in the industry. OK, edge computing is hot. But beyond that? Most of the pronouncements in the industry hew to what I call the “anecdote, anecdote, vapid truism, talking points” mode.
The unstated implication is that a need for real computing done at the device site is looming and will be huge, so there’s a rosy future for those who deliver computing functionality in situ.
But how true is that? How much real demand for edge computing will there be?
It seems to me that, boiled down, the anecdotes that get bruited about regarding the inevitability of edge computing stem from two things: a need for real-time processing that rules out remote computing, and network constraints that make shipping data between the device and the cloud impractical.
Absent these constraints, the best topology is a dumb-as-possible device with most of the application smarts residing in the cloud.
The reason distributed computing is always succeeded by centralization is efficiency and cost optimization. Putting computers out in the field is great. Until it isn’t. Because they’re hard to manage. To patch. To keep up and running. And when they go from being a convenience to mission-critical, all of a sudden the capabilities you get from centralized installation and management become paramount.
So, how will this edge/cloud showdown play out? What percentages of total computing power will reside in the field and how will it be delivered?
I wanted to move my understanding beyond the “anecdote, anecdote” stage and into something more structured.
Unfortunately, I couldn’t find any analytical framework addressing deployment scenarios and likely percentages of processing placement. So I decided to spend some time thinking about the relevant factors and put together an analytical framework:
There are two reasons to place computing near IoT devices: a requirement for real-time processing that precludes using remote computing, or network constraints that hinder communication between the device and remote computing, e.g., too much data to transmit conveniently or a need for low-latency application response (which may just be the real-time requirement in another guise).
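To make the framework concrete, here’s a minimal sketch in Python of how I read the two factors. The function name, labels, and quadrant assignments are mine, not taken from the chart:

```python
# A rough sketch of the two-factor placement framework described above.
# My reading of it: when both constraints apply, processing has to live at
# the edge; when neither applies, a dumb-as-possible device with
# cloud-resident smarts wins on cost and manageability; the mixed cases
# are the judgment calls.

def recommend_placement(needs_real_time: bool, network_constrained: bool) -> str:
    """Map the two placement factors onto a rough recommendation."""
    if needs_real_time and network_constrained:
        return "edge"           # clear-cut: compute next to the device
    if not needs_real_time and not network_constrained:
        return "cloud"          # clear-cut: keep the device dumb, smarts in the cloud
    return "judgment call"      # one constraint but not the other


# The bathroom scale: no real-time need, trivial data volume -> cloud
print(recommend_placement(needs_real_time=False, network_constrained=False))

# The North Sea oil pump, if it only needs batch analytics on its hundreds of
# gigabytes a day -> one constraint but not the other, so a judgment call
print(recommend_placement(needs_real_time=False, network_constrained=True))
```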
As you can see from the chart, there are two quadrants in which the application placement decision is clear-cut:
However, in two quadrants, the placement of computing is not so clear-cut:
The question is: what percentage of all IoT applications will fall into each quadrant? To get a better sense of this, I sent a query out to an emerging-tech LinkedIn group I participate in, asking for examples of IoT that required near-in, real-time computing. I got a couple of examples. One person brought up machine tools that throw off data and need on-machine computing to perform analytics for wear, efficiency, etc. I asked how those analytics are performed today and was told that someone downloads the information into a spreadsheet every couple of weeks and looks it over when they have time. Not really that urgent, then.
That was the problem. When I asked for examples of devices that had to have local computing, I got a couple of examples and then crickets.
I have to say, my intuition is that edge computing is not going to supplant centralized clouds — specifically the giant computing aggregations of Amazon, Microsoft, and Google (AMG) — except in one narrow sense that most would not consider evidence of true replacement.
My SWAG about the proportions of use case placement in the four quadrants would be:
If you have insight about these proportions, I’d love to hear about it. No anecdotes, please.
While I was preparing this post, I was contacted by Microsoft, which wanted to talk about its new Azure IoT Edge offering, a service that can migrate functionality from Azure down to a computing environment on-site. Microsoft discussed a couple of very interesting case studies: Schneider and Sandvik. The Sandvik application is partitioned between the device (a metal-cutting machine) and Azure. The device-deployed code’s purpose is to respond in real time to a reading that implies a damaging failure condition; the remainder of the application runs in Azure. In other words, this application is an example of the upper-left quadrant of the deployment model. I encourage you to watch the video; it’s really interesting.
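To illustrate the partitioning pattern, here’s a generic sketch of the kind of split Microsoft described. To be clear, this is not the Sandvik code and it doesn’t use the actual Azure IoT Edge SDK; the threshold, sensor reading, and function names are all hypothetical:

```python
# Generic sketch of an edge/cloud partitioned application: the on-device
# portion reacts immediately to a reading that implies a damaging failure
# condition, while everything else is forwarded to the cloud-resident portion.
# Illustrative only -- no real SDK calls, hypothetical names and values.

import queue
import time

VIBRATION_FAILURE_THRESHOLD = 9.0   # hypothetical value implying imminent damage

cloud_upload_queue: queue.Queue = queue.Queue()  # readings destined for cloud analytics


def read_sensor() -> dict:
    """Placeholder for a real sensor read on the cutting machine."""
    return {"timestamp": time.time(), "vibration": 1.2}


def stop_machine() -> None:
    """Placeholder for the local, real-time safety action."""
    print("EMERGENCY STOP: reading implies a damaging failure condition")


def edge_loop() -> None:
    """The only logic that must run on the device: react before damage occurs."""
    while True:
        reading = read_sensor()
        if reading["vibration"] >= VIBRATION_FAILURE_THRESHOLD:
            # No round trip to the cloud is fast enough to prevent the damage,
            # so this check has to live at the edge.
            stop_machine()
        # Everything else -- trend analytics, reporting, fleet-wide views --
        # belongs to the remainder of the application running in the cloud.
        cloud_upload_queue.put(reading)
        time.sleep(0.01)
```

The point of the split is that only the narrow, latency-critical safety check has to live on the machine; the data it throws off still flows up to the cloud-resident portion of the application.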
And that’s where the “edge will kill the cloud” mantra falls apart.
Most edge computing will end up as specialized appliances, not general computing boxes: installed locally as part of the device installation, but developed, configured, and maintained centrally by the manufacturer. I expect that most of these deployments will be delivered by the manufacturer as a hands-off device from the user’s perspective. This is what I meant when I said above that edge computing is not going to supplant AMG except in a narrow sense: there may be huge amounts of computer processing going on in edge devices, but it will be specialized, single-purpose code delivered as a black box. And where will the manufacturer run the centralized application, data aggregation, and operations management? In the cloud. So edge computing is not a panacea for most IT vendors, or for most IT organizations either, for that matter.
There’s no doubt in my mind that edge computing is going to be an amazing trend in our economy and society. Physical things getting smarter will improve our lives in many dimensions.
But let’s not kid ourselves about what IoT means for the future of the computing industry.