Edge Computing and the Death of Cloud: Nonsense


You don’t have to look very far to see pronouncements that cloud computing is giving way to the next big thing in computing: the edge. Venture Capital firms proclaim that cloud computing is at its end, and that computing will “move to the edge.”

Edge computing is such a thing that Gartner places it at the “Peak of Inflated Expectations” on its hype cycle.


When Gartner puts a technology at the Peak of Inflated Expectations, you know it’s uber-hot!

Of course, there’s no doubt that we are entering an age of smart devices — physical things that will operate better or be more useful when they are supplemented with computing power. My favorite example of this trend: the Withings scale.

 

The Withings scale — A life saver

The Withings scale tells you your weight, and even your body fat percentage. And naturally it has an app that tracks these metrics over time. But you can also connect it to your social media accounts so it will share your numbers with the world. Sounds terrible, right? But Penn Jillette (yes, that Penn) credits the Withings scale with saving his life — and specifically the encouragement he got from his fans and friends while trying to lose weight.

The Withings scale points out an interesting aspect of IoT: it’s impossible to predict how smart devices will be used. Sure, some will be oil pumps way out in the North Sea, throwing off hundreds of gigabytes of data each day. But some will sit on your bathroom floor and transfer less than a kilobyte of data every morning. So what is the role of edge devices in the future of computing? Will IoT mean the death of the cloud?

Unfortunately, there’s not a lot of real guidance out in the industry. OK, edge computing is hot. But beyond that? Most of the pronouncements in the industry hew to what I call the “anecdote, anecdote, vapid truism, talking points” mode.

An example:

  • “Well, some IoT applications can’t accept round trip latency to a cloud data center. They have to respond in real time. For instance, self-driving cars have to react in milliseconds.” (Anecdote One).
  • “There are some places that don’t have any network connectivity, so they can’t connect to a cloud data center. Like oil derricks out in the ocean.” (Anecdote Two).
  • “So, there will be lots of different use cases with differing requirements.” (Undeniable and vapid truism). Alternatively: “This move is just part of the always-happening shift between centralized and distributed computing.”
  • “So that means there will be lots of demand for our (non-cloud) gizmo.” (Vendor talking points, adjust for individual vendor).

The unstated implication is that demand for real computing at the device site is looming and will be huge, so there’s a rosy future for those who deliver computing functionality in situ.

But how true is that? How much real demand for edge computing will truly exist?

It seems to me that, boiled down, the anecdotes that get bruited about regarding the inevitability of edge computing stem from two things:

  • Edge devices that require real-time responsiveness
  • Edge applications that face network constraints, whether poor connectivity or a need for low-latency response

Absent these constraints, the best topology is a dumb-as-possible device with most of the application smarts residing in the cloud.
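
To make that concrete, here is a minimal sketch of the dumb-device topology in Python. It is an assumption-laden illustration, not anyone’s production design: I assume a standard MQTT broker as the cloud ingestion point, and the hostname, topic, and payload fields are hypothetical. The device samples a sensor, ships a tiny payload (well under a kilobyte, like the scale), and leaves every bit of application logic upstream.

    # Minimal "dumb device" sketch: sample a sensor, publish a tiny payload,
    # keep all application smarts in the cloud. Hostname and topic are
    # hypothetical; any cloud IoT ingestion service would work similarly.
    import json
    import time

    import paho.mqtt.client as mqtt

    BROKER_HOST = "broker.example-cloud.com"   # hypothetical cloud endpoint
    TOPIC = "devices/bathroom-scale/readings"

    client = mqtt.Client()                     # paho-mqtt 1.x style constructor
    client.connect(BROKER_HOST, 1883)
    client.loop_start()                        # background network thread

    # The entire device-side "application": one reading per use, well under
    # a kilobyte. Trend analysis, storage, and sharing all live in the cloud.
    reading = {"weight_kg": 81.4, "body_fat_pct": 22.1, "ts": int(time.time())}
    info = client.publish(TOPIC, json.dumps(reading), qos=1)
    info.wait_for_publish()                    # block until the broker acks

    client.loop_stop()
    client.disconnect()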

Why?

The reason distributed computing is always succeeded by centralization is efficiency and cost optimization. Putting computers out in the field is great. Until it isn’t. Because they’re hard to manage. To patch. To keep up and running. And when they go from being a convenience to mission-critical, all of a sudden the capabilities that you get from centralized installation and management become paramount.

So, how will this edge/cloud showdown play out? What percentages of total computing power will reside in the field and how will it be delivered?

I wanted to move my understanding beyond the “anecdote, anecdote” stage and into something more structured.

Unfortunately, I couldn’t find any analytical framework addressing deployment scenarios and likely percentages of processing placement. So I decided to spend some time thinking about the relevant factors and put together an analytical framework:

Edge Computing Deployment Models

There are two reasons to place computing near IoT devices: a requirement for real-time processing that precludes using remote computing, or network constraints that hinder communication between the device and remote computing, e.g., too much data to transmit conveniently, or a need for low-latency application response (which can shade into the real-time case).

As you can see from the chart, there are two quadrants in which the application placement decision is clear-cut:

  • When network constraints exist and the use case dictates real-time responsiveness, computing must be placed near the device. Autonomous vehicles are the quintessential example of this use case
  • When no network constraints exist, and the device operates in non-real time mode, it makes sense to place the application in a cloud environment with the device acting as a dumb sensor/actuator

However, in two quadrants, the placement of computing is not so clear-cut:

  • When there are network constraints, some computing must be placed at the device location, but as much computing as possible should be placed in the cloud (due to the efficiency and cost optimization factors mentioned above)
  • When there are real-time requirements but no connectivity or latency constraints, enough computing must be placed at the device location to deliver the real-time functionality — but needing some real-time capabilities is not the same as the entire application needing to operate in real time (see the Sandvik discussion below, and the sketch that follows this list).
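
To move this beyond prose, here is a small sketch of the placement logic across all four quadrants. The function and enum names are mine, and mapping the two mixed quadrants onto the “partial edge” and “partial cloud” labels used in the SWAG list below is my reading of the model, not an established taxonomy.

    # Sketch of the four-quadrant placement logic described above. Names
    # are mine; the partial-edge/partial-cloud labels anticipate the SWAG
    # list below, and that mapping is my own reading of the model.
    from enum import Enum

    class Placement(Enum):
        FULL_EDGE = "full edge"          # network-constrained and real-time
        PARTIAL_EDGE = "partial edge"    # network-constrained, not real-time
        PARTIAL_CLOUD = "partial cloud"  # real-time, no network constraints
        FULL_CLOUD = "full cloud"        # neither constraint applies

    def place_workload(network_constrained: bool, real_time: bool) -> Placement:
        """Map an IoT use case onto the deployment-model quadrants."""
        if network_constrained and real_time:
            return Placement.FULL_EDGE      # e.g., autonomous vehicles
        if network_constrained:
            return Placement.PARTIAL_EDGE   # minimum at the device, rest in cloud
        if real_time:
            return Placement.PARTIAL_CLOUD  # thin real-time loop at the device
        return Placement.FULL_CLOUD         # dumb sensor/actuator, cloud app

    # The common case, per the argument above:
    print(place_workload(network_constrained=False, real_time=False))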

The question is: what percentage of all IoT applications will reside in each quadrant? To get a better sense of this, I sent a query out to an emerging tech LinkedIn group I participate in, asking for examples of IoT that required near-in, real-time computing. I got a couple of examples. One person brought up machine tools that throw off data and need on-machine computing to perform analytics for wear, efficiency, etc. I asked how these analytics are performed today and was told that someone downloads the information into a spreadsheet every couple of weeks and looks it over when they have time. Not really that urgent, then.

That was the problem. When I asked for examples of devices that had to have local computing, I got a couple of examples and then crickets.

I have to say, my intuition is that edge computing is not going to supplant centralized clouds — specifically the giant computing aggregations of AMG (Amazon, Microsoft, and Google) — except in one narrow sense that most would not consider evidence of true replacement.

My SWAG about the proportions of use case placement in the four quadrants would be:

  • Partial Edge: 15%
  • Full Edge: 3-5%
  • Partial Cloud: 5-10%
  • Full Cloud: 72-77%

If you have insight about these proportions, I’d love to hear about it. No anecdotes, please.

While I was preparing this post, I was contacted by Microsoft, which wanted to talk about its new Azure IoT Edge offering, which can migrate functionality from Azure down to a computing environment on-site. Microsoft discussed a couple of fascinating case studies: Schneider and Sandvik. The Sandvik application is partitioned between the device (a metal cutting machine) and Azure. The device-deployed code’s purpose is to respond in real time to a reading that implies a damaging failure condition; the remainder of the application runs in Azure. In other words, this application is an example of the upper left quadrant of the deployment model. I encourage you to watch the video; it’s really interesting.
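
To picture that partition, here is a hedged sketch of the pattern. This is not Sandvik’s actual code, just the shape of the split: a millisecond-scale guard loop on the device, with telemetry batched to the cloud, where latency is irrelevant. The threshold, sensor read, and upload call are all hypothetical stand-ins.

    # Shape of an edge/cloud split like Sandvik's (hypothetical stand-ins
    # throughout, not their actual code). Only the failure check must run
    # on the device; everything else defers to the cloud.
    import queue
    import threading
    import time

    SPINDLE_LOAD_LIMIT = 0.95  # hypothetical damaging-failure threshold

    telemetry: "queue.Queue[dict]" = queue.Queue()

    def read_spindle_load() -> float:
        return 0.42  # stand-in for the real sensor read

    def emergency_stop() -> None:
        print("STOP: failure condition detected")  # stand-in for the actuator

    def cloud_uploader() -> None:
        # Batch telemetry to the cloud once a second; latency is irrelevant here.
        while True:
            time.sleep(1.0)
            batch = []
            while not telemetry.empty():
                batch.append(telemetry.get())
            # upload_to_cloud(batch)  # hypothetical cloud ingestion call

    threading.Thread(target=cloud_uploader, daemon=True).start()

    # The only logic that has to live on the device: a tight loop that
    # reacts in real time to a reading implying a damaging failure.
    while True:
        load = read_spindle_load()
        if load > SPINDLE_LOAD_LIMIT:
            emergency_stop()
        telemetry.put({"load": load, "ts": time.time()})
        time.sleep(0.001)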

Another interesting thing about the Sandvik device-deployed code? It runs on an SoC (system on a chip) with roughly the processing power of a Raspberry Pi.

And that’s where the “edge will kill the cloud” mantra falls apart.

Most edge computing will end up in specialized appliances, not general-purpose computing boxes: installed locally as part of the device installation, but developed, configured, and maintained remotely by the manufacturer. I expect that most of these deployments will be delivered by a manufacturer as a hands-off device. This is what I meant when I stated above that edge computing is not going to supplant AMG except in a narrow sense — which is to say, there may be huge amounts of computer processing going on in edge devices, but it will be specialized, single-purpose code delivered as a black box. And where will the manufacturer run the centralized application, data aggregation, and operations management? In the cloud. So edge computing is not a panacea for most IT vendors, or for most IT organizations either, for that matter.

There’s no doubt in my mind that edge computing is going to be an amazing trend in our economy and society. Physical things getting smarter will improve our lives in many dimensions.

But let’s not kid ourselves about what IoT means for the future of the computing industry.

 

4 Comments

  1. Hi Bernard,

    Absolutely spot on. The hype-cycle never ceases to amaze me. What you pointed out as a vapid truism – “This move is just part of the always-happening shift between centralized and distributed computing” – nevertheless has more than a grain of truth associated with it. However, that truth is not so much that there is a shift between the two, but that both co-exist at the same time, albeit in differing degrees of adoption. And that degree of adoption swings like a pendulum depending on where computing technology is vis-à-vis network IO technology, although there are other factors such as security, management, and control that play a role.

    I don’t have anything but anecdotes to share at this point, so I’ll lay off on general prognostications, but hope to share some thoughts on the analytical framework you came up with – the quadrant you came up with captured my initial thoughts perfectly. And as we get more exposure to real-world use cases, I’ll be sure to share the larger learning from this. The ones we have right now cannot be meaningfully extrapolated in any way.

    Thanks for putting this together. It hit a nerve.

    Madhu

  2. Tim Coote says:

    Hi Bernard
    Strongly agree. And I’ve built real IoT systems. I think, if anything, you’re underplaying the value of centralised control as it enables much better control of fast iteration while trying to understand where the value is.

    Latency/connectedness can be issues, but as you note, very rarely (security is another domain): for the most part, the latency and connectedness over the last few metres dominates the metric. In fact, I’d go as far as saying that approaches such as mesh networking aren’t good for most applications as they introduce unmeasured latency and lost connectivity.

    Much of the value of IoT comes from combining sensors/actuators, which is an integration problem. This militates against s/w complexity at the edge, and creates trade-offs for firewall security vs responsiveness (assuming devices are in separate administration/ownership domains).

    Finally, don’t underestimate the power of Pi style computers: they’re much faster and computationally more capable than the boxes we used to run MS Word on Windows 4 on (I checked).

    • Bernard Golden says:

      Hi Tim: Many thanks for your comment and your kind words. Glad to see someone with experience chiming in.

      With respect to your comment on Pi, I will take it on board. I wasn’t so much denigrating the capability of Pi-style computers as much as noting that this form factor is plenty capable of being the execution environment for many IoT edge environments, which negates the hopes of the large system vendors that think IoT will drive demand for servers.

      Thanks again for your comment.
