Upgrade Magazine

OPINIONS

Edge Computing-driven transformation of data, management and applications

It is very obvious that you need to bring intelligence closer to [IoT] devices, and for that you must have a kind of distributed intelligence and that is going to reload the opportunities for digital transformation.

By Lionel Snell
Editor, NetEvents

Big-name companies are investing in Edge Computing developments. Bruce Davie, VMware’s VP and CTO, Asia Pacific and Japan, ranked Edge Computing alongside the rise of mobile devices, cloud and AI in his list of “four super powers of technology”. According to Fujitsu CTO Joseph Reger: “It is very obvious that you need to bring intelligence closer to [IoT] devices, and for that you must have a kind of distributed intelligence and that is going to reload the opportunities for digital transformation.” Citing Gartner’s prediction that by 2022 some 75 percent of enterprise-generated data would be created and processed outside the traditional, centralised data centre or cloud, Hewlett Packard Enterprise (HPE) committed some US$4 billion to intelligent edge technologies and services by 2021.

Edging ahead

Edge Computing is a hot topic, and still on the ascendant. It is a vital part of the 5G project, but what exactly does it mean? Ask industry analysts and you might get a range of answers – as Gerry Christensen, Founder and CEO of Mind Commerce found out:

“I had a conversation recently with another analyst. I think we were about 15 minutes into the discussion before I realised that she was thinking edge compute as in a smartphone or maybe a wearable or maybe some kind of customer prem device, and I was thinking, an extension of core cloud computing.”

We will go with Christensen on that one. As an extension of core cloud computing, Edge Computing becomes more fundamental and significant. It is about the distinction between data being processed centrally to provide information, and data being processed at the periphery to trigger action – as in automated systems and IoT. For Sreelakshmi Sarva, Head of Product Strategy at NetFoundry, it’s just a form of cloud computing: “where, instead of hosting the workload in a specific CSP’s infrastructure, you’re hosting it closer to the users… mainly because of applications which require real-time data processing or low-latency type requirements.”  

In a cloud-connected world where every device can be an endpoint, perhaps “the edge” is best seen not as a destination but as a direction: away from the centre. From that perspective, even regional data centres could be Edge Computing or, as Christensen observed: “I was in New York City recently and I saw some 5G remote radio heads and Battery Backup Units. Right next to it was a bunch of boxes. Pretty sure it was Edge Computing. Not positive but it was certainly a good location for it.”

Why now?

In these terms Edge Computing is not new: it is already used for 4G. What is different for 5G is that it becomes essential as a means to exploit 5G’s latency benefits – without having to send packets right back to the core for processing. This is critically important for a new wave of “ultra-reliable, low-latency communication” (URLLC) applications such as self-driving vehicles, smart cities, virtual reality and online gaming.

Kevin Deierling, VP Marketing, Mellanox Technologies has an interesting analogy of the eye’s retina as an edge computer: “It’s not a digital camera that’s sending individual pixels to your brain and all of the visual processing is done in the cortex of your brain. It doesn’t work that way. If it did there would be too much data going through the optical nerve to your brain. Your eye is actually doing pre-processing. It detects motion. It’s directional-specific. Up, down, left, right, certain neurons only fire if they see those directions.”

Another driver for Edge Computing is the massive rise of machine-type communications (MTC): in other words, the Internet of Things (IoT). 5G is hugely scalable to accommodate thousands of IoT devices, and the resulting traffic would quickly overload core networks – unless the bulk of the data is pre-processed at the edge and only essential signals are sent back to headquarters. As Deierling points out, when an electric car recharges it has accumulated weeks’ worth of raw driving data and, instead of uploading all that, it effectively compresses the data by only extracting and uploading key findings to the manufacturer. (According to Wards Auto, a single autonomous test vehicle produces about 30 TB per day, which is 3,000 times Twitter’s daily data.)
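The pattern Deierling describes – reduce raw data locally, upload only the key findings – can be sketched in a few lines. This is a purely illustrative example, not any vendor’s actual API: a hypothetical edge device summarises a batch of raw sensor readings before transmission.

```python
# Hypothetical sketch of edge-side pre-processing: instead of uploading
# every raw reading, the device extracts a compact summary and sends
# only that back to the core. All names here are illustrative.
from statistics import mean

def summarise_readings(readings):
    """Reduce a batch of raw readings to the key findings worth uploading."""
    if not readings:
        return None
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(mean(readings), 2),
        # Flag anomalies locally so the core only sees what matters.
        "anomalies": [r for r in readings if r > 100.0],
    }

# A large batch of raw telemetry collapses to a handful of fields.
raw = [12.1, 13.4, 11.9, 104.7, 12.3]
summary = summarise_readings(raw)
```

The point is the ratio: the upload payload stays roughly constant no matter how much raw data the device accumulates, which is exactly why core networks are not overwhelmed.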

This enhanced responsiveness is the key to real-time business decisions – something needed for driverless vehicles, but opening up exciting possibilities for automating any number of other applications. Rick Calle, Head of Business Development at Microsoft AI Research, gives the example of FarmBeats, an Israeli company that offers an early warning service to farmers about crop diseases. It sends drones to film vast areas of farmland, accumulating massive amounts of visual data that is processed at the base station for signs of diseased crops. Only those findings need be transmitted back to the farmer.

According to the Industrial Internet Consortium (IIC), almost every use case and every connected device it focuses on needs some sort of compute capability at the edge. However, the definition of Edge Computing as “cloud computing systems that perform data processing at the edge of the network, near the source of the data” gives little idea of “the immense power and remarkable capabilities that edge computing applications and architectures can provide to solve industrial internet users’ toughest challenges”. According to Verizon, 5G and Edge Computing are “making real time reality”.

Managing an even less tangible cloud

The IIC says there is an initial need to identify: where the edge is; its defining characteristics; the key drivers for implementing Edge Computing; and why compute capabilities should be deployed at the edge in Industrial Internet of Things (IIoT) systems. For Gerry Christensen there are three important questions: “Who owns the edge, who manages the edge, and who uses the edge?”

The scary answer to the first question is that we all own the edge. One interviewee, who had better remain nameless, explains the problem: “I’ve got cameras in my house, I have a music server, I plug my drone into my network, and I know a lot about security and I know all the things I should do but I don’t do. I don’t secure my home network… We can’t leave it up to people like me, even. I’m a techie, and I still don’t do it. So, it [management] needs to be automated.”

Sreelakshmi Sarva emphasises the edge’s risks for security and privacy, and the need for standard APIs (Application Programming Interfaces): “On the API and automation front, having a uniform abstraction of where your workload resides – whether it’s on-prem, datacentre or core cloud provider, or edge compute – automation is the key. Having a way to manage these workloads through an API or an automated way of deploying – not just for provisioning but for ongoing maintenance – is the key.” He goes on to say: “The lessons learned from the cloud compute paradigm can definitely be applied and leveraged for edge compute, when it comes to API and manageability and automation”. For maximum security he recommends an air-gapped environment at both network and application level – should the infrastructure be compromised for whatever reason, there should be no risk of lateral movement gaining access to the app.
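The “uniform abstraction” Sarva describes can be sketched as a common deployment interface that automation code targets regardless of where the workload lands. The sketch below is hypothetical – all class and method names are invented for illustration, not drawn from any real platform.

```python
# Illustrative sketch of a uniform workload-deployment abstraction: the
# automation layer calls one API whether the target is an edge node or
# a core cloud region. Everything here is hypothetical.
from abc import ABC, abstractmethod

class WorkloadTarget(ABC):
    """Common API for provisioning and ongoing maintenance of a workload."""

    @abstractmethod
    def deploy(self, workload: str) -> str: ...

    @abstractmethod
    def update(self, workload: str, version: str) -> str: ...

class EdgeNode(WorkloadTarget):
    def __init__(self, site: str):
        self.site = site

    def deploy(self, workload: str) -> str:
        return f"deployed {workload} at edge site {self.site}"

    def update(self, workload: str, version: str) -> str:
        return f"updated {workload} to {version} at edge site {self.site}"

class CoreCloud(WorkloadTarget):
    def deploy(self, workload: str) -> str:
        return f"deployed {workload} in core cloud"

    def update(self, workload: str, version: str) -> str:
        return f"updated {workload} to {version} in core cloud"

# Automation talks to the abstraction, not the location.
def roll_out(targets, workload):
    return [t.deploy(workload) for t in targets]

results = roll_out([EdgeNode("nyc-01"), CoreCloud()], "video-analytics")
```

This is the cloud lesson Sarva wants carried over to the edge: provisioning and maintenance scripts written once against the abstraction, not rewritten per location.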

According to Kevin Deierling: “The devices at the edge have to maintain their own keys, have key management integral to them and basically be able to authenticate everything, do secure firmware updates. All that has to be built in.” His company addresses this issue with an intelligent I/O processing unit that has the trust credentials built into the hardware. The problem with APIs, and the limits this puts on the IoT market, lies in fragmentation: “If everybody goes off and invents their own APIs – companies like ours will not be able to drive our innovation. We’re looking to folks like Microsoft to define an API. We really need these APIs to become standardised so we can sell to a broad range of IoT applications using standard APIs.”
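The requirement Deierling lists – a device holding its own key and authenticating a firmware update before applying it – can be sketched with standard-library cryptography. Real devices anchor the key in hardware, as his company does; here a plain HMAC stands in for that, purely for illustration, and the key and function names are hypothetical.

```python
# Minimal sketch of authenticated firmware updates on an edge device:
# the device verifies an HMAC over the image with its own key before
# applying it. Illustrative only; production devices use hardware
# roots of trust and asymmetric signatures.
import hmac
import hashlib

DEVICE_KEY = b"per-device-secret-provisioned-at-manufacture"  # hypothetical

def sign_firmware(image: bytes, key: bytes = DEVICE_KEY) -> str:
    """Compute the tag the update server would attach to an image."""
    return hmac.new(key, image, hashlib.sha256).hexdigest()

def verify_and_apply(image: bytes, signature: str, key: bytes = DEVICE_KEY) -> bool:
    """Apply the update only if the signature checks out."""
    expected = sign_firmware(image, key)
    # Constant-time comparison avoids leaking the tag via timing.
    return hmac.compare_digest(expected, signature)

firmware = b"v2.1 firmware blob"
sig = sign_firmware(firmware)
ok = verify_and_apply(firmware, sig)       # authentic image accepted
bad = verify_and_apply(b"tampered", sig)   # altered image rejected
```

The design point is that verification happens on the device with a key it alone holds – exactly the “built in” trust Deierling argues cannot be bolted on afterwards.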

But how can one ask for and enact a firmware update securely, asks Rick Calle. “Microsoft is in that space, we have something called IoT Hub, or Azure IoT. I won’t call it Windows on IoT, but you can think of it as the operating system that’s above the hardware layer. That manages it, if you will, but your app really is the thing and your deployment really has to manage when to update, where to put the data, how secure and do I do AI on the edge? That’s really my thing, how do I get AI models from the cloud and have them running on that Edge device? That’s another challenge, how do I deploy an update – update models and then get data back that I can then retrain on privately? So, who owns it? Usually it’s whomever deployed that system”.

Any conclusion?

For Sreelakshmi Sarva the edge must be seen more as an environment than a location: “There’s an infrastructure for wider software layers, silicon providers, an orchestration layer, and at the end of the day, the applications that define the Edge. So it’s more of an ecosystem, at least that’s what we are seeing today in the market, and I think more and more they will converge to a certain set of providers managing and owning the platform”.

Edge Computing is hot indeed, and we are only at the beginning, according to Gerry Christensen’s research for Mind Commerce: “It’s many things to many people. There are actually many hands touching it, and it’s going to be probably an order of magnitude more complex than core computing because of that. Especially when we start having applications, like virtual reality, that we can only just imagine right now.”

The full transcript of the NetEvents/Mind Commerce conference session is available at: https://www.netevents.org/wp-content/uploads/2018/06/Debate-VIII-Edge-Computing-Mind-Commerce-draft.pdf
