Processing Moves To The Edge

Definitions vary by market and by vendor, but an explosion of data requires more processing to be done locally.

Edge computing is evolving from a relatively obscure concept into an increasingly complex component of a distributed computing architecture, in which processing is being shifted toward end devices and satellite data facilities and away from the cloud.

Edge computing has gained attention in two main areas. One is the industrial IoT, where it serves as a do-it-yourself infrastructure for on-site data centers. The second involves autonomous vehicles, where there is simply not enough time to ask the cloud for solutions.

But ask two people to describe it and you are likely to get two very different answers. On one hand, it is understood well enough that it can be used in satellite IIoT data centers and in machine learning-enabled iPhones. On the other, most of those designing it can’t say what it looks like.

Much of the confusion stems from the fact that edge computing is not a technology. It’s more of a technological coping mechanism. It represents a series of efforts to deal with the exponential growth of data from billions of endpoint devices by digesting at least some of that data wherever it is created. That requires building massive compute performance into everything from sensors to smartphones, and all of this has to happen within an even tighter power budget.

“We are moving to an intelligent edge,” the president and CEO of Cadence said in a recent speech. “This is going to be a new era for semiconductors. We want data at our fingertips to be able to make decisions on the fly.”

This approach stands in stark contrast to the general consensus several years ago that simple sensors would collect data from the physical world and that data would be processed in the cloud. The original concept failed to take into account that the amount of data being collected by sensors is growing too large to move around quickly. The best solution is to pre-process that data, because most of it is useless.

“The IoT represents an exponential increase in the number of devices in the world, and the amount of data generated by these devices could swamp the data center’s ability to process it,” according to Steven Woo, distinguished inventor and vice president of enterprise solutions technology at Rambus. “It’s likely you can do aggregation, filtering and some rudimentary processing, depending on how complex your computations are.”
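To make that concrete, here is a minimal Python sketch of the kind of aggregation and filtering Woo describes, assuming a stream of temperature samples. The valid range, batch size and output fields are illustrative assumptions, not any particular product’s API.

```python
# A minimal sketch of edge-side filtering and aggregation, assuming a
# stream of temperature samples. In practice a node might batch, say,
# 60 raw samples into each summary record sent upstream.
from statistics import mean

VALID_RANGE = (-40.0, 125.0)   # assumed sensor operating range, deg C

def summarize(samples):
    """Filter out-of-range readings, then reduce a window to one record."""
    good = [s for s in samples if VALID_RANGE[0] <= s <= VALID_RANGE[1]]
    if not good:
        return None  # nothing worth sending upstream
    return {"count": len(good),
            "min": min(good),
            "max": max(good),
            "avg": round(mean(good), 2)}

# One summary record replaces an entire window of raw samples.
window = [22.1, 22.2, 999.0, 22.1, 22.3]   # 999.0 is a glitch reading
print(summarize(window))                   # {'count': 4, 'min': 22.1, ...}
```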

This is the growing responsibility of edge devices. But how the edge evolves, and how quickly, depends upon the readiness of end markets that will drive it. So while the edge began taking off last year in the IIoT, it is still on hold in the automotive space because it’s not clear at this point how quickly fully autonomous vehicles will begin to ramp up.

“If there isn’t an immediate production target, you might get away with something that’s a lot less advanced,” said Ty Garibay, CTO at ArterisIP. “You might be able to aggregate this kind of functionality into multiple smaller chips made by different companies. There will be LiDAR, radar, and possibly a sensor fusion hub, which may be an FPGA. And then you might need enough compute power for the car controller, which also may have to figure out which data to process and what to send back to the cloud. The question now is how you make it smart enough to send back the right data.”

What is the edge?
Many chipmakers and systems companies struggle with the variety of ways it is possible to shift computing to the edge. There are no demarcation lines between the many levels that may or may not be included in this distributed computing model.

“There is a lot of difference of opinion on the point of what the edge looks like,” according to Jeff Miller, product marketing manager at Mentor, a Siemens Business. “The cloud is where the really high-powered machine learning or computational resources will continue to be, but bandwidth to get it there is expensive and shared spectrum is a finite resource. So just streaming all that data to the cloud from thousands of devices without some pre-processing at the edge is not practical.”

It doesn’t help that carriers, networking providers, integrators, datacenter OEMs and cloud providers each offer their own language and explanations. All of them are competing for what might be billions of dollars in additional sales in a market described by a term that means nothing specific enough to package under a single brand name, according to Tom Hackenberg, principal analyst for embedded systems at IHSMarkit.

“Edge computing” is a common but non-specific term referring to the addition of computing resources anywhere close to the endpoint of an IT infrastructure. The definition has been narrowed colloquially to mean compute resources installed specifically to support IoT installations. “It’s a set of architectural strategies, not a product, not a technology,” Hackenberg said.

Even limiting the definition of edge to its function as the compute power for IoT installations doesn’t focus the picture much, according to Shane Rau, research vice president for computing semiconductors at IDC. “There is no one IoT. There are thousands, each in a different industry with a different level of acceptance and capability. It may not be possible to see what the edge looks like because it looks like the edge of everything.”

Still, there are benefits to getting this right. Gopal Raghavan, CEO of startup Eta Compute, said that edge computing improves both privacy and security because it keeps data local. And it improves response time by eliminating the round trip to the cloud and back.

“You want to sense, infer, and act without going to the cloud, but you also want the ability to learn on the edge,” he said, noting that the cochlea in the ear already does this today, allowing it to identify speech in a noisy environment. The same happens with the retina in the eye, which can decipher images and movement before the brain can process those images.


Fig. 1: Edge computing platform. Source: NTT

Why the edge is getting so much attention
One of the initial drivers behind the edge computing model was the industrial IoT, where a desire to see projects succeed prompted industrial organizations to try to solve both the cost-efficiency and data-deluge problems on their own.

“In the industrial space there is a need for factory automation and intelligence at the edge, and the risk is comparatively smaller because it is possible to demonstrate value in accomplishing those things,” said Anush Mohandass, vice president of marketing and business development at NetSpeed Systems. “The IIoT will lead the charge to build out IoT infrastructure for very practical reasons.”

That, in turn, led to a push to keep compute resources near the physical plants. But the benefits go much deeper than just keeping IoT devices off the Internet, according to Rambus’ Woo. More processing power means greater ability to pre-process data to eliminate repetitions of the same temperature reading, for example, or render the data feed from hundreds of sensors as a single status report.
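Both reductions Woo mentions can be sketched in a few lines. The example below is a hypothetical illustration rather than any vendor’s code: it suppresses repeated readings with a deadband (report-by-exception), and collapses hundreds of per-sensor temperatures into one status report. The sensor names, the 0.5-degree deadband and the 80-degree limit are all assumptions.

```python
# An illustrative sketch: suppress repeated readings (report-by-
# exception), and roll many per-sensor temperatures into one report.
last_sent = {}

def changed(sensor_id, reading, deadband=0.5):
    """Forward a reading upstream only if it moved past the deadband."""
    prev = last_sent.get(sensor_id)
    if prev is None or abs(reading - prev) >= deadband:
        last_sent[sensor_id] = reading
        return True
    return False

def status_report(readings):
    """Collapse a {sensor_id: temperature} map into a single record."""
    temps = list(readings.values())
    over = [sid for sid, t in readings.items() if t > 80.0]  # assumed limit
    return {"sensors": len(temps),
            "avg_temp": round(sum(temps) / len(temps), 1),
            "over_limit": over}

print(changed("tank-1", 21.1))   # True: first reading always goes out
print(changed("tank-1", 21.2))   # False: within the deadband, suppressed

readings = {f"tank-{i}": 21.0 + (i % 3) * 0.1 for i in range(1, 101)}
print(status_report(readings))   # one record instead of 100
```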

Apple’s announcement in 2017 that it would put machine learning accelerators into its top-end iPhone touched off a rush that Gartner predicts will put AI capabilities in 80% of smartphones by 2022. Those will be powerful, latency-sensitive edge devices, but they will focus on functions aimed at individual consumers – augmented reality and biometric authentication, for example – which will limit their impact in the short term, said IDC’s Rau.

The addition of ML capabilities to other consumer devices – as well as to autonomous vehicles and other smart devices – is likely to create an ecosystem on which all kinds of powerful applications can be built, using edge data centers for support, said Mohandass.

“We saw in the mainframe era that having a central brain and everything else being dumb didn’t work,” he said. “There was a lot more computing power with PCs, even if they were limited. Now, with central cloud, hyperscale datacenters have a lot more power. Clients aren’t quite a dumb terminal, but they are not too smart. We’re heading for another inflection point where the edge devices, the clients, have the capacity to have a lot more intelligence. We’re not there yet, but it’s coming.”

Until then, the focus should be on developing ways to use that deluge of data from IoT devices to accomplish things that wouldn’t be possible otherwise, said Mentor’s Miller. “The core value of the IoT is in bringing together large data sets, not so much monitoring so you know immediately when there’s a leak in tank 36 out of 1000 tanks somewhere. The value is in identifying things that are about to fail, or activating actuators in the field before a problem actually comes up.”
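A toy version of “identifying things that are about to fail” can be as simple as projecting a trend. The sketch below fits a least-squares slope to recent vibration readings and flags the device if the projected value crosses a limit within the next few samples; the window, limit and horizon are assumptions made for illustration.

```python
# A toy "about to fail" detector: fit a least-squares slope to recent
# readings and flag the device if the projected value crosses a limit
# within the next few samples.
def drifting(history, limit=1.0, horizon=10):
    """Return True if the linear trend in `history` crosses `limit`
    within `horizon` future samples."""
    n = len(history)
    xs = range(n)
    x_bar = sum(xs) / n
    y_bar = sum(history) / n
    slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, history))
             / sum((x - x_bar) ** 2 for x in xs))
    projected = history[-1] + slope * horizon
    return projected > limit

vibration = [0.40, 0.44, 0.49, 0.55, 0.62, 0.70]  # rising, still under 1.0
print(drifting(vibration))  # True: raise a maintenance ticket now
```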

Other pieces of the puzzle
Much of the edge model is based on the way the human body processes information. A person’s hand will recoil from a hot stove, for example, before signals reach the brain. The brain then can explain what just happened and avoid such situations in the future.
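In code, the reflex pattern amounts to acting on a local rule first and informing the cloud afterward. The following sketch is purely illustrative; shut_valve and queue_for_cloud are hypothetical stand-ins for a real actuator driver and uplink, and the 90-degree limit is an assumption.

```python
# The reflex in code: act on a local rule immediately, report upstream
# afterward. shut_valve() and queue_for_cloud() are hypothetical
# stand-ins for a real actuator driver and uplink queue.
import time

def shut_valve():
    print("valve closed")        # placeholder for a real actuator call

def queue_for_cloud(event):
    print("queued:", event)      # placeholder for a real uplink queue

def on_reading(temp_c, limit=90.0):
    if temp_c > limit:
        shut_valve()             # the "recoil": no cloud round trip
        queue_for_cloud({        # the "explanation" follows later
            "event": "overtemp",
            "value": temp_c,
            "ts": time.time(),
        })

on_reading(93.5)
```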

That sounds simple enough in concept, but from a chip design standpoint this is difficult to achieve. “A lot of IoT devices actually present an interesting dilemma because they don’t need a lot of memory, but what they need is a very small power signature,” said Graham Allan, product marketing manager for memory interfaces at Synopsys. “That is a particular application that is not yet well served by the DRAM industry. It remains to be seen whether or not that market will be big enough to warrant having its own product, or whether it will continue to be served by the two generations of older LPDDR technology and you just have to live with what’s there.”

In some cases, there may be a middle step, as well. In 2015, Cisco proposed the idea of Fog computing, extending the reach of cloud-based applications to the edge using boxes that combined routing and Linux-based application servers to analyze sensor data using Cisco’s IOx operating system. Fog has its own open consortium and reference architecture for what it calls a “cloud-to-Thing continuum of services,” and NIST was interested enough to put out Fog guidelines. (The IEEE Standards Association announced in October it will use the OpenFog Reference Architecture as the basis for its work on fog standards under the IEEE P1934 Standards Working Group on Fog Computing and Networking Architecture Framework.)

This also is aimed at keeping the Internet from drowning in things. Initial plans for the IoT included building IoT control centers at or near the site of IoT installations, with enough compute resources to store the data flowing from devices, provide sub-second response to devices where it was needed, and boil masses of raw data down to statistical reports that could be digested easily. These principles were traditional best practices for embedded systems installed as endpoints near the edge of an organization’s IT infrastructure, but the scale and variety of functions involved turned the decision to add computing resources at the edge into edge computing. That has evolved still further into the “intelligent edge.”

Regardless of the moniker, edge computing appears to be icing on the cake for technology providers. For one thing, it won’t cannibalize public cloud spending, which IDC predicts will rise 23% this year compared to last and grow 21.9% annually through 2021. And it can only help sales of the IoT, a market in which IDC predicts spending will rise 15% in 2018 compared to 2017, to a total of $772 billion, $239 billion of which will go to modules, sensors, infrastructure and security. IoT spending will grow 14% per year and pass the $1 trillion mark in 2020, according to IDC.

Gartner predicts semiconductor revenue will rise 7.5% to $451 billion in 2018, far above the record $411 billion in 2017. And by 2021, 51% of all devices connecting to the Internet will be IoT devices. Their chatter will rise from 2% of all global IP traffic in 2016 to 5%, according to Cisco Systems (Cisco VNI Global IP Traffic Forecast).

Humans will interact with those devices an average of 4,800 times per day in 2025, helping to drive the volume of digital data created every year up by a factor of 10, from 16.1 zettabytes in 2016 to 163 zettabytes in 2025, according to IDC’s August 2017 report, Data Age 2025.

While reports from IDC and IHSMarkit show the cloud market continuing to grow, they have trouble capturing the rise of edge computing, which may not exist as a formal market segment and remains difficult to define well enough for those designing the intelligence that will make it happen.

IHSMarkit’s most recent estimate is that there were about 32 billion IoT devices online during 2017; there will be 40 billion by 2020, 50 billion by 2022 and 72.5 billion by 2025. “The IoT exists because microcontrollers and other controllers came down in price enough to make it feasible to connect a wider range of embedded devices, but we didn’t have the infrastructure to support that,” Hackenberg said. “That is what edge computing addresses. Once a stronger infrastructure is in place, growth in the IoT explodes.”

That’s not bad for a concept that is still ill-defined. “Everyone gets very excited about the edge, but no one knows what it means,” according to Stephen Mellor, CTO of the Industrial Internet Consortium (IIC), a standards- and best-practices consortium that is heavily supported by Industrial Internet of Things providers. The group put out its own guide to IoT analytics and data issues last year. “You can do some controlled analysis and processing at the edge, but you still need the cloud for analytics on larger datasets that can help you decide on a plan of attack that you then execute closer to the edge.”


Fig. 2: Market impact of Edge, IoT growth. Source: Cisco Systems

Datacenters, Data Closets, Data Containers
Not surprisingly, there is some variability in what building blocks and configurations might work best as edge data centers. Edge data centers have to be more flexible and more focused on immediate response than traditional glass-house data centers. They also have to be able to combine many data streams into one usable base that can be acted upon quickly.

From a hardware perspective, however, the edge can be anything from a collection of servers and storage units housed under a co-location agreement in a local cloud or data processing facility, to a hyperconverged data-center infrastructure module housed in a cryogenically cooled shipping container.

The scale of some IoT installations will force some organizations to build full-scale data centers even at the edge, or use a piece of one owned by a service provider, according to Michael Howard, executive director of research and analysis for carrier networks at IHSMarkit. Some carriers are interested in accelerating the conversion of the 17,000 or so telco wiring centers in almost every community in the U.S. to offer richer IT services, including edge services. Central Office Re-architected as a Datacenter (CORD) programs have converted only a few facilities, however, and most will see more use in the conversion to 5G than in edge computing, Howard said.

Other options include the smaller, more modular and more easily scalable products that make it easier to assemble resources to fit the size and function of the devices they support, Hackenberg said. That could mean hyperconverged datacenter solutions like Cisco’s UCS, or pre-packaged 3kVA to 8kVA DCIM-compliant Micro Data Centers from Schneider Electric, HPE and others. There also are the VM-based, self-contained server/application “cloudlets” described by Mahadev Satyanarayanan of Carnegie Mellon University and the nascent Open Edge Computing consortium.

—Ed Sperling contributed to this story

Related Stories
How To Secure The Network Edge
The risk of breaches is growing, and so is the potential damage.
Cloud Computing Chips Changing
As cloud services adoption soars, datacenter chip requirements are evolving.


