IIoT Edge Is A Moving Target

The Industrial Internet Consortium defines use scenarios, not standards, for IIoT edge computing.


Edge computing happens in an industrial IoT (IIoT) system wherever it needs to happen. The business needs for an IIoT system—or one layer of that system—will determine when and where the computing happens.

This conclusion, from an introductory report by the Industrial Internet Consortium (IIC), an IoT testing organization, helps explain why no one can say consistently what edge computing should look like, even though it has been at the top of technology agendas for more than a year.

Computing can run locally on the edge boundary near the sensors and devices collecting the data, on a server on the factory premises, in the cloud, or miles away in a data center. Where the edge computing occurs, and how close it is to “the edge,” all depend on the deadlines, the amount of data, security needs and why the data is being processed.

The promise of computing on the edge is that it can reduce latency—the time it takes for a response to come back after data is sent, computed and interpreted into a command. The proximity to the source of the data, and potentially its destination, is supposed to make all sorts of use cases possible, yet edge computing isn’t for every need. For some uses it makes a lot of sense; for others, computing can be done farther from where the data is generated.

IIC’s report defines edge computing as a logical layer describing resources that make overall IT infrastructures more efficient, not a shopping list of technologies defining how to do it.

But don’t underestimate it; edge computing will be a big deal, according to analysts:

• Data generated by mobile and IoT devices is growing so fast that, by 2022, half of all enterprise data will be generated outside the cloud or data center, compared with 10% today, according to Gartner;
• By 2020 the need to process that data efficiently into profitable services will push spending on edge-computing resources to 18% of all spending on the IoT, according to IDC;
• By 2022, IoT spending will reach $1.2 trillion (IDC).

Conflicting definitions of what the edge is come from vendors showing examples using the technology they do best; many are valid, but none are definitive, said report co-authors Todd Edmunds, senior solutions architect for IoT at Cisco, and Lalit Canaran, vice president of IoT and customer innovation at SAP Labs. The report was also written by Mitch Tseng of Huawei, with contributions from other members of IIC’s Edge Computing Task Group, and edited by computer scientist Stephen Mellor, one of the pioneers of Executable UML, who is now IIC’s chief technical officer.

“The edge isn’t a physical layer you can point at,” Canaran said. “It’s a logical layer based on the use case or industry, and you can’t get too caught up in how to define the physical edge.”

The report includes a raft of use cases and variations illustrating how different placement and purchase decisions can be, depending on how the business managers signing off on spending for IIoT projects weight the value of each factor.

“There were a lot of vehement opinions about what people thought the edge was,” Edmunds said. “We’d usually end with a consensus that everyone could live with, but no one really loved. We did not define the edge by saying it has this amount of processing and that amount of storage because seeing what capabilities you need and where you should put your edge depends on the problems you’re trying to solve.”

Fig. 1: Choosing locations for edge installations is difficult because key functions cut across multiple platforms. Source: “IIC Introduction to Edge Computing in IIoT”

Unresolved technical questions aren’t the only things adding uncertainty, however, according to Jeroen Dorgelo, director of strategy for Marvell’s Storage Group.

“It’s not clear what business models will develop that will help produce the Facebook of the IIoT,” Dorgelo said. “The minute you’re enabled by all this data you can start doing predictive maintenance and all kinds of other things, but right now the projection is that by 2025 we’ll only be analyzing 1% of the data we collect and throwing the rest away.”

It’s easy to assume that any sufficiently large volume of data needing processing should end up in a data center, but most applications are fine with a response of 10 seconds or 10 minutes, so a round trip to the cloud is not always a latency disaster, according to Steven Woo, distinguished inventor and vice president of enterprise solutions technology at Rambus.

Security and privacy are a concern, especially when third parties are involved in monitoring either IoT devices or their data. And many companies don’t expect a good enough result from deep analysis of IoT processing data to go to the trouble, so they archive the data instead.

“That’s where the balance is,” Woo said. “You can do the processing at the edge or in the data center, but there is growing concern about the energy it takes to move the data all the way to the data center, especially on the chance you’ll do nothing with it.”

Fig. 2: Example of a logical architectural diagram for edge computing infrastructure. Source: “IIC Introduction to Edge Computing in IIoT”

Edge network infrastructures are still more theory than reality. But the drive to find the best distribution of effort for problems involving extraordinary volumes of data, sensitivity to latency, or both, as with autonomous vehicles, points to an approaching inflection point beyond which client hardware will carry a much heavier burden than in the past, according to Anush Mohandass, vice president of marketing and business development at NetSpeed Systems.

“Right now we’re seeing two distinct architectures for AI, which is the way to deal with all that data,” Mohandass said. “Training and analysis are being done mostly at the core, and inference more at the edge.”

Neuromorphic chips might reduce the need to move data and confine training to the cloud, but so far demonstrations from Intel Corp. and others have proven less reliable at adjusting to environmental changes than they would need to be for applications with low tolerance for failure—in autonomous vehicles or robotics, for example, he said.

“We’re not there yet, but we are getting close to a time when the client hardware has a lot more intelligence and can be a lot more useful,” Mohandass said.

Practical considerations
Most of those issues are still in the future, however.

“Right now you’re making decisions based on three variables: Latency, connectivity and cost,” Edmunds said of the workgroup’s criteria for deciding how to distribute resources among devices, the edge network and the cloud or data center.
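The three-variable trade-off Edmunds describes can be sketched as a simple placement heuristic. Everything here is illustrative, not from the IIC report: the function name, thresholds and cost figures are hypothetical stand-ins for whatever an actual deployment would use.

```python
# Hypothetical sketch of weighing latency, connectivity and cost when deciding
# where a workload should run. Thresholds and labels are illustrative only.

def choose_placement(latency_budget_ms: float,
                     reliable_uplink: bool,
                     cloud_cost_per_gb: float,
                     edge_cost_per_gb: float) -> str:
    """Pick where processing for one workload should run."""
    if latency_budget_ms < 50:
        # Near-real-time control loops can't wait for a cloud round trip.
        return "on-device or factory-floor edge"
    if not reliable_uplink:
        # Without dependable connectivity, processing has to stay local.
        return "local edge server"
    # Latency-insensitive work goes wherever processing is cheaper.
    return "cloud" if cloud_cost_per_gb <= edge_cost_per_gb else "local edge server"

print(choose_placement(10, True, 0.02, 0.10))      # tight deadline -> edge
print(choose_placement(60_000, True, 0.02, 0.10))  # daily batch -> cloud
```

The point of the sketch is that no single answer falls out: change any one of the three inputs and the placement can flip, which is consistent with the workgroup's refusal to define the edge as a fixed physical location.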

Two situations involving the same device could result in drastically different cost, location and infrastructure decisions depending on circumstances and goals, Canaran said.

Temperature sensors that only prove the HVAC is working in a single building are a much different animal than the same sensors numbering in the thousands tucked into trouble spots and potential points of failure in 35 factories spread out worldwide.

“In that case each plant is [a candidate for edge-networking resources] and you’d want to manage them all through the cloud rather than being there at just one of them,” Canaran said.

The IIC workgroup approached edge network architecture decisions as an engineering problem in which the goal is to meet varying operational goals of timing and function using greater or lesser resources and different locations for the edge network and devices.

These examples illustrate different situations and responses within versions of the same industrial environment.

  1. Smart device, low latency, cloud control. A thermocouple, for example, could provide all its own edge-computing capability if it were smart enough to shut down an overheating pump. That would qualify as edge computing, but it would require little intelligence, and connectivity only to report the shutdown after the fact.
  2. High-powered edge infrastructure, very low latency, higher capital cost. The edge-computing requirements would be much stiffer in an automated factory whose quality and performance were monitored by a system running near real-time analytics on heat, motion, vision and other sensor data from many points along the line. The risk of waiting for a response from the cloud would mean processing would have to be done on or near the factory floor, which would require the installation of new hardware that would be the factory’s edge.
  3. Cloud analytics, latency insensitive, long-term cost savings. Sensors and an edge network focused on collecting and relaying information to the cloud allow cloud-based analytics to examine details of day-to-day performance of individual machines in the factory and create a predictive-maintenance model that flags most problems before they occur.

The cloud needs a lot of data on each device but can receive it once a day or so rather than in real time. That harnesses light computing resources in the edge network infrastructure to pre-process, structure and condense data from the IIoT devices, minimizing the data sent across the network. It also limits the amount of time the cloud spends processing it.
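The condense-then-upload step described above can be sketched in a few lines. The gateway function and device names here are hypothetical; a real edge gateway would use whatever summary statistics its predictive-maintenance model actually needs.

```python
# Illustrative sketch of an edge gateway condensing a day of raw sensor
# readings into one compact summary record per device before the daily
# upload to the cloud. Device names and fields are hypothetical.

from statistics import mean

def summarize_readings(readings: dict[str, list[float]]) -> dict[str, dict]:
    """Condense raw per-device readings into count/min/mean/max summaries."""
    return {
        device: {
            "count": len(values),
            "min": min(values),
            "mean": round(mean(values), 2),
            "max": max(values),
        }
        for device, values in readings.items()
        if values  # skip devices that reported nothing
    }

raw = {
    "pump-7-temp": [71.2, 71.9, 88.4, 72.0],
    "pump-9-temp": [65.0, 65.3],
}
summary = summarize_readings(raw)
print(summary)  # one small record per device replaces the raw stream
```

Six raw readings shrink to two summary records, which is the trade the text describes: the cloud still sees each device's daily behavior, but far less data crosses the network and far less cloud time is spent processing it.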

“We didn’t end up defining edge computing,” Edmunds said. “But we did use it to define the edge from the point of view of what it had to accomplish in terms of the problems at hand in a particular use case and the continuum of fundamental capabilities that might be needed.”

If anything, the report gives permission to do what works for the situation, which is what an engineer is going to do anyway.

