Mesh Networking Grows For ICs

The approach is yet another way to scale SoCs and systems, but it also adds a new level of complexity.


Mesh networks were invented to create rich interaction among groups of almost-unrelated peers, but now they are showing up in everything from advanced chip packages to IoT networks.

The flexibility of a many-to-many peer-connection model made the mesh approach a favorite for two-dimensional network-on-a-chip topologies, to the point where they began to supplant data-bus connections during the mid-2000s. That flexibility kept mesh on designers’ short list as 2-D topologies gave way to 3-D, which scale and handle advanced packaging more efficiently, but which create increasingly complex routing problems for designers trying to deliver the best combination of computing power and low energy consumption.

Lessons learned from reaching that balance in chips designed for low-power IoT devices may be useful in designing networks of those devices connected using Bluetooth's mesh-capable extension. The technology can connect as many as 32,000 devices as peers, with enough structure to minimize both contention and power use.

Structured mesh topologies already are being applied inside advanced packages, for complex systems such as cars and industrial operations, and across many devices that need to communicate without the formalities of a defined hub-and-spoke network structure and role definitions. The idea is that chips can be developed independently, function independently, but organize dynamically and non-hierarchically to share information and work together efficiently.

“This is already being implemented on-chip,” said Ty Garibay, CTO at ArterisIP. “Now we’re working on ways to provide more automation so that customers can describe how different packaging types play together. This is another aspect of discontinuities that we’re dealing with in multiple places. If you look at 2.5D, there are power efficiency tradeoffs, and mesh approaches are an enabler. And you’ll see this more with chiplets, which will start showing up over the next two to five years for the majority of AI systems.”

After about a decade of development, advanced packaging has become a well-accepted approach to optimizing performance and power. But two challenges remain with advanced packaging. One involves cutting the development cost to the point where advanced packaging is the architecture of choice for more price-sensitive markets such as edge devices. The second is whether the time it takes to develop these packages can be cut sufficiently to give them a competitive edge.

Achieving those goals will require better tooling as well as more hardened IP.

“Rather than going to 7nm and beyond, where you have to deal with yield and cost issues, the goal is not to spin an entire SoC every time you want to go after a new market opportunity,” said Sundari Mitra, CEO of NetSpeed Systems. “The way to do that is to take core IP blocks like accelerators, and combine them with base die such as I/Os and PHYs, which can remain at bigger geometries. But the connection has to be correct so that you don’t end up with deadlocks, and you need an extensible network that allows you to do that. Whether that is purely mesh isn’t essential. You don’t necessarily have to be rigid on the topology. But you do want to connect this all together at the architectural level because verifying it is difficult.”

Packages designed for IoT devices with extremely low power requirements offer an especially complex mix of tradeoffs, often including analog, digital and RF circuits, antennas for several types of external communication, MEMS sensors, and even microcontrollers, according to Jeff Miller, a product marketing manager at Mentor, a Siemens Business.

Chipmakers targeting IoT devices with batteries that are supposed to last a decade or more have to bake the device’s primary functions into the chipset, integrated with rules or tools for communication and power saving designed to maximize battery life.

“These devices have to be highly optimized,” Miller said. “You’re tailoring sleep states to what pieces need to be on or off, eliminating communication between chips on the board in favor of on-die communication, which tends to be lower voltage. We see more people asking for customization because there’s so much variety in the applications, and they can get a 60% to 70% improvement with custom silicon rather than general-purpose.”

What is a mesh network?
The idea of mesh networks has been around for decades, but was first implemented during the early 1980s by the U.S. military. The goal was to connect individual soldiers in the field using multi-hop routing from one individual to another, rather than trying to maintain centralized access points on combat patrol.

Complexity and hardware cost blocked that effort until the development of peer-to-peer networking during the ’90s made it practical for commercial use. The idea at that time was that if a sensor in a bridge detected a failure, it could wake up other sensors, which could then be used to collectively assess the extent of the damage and send out an alert.

Simple mesh networks helped make the IoT a reality. The addition of sophisticated mesh networking to Bluetooth made it possible to connect thousands of IoT devices efficiently for the first time. It also may have helped make chip design more difficult, while also allowing devices to work together as a botnet to launch distributed denial-of-service attacks.

The July 2017 mesh networking extension to the Bluetooth Low Energy (BLE) specification expanded the scope of a mesh network without the complexity of hub-and-spoke subnetworks, with far lower requirements for physical hubs or routers, according to documentation announcing the release from Ericsson, a prominent member of the Bluetooth SIG that developed it. The specification does allow for dynamic, unstructured many-to-many connections to minimize the effort of management. But it also includes the ability to limit or create specific patterns of connections, and to create roles based on authority, physical placement, power requirements, on-board storage or network connections to help define the flow of information from rank-and-file nodes to collection points or communication gateways.

It uses a “managed flood” message relay that allows nodes to send and receive messages on their own behalf, and to pass along messages aimed at other nodes, vastly extending the reach of the mesh. It also gives network managers individual- and group-messaging options to make mass communication more efficient.
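The core of a managed flood is simple: every node re-broadcasts messages it has not seen before, a message cache stops loops, and a time-to-live counter bounds how far a message can travel. The sketch below illustrates that idea only; the class names, TTL default and cache structure are illustrative assumptions, not the Bluetooth Mesh specification.

```python
# Hypothetical sketch of a "managed flood" relay, loosely modeled on the
# behavior described above. Not the actual Bluetooth Mesh protocol.

class MeshNode:
    def __init__(self, node_id):
        self.node_id = node_id
        self.neighbors = []     # directly reachable nodes
        self.seen = set()       # (src, seq) cache prevents relay loops
        self.inbox = []         # payloads addressed to this node

    def receive(self, src, seq, dst, payload, ttl):
        if (src, seq) in self.seen:   # already handled: drop silently
            return
        self.seen.add((src, seq))
        if dst == self.node_id:       # message is for us
            self.inbox.append(payload)
        if ttl > 0:                   # relay on behalf of other nodes
            for n in self.neighbors:
                n.receive(src, seq, dst, payload, ttl - 1)

    def send(self, seq, dst, payload, ttl=5):
        self.receive(self.node_id, seq, dst, payload, ttl)

# Line topology A - B - C: A can reach C only because B relays.
a, b, c = MeshNode("A"), MeshNode("B"), MeshNode("C")
a.neighbors = [b]
b.neighbors = [a, c]
c.neighbors = [b]

a.send(seq=1, dst="C", payload="lamp on")
print(c.inbox)  # → ['lamp on']
```

Note how the `seen` cache is what makes the flood "managed": without it, A and B would relay the same message back and forth until the TTL expired.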

Security options include the use of different keys for authentication and encryption, so only nodes with permission to communicate outside the network are able to do so.

It’s also widely supported in smartphones, which allows end users to use their phones as nodes in the Bluetooth mesh to send or receive updates and to manage the network itself. Bluetooth’s chief rival for the role, Zigbee, does not have such support.

“This is a way to let you network a lot of these devices without having to give each one its own IP address, or WiFi login, using unlicensed spectrum and your own hardware,” according to Mike Demler, senior analyst for The Linley Group and Microprocessor Report.

Bluetooth Mesh is a big step forward in efficiency. The number of devices it can connect, the distance it can cover, and the comparatively low cost of building a local network from off-the-shelf networking equipment open up a whole new option for either turning up performance or turning down the power.

“This is well designed to manage a smart office building or a network of sensors or a smart lighting system, where there’s not much interactivity and you’re only getting limited data, from very-low-powered devices, for almost no money using your own equipment,” Demler said. “Compare that to having to pay a wireless carrier 50 cents a light per month.”


Fig. 1: Bluetooth mesh example. Source: Ericsson

More connections, less control
Bluetooth Mesh allows users to decide whether every node can send notices to every other node, or to restrict the flow of messages to save energy and bandwidth. It also has a feature that allows one node to “friend” another and save energy by handing tasks to a machine with fewer power restrictions. That, in turn, allows the power-sensitive node to go back to sleep without skipping out on important tasks.
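The friendship idea can be sketched in a few lines: a mains-powered "friend" buffers traffic on behalf of a battery-powered node, which wakes only long enough to poll for whatever was held while it slept. The class and method names below are illustrative; the real Bluetooth Mesh friendship procedure involves a negotiated establishment handshake and timing parameters not shown here.

```python
# Hedged sketch of the "friend" feature described above: a well-powered
# node queues messages so a low-power node (LPN) can keep its radio off.

class FriendNode:
    def __init__(self):
        self.queues = {}                     # one message buffer per LPN

    def befriend(self, lpn_id):
        self.queues[lpn_id] = []

    def store(self, lpn_id, message):
        self.queues[lpn_id].append(message)  # hold while the LPN sleeps

    def poll(self, lpn_id):
        messages, self.queues[lpn_id] = self.queues[lpn_id], []
        return messages                      # hand over and clear buffer

class LowPowerNode:
    def __init__(self, node_id, friend):
        self.node_id = node_id
        self.friend = friend
        friend.befriend(node_id)

    def wake_and_poll(self):
        # Radio on only long enough to fetch what the friend held.
        return self.friend.poll(self.node_id)

friend = FriendNode()
sensor = LowPowerNode("sensor-1", friend)
friend.store("sensor-1", "config update")    # arrives while sensor sleeps
friend.store("sensor-1", "time sync")
print(sensor.wake_and_poll())  # → ['config update', 'time sync']
print(sensor.wake_and_poll())  # → []
```

The design point is that the energy cost of listening moves entirely to the friend; the low-power node pays only for brief, scheduled polls.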

“We’re seeing quite a few access points operating on Bluetooth, especially Bluetooth Low Energy, which has taken off,” Miller said. “It’s not really about whether to talk directly to the cloud or a cell tower; it’s a local area protocol. But it’s a great alternative to WiFi.”

Bluetooth is far from the only option in play in any IoT decision, however. IoT site Postscapes lists 13 communication/transport protocols available for IoT implementations on the local area, and another seven across the wide area.

IoT stakeholders now have more than 30 connectivity options with different bandwidth, range, cost, reliability and network management characteristics, according to a November 2017 report from McKinsey & Co.

The number and variety of choices may be daunting to end users, but the need to support multiple choices is what makes things difficult for chip designers, according to Vic Kulkarni, vice president and chief strategist for the Semiconductor Business Unit at ANSYS.

“There is an insatiable demand for managing power in these devices,” Kulkarni said. “But the moment you put an antenna on a chip, whether you’re in IoT mode, autonomous driving, anything, you create quite a complex physics problem. If you’re losing power during operation there is a good chance there is a problem with your antenna and placement design, but it interacts with sensors and MEMS.”

Reducing the number of antennas and modems by using one consistent network design, frequency or topology might simplify the chipset in a device, but the industry is still a long way from being able to standardize on one or two protocols, Demler said.

It is also a long way from being able to deliver off-the-shelf versions of chipsets with the required levels of connectivity, power consumption and the idiosyncratic functional requirements that go far beyond the role of a dumb or narrow-function IoT sensor, said Kulkarni.

“We find ourselves doing multidomain simulation of the chip vs. package vs. the system, performance of the antennas, a thermal analysis, because they’re all connected and have to be examined holistically and in terms of size and weight,” Kulkarni said. “You create a complex physics problem any time you put an antenna on a chip. But customers are asking not just for power reduction. They’re looking for efficiency of power management, customization of heat signatures, systems and network calls that manage energy well, and security that protects against side channel attacks, protects against acoustic and electromagnetic attacks. The complexity keeps going up and design turns into a massive data analytics problem looking for a way to connect all the dots and still connect the chip itself to the desired end result.”


