Where and how IoT engineering teams are approaching design optimization across the various layers of the IoT.
As the number of connected devices rises, so do questions about how to optimize them for target markets, how to ensure they play nicely together, and how to bring them to market quickly and inexpensively.
IoT is a broad term that encompasses a lot of disparate pieces across devices, systems, and the networks that connect them. At the highest level are hardware and software, but within those two groupings is everything from back-end processors to analog sensors, multiple layers of software, many thousands of IP blocks, and as-yet undocumented use cases. Architectures are still evolving, communications protocols and technologies are in flux, and security is just beginning to get widespread attention from chipmakers.
Nevertheless, there are a number of developments that could have broad impact across these different areas:
1. Different packaging approaches, which could provide significantly faster assembly strategies for disparate parts.
2. Better utilization of hardware-software co-design to maximize work done per cycle, and a better understanding of what works best in hardware versus software.
3. More system-level design approaches.
Advanced packaging
For the past six years, advanced packaging approaches have been getting a second look as it becomes more difficult and expensive to create SoCs at the most advanced process nodes. Lithography problems, thermal issues stemming from an increase in dynamic power in finFETs, RC delay, and an overall increase in complexity have forced chipmakers to look for alternatives, ranging from fully-depleted SOI planar devices to embedded FPGAs.
Advanced packaging is another option. The idea is that this approach falls somewhere between the power and performance benefits normally associated with an ASIC and the programmability of an FPGA, allowing chipmakers to combine different IP blocks regardless of the process used to develop them. While that still isn’t proven as a mainstream approach, Apple’s iPhone 7 provides at least one proof point of a high-volume implementation using TSMC’s fan-out technology. In addition, most of the big processor companies are working on 2.5D implementations for servers and GPUs. On top of that, there is work underway by a number of organizations and companies to iron out the methodologies and establish standards to that end. SEMI and several IEEE groups have teamed up to create the Heterogeneous Integration Roadmap, and a number of companies, government agencies such as DARPA, and top universities have jumped in.
“Analog IP on an SoC at 7nm is a hostile place,” said Jeff Miller, product marketing manager at Mentor Graphics. “That’s not where you want to be doing analog. But you can take 130nm analog sitting on a PCB and bring it into an advanced package. So if you have image sensors, they can be on an image processing die.”
Miller noted that these kinds of decisions are particularly important for MEMS chips. “Packaging is a big challenge for MEMS,” said Miller. “But you can collect IP such as a magnetometer, put several die in a package, and it can serve a particular market need.”
There is growing support for this kind of scheme. William Chen, senior technical advisor at ASE Group, said there is a grassroots movement by engineers to embrace this approach. “This is what they’re thinking about,” Chen said. “We’ve been giving workshops in Japan, Taiwan, Europe, Malaysia, and China, and we’ve been getting lots of interest, particularly in heterogeneous integration. That’s the real power of the fan-out, where you can put two different nodes in the same package.”
Hardware-software integration
While hardware is almost always more efficient in terms of power and performance, software is much easier to adapt to specific use cases. The challenge is balancing that flexibility with power and performance. That may require rethinking the fundamental interaction between software and hardware, as well as the best ways to utilize each. For example, how many cycles does it take for software to complete an operation, and is there a better way to do it?
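To make that cycle-counting question concrete, the sketch below (in C) contrasts a bit-banged software CRC with a hypothetical hardware CRC accelerator and times each with a generic cycle counter. The register and helper names are placeholders invented for illustration, not any specific vendor’s API; the point is simply that measuring cycles per operation is how the hardware-versus-software tradeoff gets quantified.

```c
#include <stdint.h>

/* Hypothetical helpers: the cycle counter and CRC accelerator registers are
 * illustrative placeholders, not a specific device's memory map. */
extern uint32_t read_cycle_counter(void);               /* assumed HW cycle counter */
extern volatile uint32_t CRC_DATA_REG, CRC_RESULT_REG;  /* assumed CRC accelerator  */

/* Pure-software CRC-8 (polynomial 0x07): flexible, but costs many cycles per byte. */
static uint8_t crc8_sw(const uint8_t *buf, uint32_t len)
{
    uint8_t crc = 0;
    while (len--) {
        crc ^= *buf++;
        for (int bit = 0; bit < 8; bit++)
            crc = (crc & 0x80) ? (uint8_t)((crc << 1) ^ 0x07) : (uint8_t)(crc << 1);
    }
    return crc;
}

/* Same job pushed into the (hypothetical) hardware block: one store per byte. */
static uint8_t crc8_hw(const uint8_t *buf, uint32_t len)
{
    while (len--)
        CRC_DATA_REG = *buf++;
    return (uint8_t)CRC_RESULT_REG;
}

/* Answering the question in the text: how many cycles does each approach take? */
uint32_t cycles_for(uint8_t (*fn)(const uint8_t *, uint32_t),
                    const uint8_t *buf, uint32_t len)
{
    uint32_t start = read_cycle_counter();
    (void)fn(buf, len);
    return read_cycle_counter() - start;
}
```

Running cycles_for() over crc8_sw and crc8_hw on the same buffer is the kind of measurement that tells a team whether an operation belongs in hardware or software for a given target.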
“Each application must be optimized for the specific target, so flexibility is needed in all of the IoT layers to be able to target the design for the specific task at hand,” said Phil Dworsky, director of strategic alliances at Synopsys. “This means power, performance, area, cost, and there are a lot of different things that need to be able to do that. They need the flexibility in the design to be able to turn the knobs in the IP that they are using. They need to match it for the specific application so the right level of processing power, the right kind of energy harvesting, are being met, as well as the right kind of approach for implementation. All the different aspects of the design flow are impacted by that. You need to be able to customize and optimize for your specific target.”
Dworsky noted that standards have really helped in this regard, “abstracting away the hardware so that the effort can be done in software to target those specific application areas within the cloud.”
That extends to connectivity, as well, which is one of the key benefits of an increasingly connected universe of things.
“We’re not sensor guys but we are processor guys and we are system guys so we can actually focus on providing the right IP for the right task — both configurable and a slew of products that go with it,” said Nandan Nayampally, vice president of marketing of the CPU Group at ARM. “If you look at the digital part of an IoT endpoint, it’s the smallest part on die. How do you make the ‘I’ part of it — the connectivity part of it — more efficient? Also, how do we (ARM) make it more efficient? We can design end to end, which we have, but we also define boundaries where you could mix in radio front ends from somebody else or software stacks from somebody else, so we are providing whatever we can to make that more efficient.”
Nayampally believes a generation or more can be gained in terms of performance or efficiency if the memory system and the software are improved. “We find that the optimization of the software is substantially more necessary to make these things happen. This problem will go across not just the small microcontrollers, but to the intermediate devices, to the gateways, and all the way to the cloud. When talking about IoT, the focus shouldn’t always be on the endpoint. Literally, everything in between has to be optimized, and the datacenter with a dumb pipe to the end is not how you’re going to solve the trillion-device problem.”
There is a fair amount of debate these days about whether the hardware is defined by the software, or whether the hardware is still the determining factor in a design. The answer is it depends on the application, the company developing the chip, and the size of the end market for those devices.
“You see all the high-end custom hardware, especially in mobile and networking around the A-class CPUs from ARM, but with the range of applications or the different layers of IoT, maybe this is actually the point where hardware does become truly commoditized at the different levels and becomes much more of a software thing,” observed Larry Lapides, vice president of sales at Imperas Software. He noted that despite talk about hardware being abstracted away, that hasn’t happened.
“There, standards would be good,” Lapides said. “Being able to define the different places where one layer of IoT connects to the other layer, where edge connects to gateway connects to cloud, is important. Understanding the different requirements and the different optimizations at each level is also important.”
As an example of this, security needs to be implemented differently, he said, due to varying costs. “You can’t have the same level of security on an edge node as you do on a gateway as you do on a cloud because you just can’t afford it. From a cost and power perspective you need to keep the different requirements in mind as you optimize each layer.”
Synopsys’ Dworsky agrees, noting that the abstraction allows for hardware optimization. “It lets you pick the best hardware for the task, because the software can run on whatever that optimized hardware is. So if I need an ultra-power-efficient thing, as long as it implements the interface layer, the API, my software is going to run on it. That allows me to pick the best hardware for the task. That’s exactly where we are headed.”
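Dworsky’s point about software that runs on anything implementing “the interface layer, the API” can be illustrated with a minimal hardware abstraction sketch in C. The sensor_api_t type, function names, and the stub implementation below are invented for illustration; the idea is only that application code written once against the interface runs unchanged on whichever optimized hardware supplies it.

```c
#include <stdint.h>

/* Hypothetical sensor interface: any hardware that implements these entry
 * points can run the same application code. All names are illustrative. */
typedef struct {
    int  (*init)(void);
    int  (*read)(int32_t *sample);
    void (*sleep)(void);
} sensor_api_t;

/* Application code is written once against the API... */
int log_one_sample(const sensor_api_t *hw, int32_t *out)
{
    if (hw->init() != 0)
        return -1;
    int rc = hw->read(out);
    hw->sleep();   /* the ultra-power-efficient part can idle however it likes */
    return rc;
}

/* ...while each target supplies its own implementation. A stand-in version is
 * included here so the sketch compiles on its own. */
static int  stub_init(void)       { return 0; }
static int  stub_read(int32_t *s) { *s = 42; return 0; }
static void stub_sleep(void)      { }

const sensor_api_t reference_sensor = { stub_init, stub_read, stub_sleep };
```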
Adds Nayampally: “It’s not commoditized, it is optimized for purpose. So the fact that you may not agree on what’s underneath is not a problem as long as that’s the best thing that you can, as opposed to I can just replace anything.”
Tighter integration
Behind this discussion about hardware versus software is another discussion about how to optimize each, rather than worrying about which one is driving the other.
“Having been through a 20-year journey of building what I think is still the world’s most highly configurable IP, those knobs are all in there for reasons,” said Drew Wingard, CTO of Sonics. “And while I love the idea that we abstract upwards so perfectly that software doesn’t have to care what’s inside the hardware, I look at the real designs and they still have an enormous amount of custom hardware that is specific to optimizing it, and where there is new software that has to be written specifically to that piece of hardware. It does slow down designs. It does slow down progress.”
This was one of the drivers behind Sonics’ shift in focus to power optimization, Wingard said. “Clearly, power is a huge issue for IoT, and we’ve been looking at the level of optimization that asks what we can do without asking for help from the software. How far can we push it in the level of making the hardware automatically determine that if it’s idle it shuts itself down?”
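As a rough illustration of that kind of autonomous idle shutdown, here is a toy behavioral model in C of a block that gates itself off after a fixed number of idle cycles and wakes on the next request. It is not Sonics’ implementation; the IDLE_LIMIT threshold and the state structure are assumptions made purely for the sketch.

```c
#include <stdbool.h>
#include <stdint.h>

#define IDLE_LIMIT 64u  /* assumed threshold: idle cycles before shutdown */

typedef struct {
    uint32_t idle_cycles;
    bool     powered;
} block_state_t;

/* Called once per clock cycle in this toy model. The block powers down after
 * IDLE_LIMIT idle cycles and wakes on the next request, with no software
 * involvement in either direction. */
void clock_tick(block_state_t *blk, bool request_seen)
{
    if (request_seen) {
        blk->idle_cycles = 0;
        blk->powered = true;      /* wake immediately on new work */
    } else if (blk->powered && ++blk->idle_cycles >= IDLE_LIMIT) {
        blk->powered = false;     /* hardware decides to shut itself down */
    }
}
```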
The system view
While an enormous amount of existing technology is being used to design and build IoT devices, there needs to be a shift in mindset about how this technology goes together with the least amount of extra development time or expense.
“If I have a great idea for my startup, and it’s an IoT product, I need to look at it systemically,” said Frank Schirrmeister, senior group director, product management in the System & Verification Group at Cadence. “I would like an environment which optimizes across the different layers in the IoT system. Having a processor architecture with its associated development tools around it, across the different layers, is actually probably not a bad thing. I would like that because I wouldn’t have to train my team on developing for IA (Intel Architecture) on one thing, and developing for an ARM A-57 on another. I essentially want a development kit to put these things together, to optimize across the different elements.”
That includes everything from security and performance, to power management, said Rajesh Ramanujam, product marketing manager at NetSpeed Systems. “You could do everything in software in making sure you have a standard, but at the bottom line, you want to make sure you meet those metrics in terms of an IC. Ten years ago we used to look at individual pieces of fairly simple SoCs by themselves. Now we are at the point where we have to look at SoCs like a system, and we are going beyond that with multiple systems. You must take a step back and look at a system from a much bigger boundary because we can’t afford to optimize performance, power management, and security at subsystem level. It has to be taken at a system level. And we need to take leaps. It’s a long way before we can actually start doing something like that.”
That requires taking a step back and emulating the performance of all the systems together, he said. “There has to be a way to speak the same language across all the subsystems, be able to evaluate all of them concurrently as opposed to doing them piecemeal. The same thing applies to security. You could test the heck out of security in a subsystem to make sure it works perfectly fine individually, but when you put all of them together, bad stuff happens.”
This can be done in virtual prototyping or emulation, but it may require a different way of looking at the problem.
“The key is that you have different tools for different tasks,” said Dworsky. “There is a continuum of different tools used. Software programmers need to use the same tools they use in the real hardware. It has to feel to them like they are piloting real hardware. Virtual prototyping gives us a lot of that at the beginning of the design process, so the hardware people are able to define the architecture, to try different architectures. Then, the software people are using the actual hardware, and maybe even more importantly, the actual traffic that they’re going to run in the system. So there will be different profiles of users or applications, and those real profiles will be running against a real prototype of the design. Then engineers can move between the continuum pieces for different levels of accuracy, different levels of measurement, and different purposes.”
Conclusion
The Internet of Things means many things to many people, but the bottom line is that ultimately it will be a collection of devices that will need to communicate with each other and perform some localized functions. How each of these gets optimized, and for which markets, is still being worked out, but it will not be a one-size-fits-all type of solution.
If initial indications are correct, the direction appears to be more customized implementations for very specific applications, each with a different set of criteria for what makes an optimal device. If the market plays out that way, it will require new approaches, new methodologies, as well as some new technologies and tools, and some fundamental shifts in how the chip business is run. When it comes to the IoT, this is anything but business as usual, but it is still business.
—Ed Sperling contributed to this report.