Power is still important, but more data and new applications push performance to the top.
For the foreseeable future, it’s all about performance.
For the past decade or so, power and battery life have been the defining characteristics of chip design, with performance second to those. This was particularly true in smartphones and wearable devices, where time between charges was a key selling point. In fact, power-hungry processors killed the first round of smartwatches. But as more chips are designed for applications beyond smartphones and wearables, performance has returned as the primary objective.
There are several reasons for this. First, more chips are being designed to deal with a flood of data, and no single processing element can keep up with that volume without overheating a system. As a result, some of the newest designs are a mix of accelerators. That includes some type of central logic for control, whether a CPU or an MCU (the demarcation line between these two processor types has all but disappeared), and different types of memory scattered around a die or in a package. Even some software is being replaced with programmable logic because it's faster. The result is a massive improvement in data throughput and processing speed over traditional SoC designs, with far less of the resistance and capacitance that heat up wires and slow performance.
These changes are essential in AI and machine learning applications, where the whole idea is to perform multiply-accumulate (MAC) operations as quickly as possible, ideally on some type of massively parallel system. But rather than running different chips in parallel, scaling has made it possible to put multiple processors on the same die or in the same package. So while traditional scaling no longer provides the power/performance benefits it once did, it does provide enough real estate to achieve those kinds of improvements on a single die or between dies.
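As a rough sketch (not any vendor's implementation), the multiply-accumulate operation at the heart of these workloads is trivially simple; the engineering challenge is executing millions of them per cycle in parallel hardware:

```python
# Illustrative only: the multiply-accumulate (MAC) primitive behind AI/ML math.
# A neuron's output is a dot product of inputs and weights -- one MAC per weight.
# Accelerators gain their speed by running large arrays of these concurrently.

def mac(acc, a, b):
    """One multiply-accumulate step: acc += a * b."""
    return acc + a * b

def dot(inputs, weights):
    """A dot product is just a chain of MACs over paired values."""
    acc = 0.0
    for a, b in zip(inputs, weights):
        acc = mac(acc, a, b)
    return acc

# Example: 1.0*0.5 + 2.0*0.25 + 3.0*0.125 = 1.375
print(dot([1.0, 2.0, 3.0], [0.5, 0.25, 0.125]))
```

In software this loop runs one MAC at a time; an AI accelerator instead lays out a grid of MAC units so an entire dot product (or many of them) completes in a few clock cycles, which is why parallel silicon real estate matters more than raw clock speed here.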
This is apparent in some of the chips that are being rolled out in workstations and data centers. Just check out the new Mac Pro specs from Apple or the HPE ProLiant server specs. There are similar types of announcements surfacing all over the computing world, along with improvements across the memory spectrum for faster read/write times and throughput between processors and memory.
Second, companies are gearing up to plant a stake in the edge market, whatever that turns out to look like. There is too much data to send to the cloud, so rapid pre-processing of that data will be essential. Whether that happens at the point where data is first generated, or in a series of interim compute sites and server farms, remains to be seen, but processing speed will be the deciding factor in this market segment. The faster that pre-processing can be done, the more the cloud will be able to do what it does best, namely processing data and identifying patterns from more sources rather than shouldering massive compute jobs from everywhere.
Third, the tech world is starting to reach critical mass in terms of what can be done with strings of data rather than individual bits. A confluence of data mining, better AI/ML/DL algorithms, and more data from more sources has created a potential bonanza for gaining a competitive advantage across all businesses. The key now will be who can identify new market opportunities and competitive threats fastest, and much of that depends on gleaning patterns from more data. The accumulation of knowledge has been accelerating for decades, although much of it has been confined to specific market segments. It's about to take a giant leap forward over the next decade, and that jump will span multiple market segments.
This doesn't mean there won't still be an emphasis on reliability, security, area/cost, and power. Power remains an essential element in anything with a battery, and it is often the gating factor in how fast devices can run, and for how long they can run at maximum speed. Nevertheless, speed has returned as the key design metric in many designs, and it's not going away anytime soon.