Trouble Ahead For IP Industry?

Could process technology changes be signaling a disruptive change for the IP industry? The relentless drive for lower power holds the key.


Power-aware design has risen from an afterthought to a primary design constraint for some design types. Initially it was smart phones and other battery-operated devices, but it has steadily expanded into additional areas, including devices plugged into the wall and equipment connected to the grid. Some parts of the world now impose restrictions on the power a device can consume while not active, often called zombie power.

In the past, each semiconductor process node shrink produced smaller, faster devices that could run at lower voltage, meaning they consumed less power. Power reduction appeared to come for free. Then two things changed. First, designs became more active and switching densities increased, so temperatures started to rise. Second, leakage power grew, and below 45nm it became a significant issue. The inability to keep lowering voltage at the historical rate also had a significant impact on power. But at 16/14nm, with the introduction of the finFET, leakage is again taking a back seat. That has caused some people in the industry to question the decisions and directions of the past ten years, and to postulate that the future may look very different than it does today.

A Game Changer
While only a small percentage of designs are migrating to 16/14nm finFET processes, these designs, mostly destined for smart phones and other portable consumer devices, are the trendsetters for the industry. Perhaps the biggest difference between these processes and 20nm planar processes is that leakage is significantly improved.

“This year’s move to finFET production has proven that process technology has indeed provided better leakage,” says Mary Ann White, director of product marketing for the Galaxy Design Platform at Synopsys. “Sure it introduces more capacitance, but it can run at very low voltages which in itself can save a lot of dynamic power.”
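The voltage savings White describes dominate because switching power scales with the square of the supply voltage. A minimal sketch of the classic dynamic-power relation, using illustrative numbers rather than figures for any specific process:

```python
# Classic dynamic (switching) power: P = alpha * C * V^2 * f
#   alpha: activity factor, C: switched capacitance (F),
#   V: supply voltage (V), f: clock frequency (Hz).
def dynamic_power(alpha, cap_f, vdd, freq_hz):
    return alpha * cap_f * vdd**2 * freq_hz

# Even with 20% more switched capacitance (a hypothetical finFET penalty),
# dropping Vdd from 0.9 V to 0.6 V roughly halves dynamic power, because
# the V^2 term outweighs the capacitance increase.
p_planar = dynamic_power(0.15, 1.0e-9, 0.9, 1.0e9)  # ~0.12 W
p_finfet = dynamic_power(0.15, 1.2e-9, 0.6, 1.0e9)  # ~0.065 W
```

The quadratic term is why a process that tolerates very low supply voltages can give up some capacitance and still come out well ahead on dynamic power.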

The industry is still trying to work out the best ways to optimize designs with finFETs. According to Drew Wingard, chief technology officer at Sonics, Intel talks about leakage in their finFET process and says that “they can have transistors that are fast and leaky or low power but slower, but they can’t have both.” Wingard says that finFETs give you more control than in the past, but “it does not make leakage go away. It is just less of a problem for the lowest power transistors.”

Mark Milligan, vice president of marketing for Calypto, sees an important change being caused by the reduction in leakage. “Now we have a dynamic power problem and that is the purview of design engineers. This is empowering because design engineers have something they can go after. Architecture and micro-architecture decisions can have a significant impact.”

While some are looking for additional standards in the specification and optimization of power, Milligan believes the reverse is true. “Designers are intelligent problem solvers. And now that leakage has been solved, or at least greatly reduced, we can unleash the opposite of standardization, which is creativity. We can enable real design and allow designers to attack architecture and the dynamic power problem.”

Anand Iyer, director of product marketing for Calypto, adds further explanation. “Leakage is an issue when the circuit is not actually doing anything. This means that the architecture was never impacted by decisions to mitigate leakage. Risk, cost and schedule are always under consideration and they generally outweigh some of the architectural choices. Today, we think that architecture is becoming more important because it directly impacts dynamic power based on the activity of the circuit.”

Makimoto’s Wave
In the 1990s, Tsugio Makimoto, who was the chief technology officer for Sony, observed that the industry tended to move in 10-year waves, oscillating between standard parts and custom parts. The more recent turns of the wave are harder to discern because the nature of designs changed with the emergence of the IP industry. This tended to favor standard parts, and while chips often contain small amounts of custom circuitry, the majority consists of standard IP blocks.

Javier DeLaCruz, senior director for product strategy at eSilicon, puts it in slightly different words. “There are opposing forces at play. High design costs have created a trend toward devices that can serve multiple markets, but extra unused logic on these devices, in some of the use cases, creates a waste of power.”

Fig. 1: Forces that create waves between standardization and customization.

In extreme cases, that wastage of power causes thermal issues on the chip and cores have to be powered down. “Dark silicon is sometimes caused by chips that cannot meet their power budgets,” says Krishna Balachandran, product management director at Cadence. “One way to resolve that is to offload functionality from generic cores. You have accelerators that can do specific functions in a more optimal manner. The general-purpose cores are still performing command and control functions, but the special-purpose hardware engines, which are more area- and power-efficient, could be working simultaneously.”

Balachandran says that this applies to heavy compute types of operations, such as certain types of video processing. “These cores can be shut off when not being used. This is an architecture that is gaining in popularity. A lot of architectural power optimization has been a black art. It is how one semiconductor company differentiates its products from another company. The move to IoT and energy harvesting and ultra-low power is forcing the industry to take a more serious look at what else can be done.”
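The trade Balachandran describes, running a task quickly on a special-purpose engine that is then shut off, rather than slowly on a general-purpose core, can be sketched with back-of-the-envelope energy arithmetic. All of the numbers below are hypothetical:

```python
# Illustrative energy comparison: a compute-heavy task run on a
# general-purpose core vs. a dedicated accelerator that is power-gated
# when idle. All power and time figures are hypothetical.
def task_energy_mj(power_mw, seconds):
    # Energy in millijoules: mW * s = mJ.
    return power_mw * seconds

cpu_energy = task_energy_mj(power_mw=500.0, seconds=2.0)  # slow, inefficient
acc_energy = task_energy_mj(power_mw=150.0, seconds=0.5)  # fast, then gated off
savings = cpu_energy / acc_energy  # ~13x less energy for the same work
```

The point of the sketch is that energy is power multiplied by time: an engine that is both lower-power and faster wins on both factors, and power gating removes its cost when idle.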

According to Calypto’s Milligan, this architectural change coupled with a migration to finFETs may portend a bigger disruption. “There is an elephant in the room. The notions of IP and reuse may be in trouble. If you re-use IP from a previous technology and expect to resynthesize it, it may not provide good power results. When buying IP, it is unlikely to have been optimized dynamically for the use cases that the designer cares about.”

Not everyone is convinced by these arguments. “That blows my mind,” says Wingard. “If we accept at face value — which I don’t — that leakage is a solved problem, it means that a big chunk of power just disappeared, which means the requirement on the IP industry to minimize power would be reduced. This thesis implies that the requirement on them increases. This seems backward.”

A New IP Opportunity
There is at least one IP company that sees such a disruption as an opportunity. Adapt IP is creating and delivering IP at a higher level of abstraction, intended to be taken through a high-level synthesis flow. “We will see more companies taking this approach,” says Milligan. “Google is doing that as well with VP9. They are shipping C code to their partners. They also provide the option of getting RTL, but the C code allows some micro-architectural exploration. IP reuse has been essential to get to where we are, so this will be a big problem. We can’t throw the baby out with the bathwater.”

What kind of cores might the thesis hold true for? “We are beginning to have inquiries from small IP companies, especially in the video space,” Milligan says. “This is an industry driven by standards with a range of end markets. One company saw that they needed very different optimization and micro-architectures for different markets. What is required for a cell phone versus a car versus a surveillance camera are very different. There are many end markets for these products with different requirements.”

“Display controllers are an interesting example because they are fairly easy to understand, and we don’t often do much to power manage them because they run with an activity factor of nearly 100%,” says Wingard. “But this is actually not true. For example, a display has to do retrace, and during those times the controller is essentially idle. We have some examples that show that there is power that can be harvested there. But it needs fast control to take advantage of those windows.”
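Wingard's retrace example is easy to quantify. Using standard 1080p60 raster timings (a 2200x1125 total raster with a 1920x1080 active area), the horizontal and vertical blanking intervals together amount to roughly 16% of every frame:

```python
# Fraction of each frame a display controller spends in blanking (idle),
# using standard 1080p60 timings: 2200x1125 total raster, 1920x1080 active.
active_pixels = 1920 * 1080
total_pixels = 2200 * 1125
idle_fraction = 1.0 - active_pixels / total_pixels  # ~0.16

# That ~16% of each 16.7 ms frame is only harvestable if the gating
# control is fast relative to the individual blanking windows, which is
# exactly the "fast control" requirement Wingard describes.
```

So a block assumed to run at nearly 100% activity in fact has a meaningful, regular idle budget, provided the power-management hardware can react within it.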

If this model does start to gain traction, use cases will drive the way in which blocks are optimized for power. Use cases are currently the subject of investigation in an Accellera effort.

Cooperative Flow
A good design requires attention to detail at all stages of the flow. “You can have a great architecture and if you don’t implement it well, you will not meet the power budget,” points out Cadence’s Balachandran. “This means that lower-level power savings are equally important. With the migration towards using high-level synthesis, you can explore a lot of micro-architectures for a given architecture or algorithm. The decisions you make at this time, before RTL, are very important and the power savings here can be tremendous.”

Many of those high-level architectures are based on well-established patterns and the programming models that support them. “Memory is the bottleneck in many systems, or at least the transfer of data between the processor and memory,” says Balachandran. “There is research related to ways to reduce the transfer between them. How can this be minimized? How can the processor architecture be modified so that it accesses memory in more efficient ways? This involves algorithmic and architectural research, but the memory is still a problem. How do you make it faster other than integrating it into the processor? When 10nm becomes mainstream there could be a fundamental re-architecting.”

Even if activity has been optimized for a given task, the architecture of the implementation can still have a major impact on the total power and energy consumed for that task. A simple change in the memory architecture, or the decision to put a part of the solution on-chip or off-chip, will significantly change the performance and power of the solution. Thus, many of these issues have to be considered at the same time. Some call this shift left; others see a need for methods that allow diving down in the design process to investigate the impact that early decisions may have on implementation. This could provide more information to refine ideas and help in making the important decisions.
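The on-chip versus off-chip decision can be framed in energy-per-access terms. The figures below are order-of-magnitude illustrations only (actual values vary widely with process, interface, and access pattern), but they show why keeping data on-chip dominates the energy budget:

```python
# Order-of-magnitude energy-per-access figures (illustrative only; real
# values depend heavily on process, interface, and access pattern).
E_ONCHIP_SRAM_PJ = 10.0      # per access, small on-chip SRAM
E_OFFCHIP_DRAM_PJ = 1500.0   # per access, external DRAM

def memory_energy_uj(accesses, onchip_fraction):
    """Total memory energy (microjoules) for a given on-chip access fraction."""
    on = accesses * onchip_fraction * E_ONCHIP_SRAM_PJ
    off = accesses * (1.0 - onchip_fraction) * E_OFFCHIP_DRAM_PJ
    return (on + off) / 1e6  # pJ -> uJ

# Moving from 50% to 90% of accesses on-chip cuts memory energy nearly 5x,
# because the off-chip term dwarfs the on-chip term.
half_onchip = memory_energy_uj(1_000_000, 0.5)  # ~755 uJ
most_onchip = memory_energy_uj(1_000_000, 0.9)  # ~159 uJ
```

With such a lopsided cost per access, even a modest architectural change that keeps more of the working set on-chip can outweigh considerable low-level circuit optimization.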

The IP industry is probably safe for now given that only a few core types may see a change toward architectures that are highly impacted by power considerations. Most of the functions of a smart phone, namely the applications processors and the infrastructure to support them, are not tuned toward a single function. Instead, they use a suite of programs or a benchmark to dictate their optimization. But bigger changes may be further down the road when the low-hanging fruit has been picked.
