What Happened To UPF?

Did the power intent standard miss the mark, or is it quietly being adopted? It’s hard to tell.


Two years ago there was a lot of excitement, both within the industry and the standards communities, about rapid advancements that were being made around low-power design, languages and methodologies. Since then, everything has gone quiet. What happened?

At the time, it was reported that the IEEE 1801 committee was the largest active committee within the IEEE. Did it go too far, too fast? Did the technology it produced miss the mark? Is it only useful to a small segment of the industry?

The Unified Power Format (UPF) was constructed in response to industry demand, but was rushed due to competitive pressure from the Common Power Format (CPF). The 1.0 version was quickly replaced by 2.0, which contained significant changes. Then UPF and CPF started to come together. With UPF 2.1, which was released as IEEE 1801 in 2013, more stability came into the field, but the industry felt further work was needed. That led to the creation of UPF 3.0, which was released as IEEE 1801-2015, as well as the creation of two new groups: IEEE P2415 and IEEE P2416. P2415 is titled Standard for Unified Hardware Abstraction and Layer for Energy Proportional Electronic Systems, and P2416 is titled Standard for Power Modeling to Enable System Level Analysis.

Customer adoption of new technologies tends to start slowly, but typically papers begin appearing in conferences such as DAC and DVCon, showing how customers are finding value in the technology, or at the very least the problems they are having with adoption. Looking back over the past 12 months, there was one paper at each, hardly a swarm of papers that would indicate widespread public adoption. EDA vendors continue to provide tutorials and seminars on the subject, indicating that some companies may just be starting adoption or expanding it internally.

A look at the numbers
Two EDA vendors, Mentor, a Siemens Business, and Synopsys, conducted industry surveys that were published in 2016. Those provide an indication of adoption levels, but because they are based on information collected in 2015, they do not really capture adoption rates for UPF 3.0.


Fig 1: Adoption of power intent. Source: Mentor, a Siemens Business/Wilson Research Group.

Figure 1 suggests that adoption of power intent descriptions had flattened out, and that the only significant change between 2014 and 2016 was an increase in adoption for UPF 1.0. This may indicate that the larger market is just getting started with adoption.

The second survey (Figure 2), presented by Synopsys at DVCon 2017, splits out the techniques that companies are using in multiple market segments.


Fig 2: Mobile SoCs Driving Low Power Complexity Growth. Source: Synopsys – DVCon 2017

Figure 2 shows clock gating as being the most common power reduction technique in almost all market segments, and that voltage reduction techniques and the utilization of multiple threshold voltages outweigh the adoption of multiple power domains for most market segments. The first two techniques do not require the incorporation of a power intent description, potentially indicating that the inclusion of UPF throughout a flow is difficult for companies that do not have a dedicated CAD team.

The power intent formats concentrate on multiple power domains utilizing state retention, and thus it would appear that UPF should be of interest to about a quarter to a third of the industry. This matches up somewhat with the results in Figure 1.

What changed in UPF 3.0
UPF 3.0 concentrated on improving or adding capabilities in the following areas: a bottom-up implementation flow, power models, information models, and high-level power analysis. “UPF 3.0 came out in 2016 and all EDA vendors are actively working on it,” says Gabriel Chidolue, verification technologist within Mentor. “This compares to UPF 2.0 where there was very little support for it even after three years. So the rate of adoption is improving. This is in part because the methodology has solidified and the people who are hitting the issues would be looking toward their vendor to add support for the necessary feature sets.”

The aspect of UPF 3.0 that provided the most adoption pressure was the support for the bottom-up implementation. “There is the concept of terminal boundaries that allow people to do more reliable bottom-up and top-down design flows, and customers are adopting that,” says Mike Lucente, product management director at Cadence. “2.0 had the concept of successive refinement, but 3.0 improved on that and cleaned up some inconsistencies and we are seeing adoption of that.”

As a result, IP vendors can supply UPF files that can be used by system integrators. “It started to happen with 2.1,” says Chidolue, “but there were issues and 3.0 does solve it by adding semantics and encapsulation mechanisms.”

Lucente explains the concept of terminal boundaries. “The last thing you want as an IP provider is to define a power architecture and then to have that mucked around with at the SoC level. The idea of being able to put up a wall around the IP was important and enables them to provide the power intent along with the IP. When they hand off the IP and deliver the power intent with it, they want to make sure it is used within the context that they expected.”
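The kind of power intent an IP provider might hand off can be sketched with a few standard UPF commands. The following fragment is purely illustrative (the design element names, supply names, and control signals are hypothetical), showing a power domain with isolation and retention of the sort a provider would want preserved at the SoC level:

```tcl
# Illustrative power intent for a hypothetical IP instance "u_my_ip"
create_power_domain PD_my_ip -elements {u_my_ip}

# Supply network for the domain (names are hypothetical)
create_supply_port VDD_ip
create_supply_net  VDD_ip -domain PD_my_ip
connect_supply_net VDD_ip -ports VDD_ip

# Clamp the IP's outputs to 0 while the domain is powered down
set_isolation iso_my_ip -domain PD_my_ip \
    -applies_to outputs -clamp_value 0 \
    -isolation_signal iso_en -isolation_sense high

# Retain key state across power-down so the IP can resume quickly
set_retention ret_my_ip -domain PD_my_ip \
    -save_signal {save high} -restore_signal {restore low}
```

The terminal-boundary mechanism in UPF 3.0 is intended to let the integrator treat a description like this as sealed, rather than re-deriving or overriding it at the SoC level.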

“The only areas where the methodology needed to be expanded were in managing IP and the ability to use them in different power contexts,” adds Chidolue. “Users are solving the problems today, often in the ways available to them in UPF 2.0 or 2.1, and while 3.0 actually solves the problem, it just takes a little bit of time for users to get there and consume the standard.”

High-level modeling and analysis
Another new feature area was high-level power management. “This requires a little more thought because there are other competing standards or standards that it needs to interface with,” points out Chidolue. “For RTL and the implementation side of things, the use of power models was an extension to what had already been started when CPF was brought into UPF in 2.1, and it followed on that theme. 3.0 wanted to consolidate that and make it even better.”

At the time of the release of UPF 3.0, Alan Gibbons, power architect at Synopsys, talked about the need for this. “The ability to develop energy-efficient platforms, including the hardware, software and system power management components of the platform, requires the ability to use appropriate levels of design abstraction for the task at hand. So it follows that our abstractions of IP power behavior also should support that requirement. With UPF 3.0 we can now extend that to IP power modeling and provide an efficient way to model the salient power related characteristics of a piece of IP for use at the system level.”
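UPF 3.0 packages this kind of IP power abstraction through its power model constructs. A minimal sketch, with all names hypothetical, might look like the following; the idea is that the model encapsulates the IP's power intent once, and the integrator then binds it to an instance:

```tcl
# Hypothetical UPF 3.0 power model for a reusable IP block
begin_power_model ip_pm
    create_power_domain PD_ip
    set_isolation iso_ip -domain PD_ip \
        -applies_to outputs -clamp_value 0 \
        -isolation_signal iso_en -isolation_sense high
end_power_model

# The system integrator applies the model to a specific instance
apply_power_model ip_pm -elements {u_ip_inst}
```

This is the encapsulation mechanism referred to above: the same model can be reused across designs without the integrator editing the IP's internal power description.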

Arti Dwivedi, senior product specialist at ANSYS, added that “while it is essential to perform verification of the design’s power intent, it is also critical to perform early power budgeting. Functional verification ensures that the implemented power management techniques meet the specification and the functionality of the design is not broken during implementation. Similarly, early power analysis ensures that the design power budget is met, and power debug provides guidance to achieve this goal if the initial power target is not met.”

Adoption of this appears to be a lot slower. “While other parts of the standard have had healthy adoption, one area that has not resonated with customers as much is component modeling to drive system-level modeling,” admits Lucente. “Instead, we are seeing a lot of interest in system-level analysis that focuses on getting higher accuracy and the ability to quickly make tradeoffs by driving real-world stimulus into the DUT, especially when using emulation. This allows you to run software and drive this low-power analysis flow and make tradeoffs. The accuracy is pretty close to signoff, within 10% to 15%. People are more interested in having accuracy rather than bringing it in earlier in the flow, where you could make tradeoffs but you are using data that is less accurate, which raises the question of the relevance of what you are seeing.”

IEEE 2415 and 2416
The ability to abstract and package key power-related characteristics of a piece of IP and present those characteristics in a way that is suitable for consumption by software development teams would appear to be a useful capability. It may be a case of too many cooks spoiling the broth.

The creation of three separate entities to handle different aspects of power may have slowed the progress of all three of them. The abstract for IEEE P2415 states that 1801 is focused on the voltage distribution structure in design RTL and below. It points to minimal abstraction for time and depends on other hardware-oriented standards to abstract events, scenarios and clock trees, which are required for energy proportional design, verification, modeling and management of electronic systems.

2415 aims to provide abstractions of hardware, as well as layers and interfaces in software that are not yet defined by the existing standards. In 2016 Aggios donated technology developed under a research grant from the California Energy Commission. So far there is no date for a release.

The stated goal of 2416 is to provide accurate, efficient, and interoperable power models for complex designs. It received a technology donation from Si2 in 2017 for power modeling technology that would fill several holes in the flow for estimating and controlling SoC power consumption. Again, there is no date for a release.

“We are waiting to hear from them,” says Chidolue. “We have put the necessary interfaces in place in 3.0. We have the necessary overlap such that a system model can capture the high-level specification needed and to interface that into a UPF environment. Now we need them to do the work. When they have something we will work to ensure that the interfaces put in place are still valid.”

Success or failure?
Not all aspects of UPF can be declared to have succeeded at this point, but does that mean that the standard has not been successful? “We are not seeing widespread adoption of UPF among our customers,” says Raik Brinkmann, president and CEO of OneSpin Solutions. “Maybe UPF is not the right solution to a very important and complex problem. UPF has proven to be too inflexible and demanding on the tool chain, while not keeping up with the dynamics of power requirements and solutions. Requirements on power have been extraordinary and the complexity of power management has become huge. As a result, many companies have evolved beyond UPF and created their own solutions for power-related design and verification.”

Indeed, the Mentor survey shows that 20% of customers are using internal solutions, and few of them have dropped those solutions. Brinkmann points to a strategy that many of them are using. “The design solutions usually include a combination of power mode control registers in hardware that are accessible through some bus interface from a CPU core, or in many cases a small dedicated power controller. The power control is implemented in firmware. This allows flexibility and permits changes to the system once deployed. The biggest challenge is verifying power-related system functionality with the combination of hardware and firmware. This is an emerging area for formal verification and an area of active research, with academia and OneSpin working together. Ultimately, power design and verification constitute a system-level challenge that requires a system-level solution. Among the companies we survey, UPF is an incomplete solution and more is needed.”

Mentor’s Chidolue sees it as being a little more complicated than that. “Unlike other languages, UPF permeates the entire process. It impacts verification, implementation, and it is important starting from synthesis all the way to place and route. Different tools use different aspects of UPF, but in general it permeates various tools chains and flows.”

Another recent trend is the incorporation of direct power control capabilities into the design itself. An example would be dynamic voltage and frequency scaling (DVFS), which reduces frequency and voltage to the minimum necessary to perform the task. This requires knowing the exact temperature and process skew, which can be supplied by on-chip sensors such as those made available by Moortec. It forms a closed-loop system that reduces total energy, and it would be designed and verified using traditional approaches, not UPF.
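The control policy in such a closed loop is typically simple: pick the lowest operating point that meets the workload's performance demand, and throttle when the die runs hot. The sketch below is a hypothetical illustration of that policy (the operating-point table, thermal limit, and function name are all invented for this example), not any vendor's actual firmware:

```python
# Illustrative closed-loop DVFS policy (all names and values hypothetical).
# Each control step reads a performance requirement and an on-chip
# temperature sensor, then selects a (voltage, frequency) operating point.

# (voltage in volts, frequency in MHz), ordered from slowest to fastest
OPERATING_POINTS = [(0.7, 400), (0.8, 800), (0.9, 1200), (1.0, 1600)]

THERMAL_LIMIT_C = 85.0  # throttle above this die temperature


def select_operating_point(required_mhz, die_temp_c):
    """Return the lowest (voltage, frequency) point that meets the
    workload, stepping down one point when over the thermal limit."""
    for idx, (volts, mhz) in enumerate(OPERATING_POINTS):
        if mhz >= required_mhz:
            if die_temp_c > THERMAL_LIMIT_C and idx > 0:
                return OPERATING_POINTS[idx - 1]  # thermal throttle
            return (volts, mhz)
    return OPERATING_POINTS[-1]  # demand exceeds all points: run flat out
```

Because the loop lives in ordinary RTL and firmware rather than in a power intent file, it is verified with conventional simulation and formal techniques, which is the point being made above.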

Cadence’s Lucente believes that the lack of attention is because the industry has become more comfortable with it. “It is no longer new and different. I would not mistake the quiet and associate that with not adopting. This is the opposite of what we are seeing. We see strong adoption of the new capabilities in UPF 3.0 across the customer base. We have 40 customers doing 7nm designs and 30 tapeouts at that node, and these represent the most advanced low-power designs in the world. Those customers are driving us hard to support the latest aspects of the standard.”

It may still be too early to tell. Not all tools have implemented UPF yet, and high-level design and analysis tools continue to see very slow adoption across the industry. Some of the capability put into the standard may have been a result of thinking that this situation would change, but for the past twenty years, the adoption of high-level tools has always remained just a few years into the future. This does not mean that the standard is not providing value, nor does it mean that the standard could not use improvement.

There has been talk about UPF 3.1. “A number of customers have looked at various aspects of it and they already know what they want to adopt,” says Lucente. “We have been working closely with them to implement the capabilities we expect to see in 3.1, although we have not formally announced any intentions to support it yet.”

Related Stories
New Thermal Issues Emerge
Heat is becoming a bigger problem for chips developed at new process nodes and for automotive and industrial applications.
Power Modeling And Analysis
Experts at the Table, part 3: Juggling accuracy and fidelity while making the problem solvable with finite compute resources and exciting developments for the future.
IP And Power
How can power be optimized across an entire chip when most of a chip’s content comes from third-party IP?



1 comment

Gordon Walsh says:

I’m always concerned when I hear talk of IP-level power modeling being used to drive system-level power modeling. The reason is that there are system-level effects (DFT insertion, clock tree insertion, floorplan/layout/timing closure effects, etc.) that can contribute an unpredictable amount of additional power. Does UPF 3.0 account for that, or should 10% to 15% be added to the IP-level power when bringing it up to system level?
