Power Modeling And Analysis

Experts at the Table, part 3: Juggling accuracy and fidelity, making the problem solvable with finite compute resources, and exciting developments for the future.


Semiconductor Engineering sat down to discuss power modeling and analysis with Drew Wingard, CTO at Sonics; Tobias Bjerregaard, CEO at Teklatech; Vic Kulkarni, vice president and chief strategy officer at Ansys; Andy Ladd, CEO of Baum; Jean-Marie Brunet, senior director of marketing for emulation at Mentor, a Siemens Business; and Rob Knoth, product manager at Cadence. What follows are excerpts of that conversation. Part one can be found here, and part two here.

SE: Fidelity is what is needed rather than accuracy. The important thing is that any decision goes in the right direction. But the user community appears to be infatuated with absolute numbers. How are you as an industry going to change the perception?

Ladd: It is hard. The first thing they want to know is how accurate the model is. If it is not accurate, they lose confidence in it. But a lot of the time, especially early in the design stage, absolute accuracy is not needed; you only need relative indicators of whether a change will improve things. But they still have the mindset that they want accuracy.

Knoth: It is a continuum. You can’t throw accuracy out the window, but there is a time and place for it.

Bjerregaard: And it is a continuum across abstraction levels, as well. If you have models that are used at different abstraction levels, then you can start trusting regressions and you can start to converge on things that you couldn’t converge on before. Then it makes sense. Once they have that and see the value, I believe people will have a higher confidence in having the fidelity rather than absolute accuracy at the front end. The problem is that if you have just the front end or just the back end, two different numbers, then they want absolute accuracy everywhere.

Knoth: Even at the very front end, there is some place for those absolutes. If you are doing FMEDA analysis of the design, you may need a ball-park power number. It doesn't have to be within 5%, but you can't just give a delta at that point.

Bjerregaard: In fact, when you plan the back end, you actually need the power of the design before you can determine what power grid you need, which in turn determines how big the footprint has to be, and that determines profitability. So you need to know as precisely as you can.
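To make that dependency concrete, here is a rough back-of-the-envelope sketch of how a power estimate feeds grid sizing; the numbers and the simple IR-drop budget below are illustrative assumptions, not figures from the panel:

```python
# Illustrative power-grid sizing arithmetic; every number here is an assumption.
P = 2.0              # estimated chip power in watts
VDD = 0.8            # supply voltage in volts
DROP_BUDGET = 0.02   # allowed IR drop as a fraction of VDD

current = P / VDD                    # total supply current in amps
v_drop_max = DROP_BUDGET * VDD       # allowed voltage drop in volts
r_grid_max = v_drop_max / current    # maximum tolerable grid resistance in ohms

print(f"Supply current: {current:.2f} A")
print(f"Allowed IR drop: {v_drop_max * 1000:.0f} mV")
print(f"Max grid resistance: {r_grid_max * 1000:.1f} mOhm")
```

A higher power estimate means more current, a tighter resistance target, and therefore a denser grid and larger footprint, which is why the estimate has to be trustworthy before the floorplan is committed.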

Brunet: In this industry, you cannot avoid the accuracy conversation. We can talk to them about fidelity, but they always relate back to real silicon. RTL must be within 10% to 15%, and if you don’t provide models that match that, then they will not listen.

Bjerregaard: And we should target that. We should not throw it out of the window and say you are wrong, you don’t need accuracy.

Kulkarni: Nobody will believe you without proof points. If you slice and dice the components of accuracy or inaccuracy, we found that the biggest culprit is the clock. That creates a lot of problems unless it is modeled properly. Predicting what synthesis will do in the combinational logic is difficult. The third one is the nets themselves – just the wire lengths. Calibration methodologies can help, where you create a reference design first. Then, for that class of design, we can extract the base models and tie those down to the decisions made up in the RTL world. That means the band of accuracy becomes predictable.
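As a rough illustration of that calibration idea, the sketch below assumes per-component energy factors for the clock, combinational logic and wires have already been fit against a reference design and are then applied to RTL-level toggle counts; all names and numbers are hypothetical:

```python
# Calibration-based RTL power estimate; factors and activity values are hypothetical.
# The per-component factors would be fit once against a signed-off reference design,
# then reused for RTL-stage estimates of other designs in the same class.

# Energy per toggle (joules), extracted from the reference design
CAL_J_PER_TOGGLE = {"clock": 1.2e-12, "combinational": 0.4e-12, "wires": 0.6e-12}

# Toggles per clock cycle reported by RTL simulation of the new design
toggles_per_cycle = {"clock": 2000, "combinational": 800, "wires": 1000}

F_CLK = 500e6  # clock frequency in Hz

energy_per_cycle = sum(CAL_J_PER_TOGGLE[k] * toggles_per_cycle[k] for k in CAL_J_PER_TOGGLE)
power_w = energy_per_cycle * F_CLK
print(f"Estimated average power: {power_w:.2f} W")
```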

Wingard: Convergence is key. As designers, we can handle inaccuracy as long as we believe we are converging. Early in the design, what we want is an envelope. The architect makes their choices based on an envelope of outcomes, and that has some inaccuracies associated with it. Do they wish they could find the best solution? Of course they do, but they never manage to. There will always be another project, and they just need to get within that envelope that is within their guidelines. If the results are less accurate, then they possibly have to work a little harder than if the results were more accurate, but if you have established confidence that you are converging, then they can deal with it. Architects are good at dealing with uncertainty. It is part of their job description.

Bjerregaard: It is about diminishing returns, and you get as far as you can and tape out.

Knoth: This is part of the shift left that has to happen. The architects are comfortable with that uncertainty. Yesterday, power was a back-end concern, and so that dominated the conversation. But the more we shift left, the more you will be able to operate on those relative trends.

Ladd: How much time are the architects truly spending on power versus performance? I think they spend a lot more time on performance. Power is something that is pushed off until later.

Bjerregaard: That is because historically performance was king. That is how you sold your product. But it is shifting. Power is becoming more important and becoming a competitive parameter.

Wingard: Maybe it is because of package costs.

Knoth: Package costs, reliability, lifetime – it is not that power is more important, but it is the flip-side of performance. You see people who use that as a budget to go one way or another.

Wingard: The goal has to be to get architects off the spreadsheet as a power model. We say they focus on performance, and they do, but they all have a power estimation spreadsheet.
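Those spreadsheets typically reduce to a sum of per-block dynamic (αCV²f) and leakage terms. A minimal sketch in that style is shown below; every block parameter is assumed purely for illustration:

```python
# Spreadsheet-style architectural power estimate; all block parameters are illustrative assumptions.
# Dynamic power per block: P_dyn = alpha * C * VDD^2 * f; leakage is a fixed per-block figure.

VDD = 0.8  # volts

# name: (switched capacitance in farads, clock frequency in Hz, activity factor, leakage in watts)
blocks = {
    "cpu_cluster": (2.0e-9, 1.5e9, 0.15, 0.050),
    "gpu":         (4.0e-9, 0.8e9, 0.20, 0.080),
    "memory_ctrl": (0.5e-9, 1.0e9, 0.25, 0.010),
}

total = 0.0
for name, (cap, freq, alpha, leak) in blocks.items():
    p_dyn = alpha * cap * VDD**2 * freq
    total += p_dyn + leak
    print(f"{name:12s} dynamic={p_dyn:.3f} W  leakage={leak:.3f} W")

print(f"Total estimate: {total:.3f} W")
```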

Brunet: Already within large mobile companies, getting the power analysis person to talk to the emulation person was a challenge. They do not speak the same language. There are a lot of cross-functional connections that have to take place – not only at the architecture level. We had the same challenge 20 years ago with timing-driven place-and-route. P&R people did not talk timing, but now they have to. For power it is the same thing.

SE: Power is such a huge analytical problem because it is an analog problem across the entire chip, with multiple planes of coupling. We talked about some of them, such as memory utilization, thermal, the power grid and the use cases. These are all coupled together and will affect power. Do we even have the computational ability to estimate all of that without some inaccuracy?

Wingard: We did the same thing with performance. We make our choices without simulating the whole system with the software.

Knoth: And we are getting better. Emulation capacity and the ability to tie the accuracy into C level modeling — the tools are improving. We are getting there.

Wingard: I was glad to be a part of the transition when we started to do performance modeling in ASICs. That never used to happen. Secondly, we did that by trying to abstract away the functional blocks. Performance analysis of the chip is primarily about looking at the fabric and the memory system, so all we need is transactions. That enabled us to separate concerns and get to reasonable results – good enough results to make choices. We will do the same thing for power.
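One way to read that analogy: just as transaction-level models abstract block internals for performance, a power model can charge an assumed energy cost per transaction type and integrate over a workload window. Below is a minimal sketch under those assumptions; the transaction types and energy figures are hypothetical:

```python
# Transaction-level power sketch: charge an assumed energy per transaction type
# and derive average power over a workload window. All figures are hypothetical.

ENERGY_PJ = {"dram_read": 150.0, "dram_write": 180.0, "noc_hop": 5.0, "cache_hit": 20.0}

# Transaction counts observed over a 1 ms workload window (hypothetical trace summary)
counts = {"dram_read": 2_000_000, "dram_write": 1_200_000,
          "noc_hop": 15_000_000, "cache_hit": 30_000_000}
WINDOW_S = 1e-3

energy_j = sum(ENERGY_PJ[t] * 1e-12 * n for t, n in counts.items())
print(f"Energy in window: {energy_j * 1e3:.3f} mJ, average power: {energy_j / WINDOW_S:.3f} W")
```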

SE: What do some of those abstractions look like?

Knoth: I think we touched on a couple of them. UPF 3.0 is a great example. Getting more robust support for that, getting it populated as a standard view that gets delivered.

Brunet: UPF is probably the first one that drives the software requirement a little bit, to get to a gate-level or an RTL netlist. So there is a little bit of software-driven hardware, which is a first. Not everyone is using UPF today, far from it.

Wingard: Those who are on UPF are probably not even fully on UPF 2 yet, let alone UPF 3.

Brunet: So there are some good technologies.

SE: But that is defining power control, not analysis.

Ladd: You have to start somewhere.

Wingard: I would love to take an abstraction of the model and be able to have confidence that the model can be produced at the component level with comfort, and that the system context in which it is going to be used will not render it inaccurate.

Ladd: Right, the scenario has to be realistic enough that you will trust what comes out of the tool.

Wingard: And there has to be some shmooing of behaviors around the model when producing it so that you are OK with that model being placed into a context different from the one in which it was originally analyzed.

Brunet: The industry that will force this to move very fast is automotive. You have a lot of ECUs, and all of them are plugging into the battery. The OEMs will push the industry to move very fast from a system level.

Knoth: It is not just the battery, it is lifetime and reliability.

Wingard: They want to move fast, but if they are safe they are slow. From start to deployment is still five years. We are about to go into the trough of disillusionment in automotive.

SE: What exciting developments are happening today?

Kulkarni: Machine learning (ML) is exciting. ML is now being used to predict electromigration. This allows people to make a lot of decisions about 5,000 to 10,000 violations, which are an artifact of heat and power. We are at an inflection point for the industry. We expect to see the same things for power closure and timing closure. We are entering the world of big-data analytics. About 60% of IP is re-used within a design, so prior knowledge has to be captured better.
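As a rough sketch of what ML-assisted electromigration triage can look like, the example below trains a classifier on synthetic per-net features and ranks nets by predicted violation risk. It uses scikit-learn as an assumed stand-in and is not any vendor's actual methodology:

```python
# Sketch of ML-assisted electromigration violation triage (synthetic data; not any vendor's flow).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
# Hypothetical per-net features: current density, wire width, local temperature, toggle rate
X = rng.random((n, 4))
# Synthetic label: nets with high current density and temperature tend to violate EM limits
y = ((X[:, 0] * 0.7 + X[:, 2] * 0.3 + rng.normal(0, 0.05, n)) > 0.6).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")

# Rank nets by predicted violation probability so engineers triage the riskiest ones first
risk = clf.predict_proba(X_test)[:, 1]
print("Top-5 riskiest test nets:", np.argsort(risk)[::-1][:5])
```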

Ladd: I am just excited to be in this area. EDA has a huge opportunity here and I love being in the position where I can influence something where there is a big gap. We can do a lot better.

Brunet: Power is a fantastic opportunity for emulation. You need real, live applications. You can spend cycles at the end to try and get 5%, which means you have wasted millions of dollars. It takes lots of data, and you have two platforms for that: FPGA prototyping and emulation. One doesn't have visibility, the other does. It is a good fit and will continue to drive a lot of emulation business.

Knoth: Power is a system-model problem, and that is what makes it exciting. If you look at the people sitting around the table and the areas that we cover, it is not something that one domain has to carry the banner for. It is a software problem, it is an emulation problem, it is a high-level synthesis problem, it is implementation, it is RTL analysis and it is IP. We all have a part to play in this, and that makes it fascinating. If you carry that solution all the way through the system, that is your optimal outcome.

Wingard: There is an enormous amount of idle time in almost all circuits being designed that we are not harvesting. Using distributed hardware-based control, we should be able to take advantage of that. There is a huge opportunity to reduce the energy or power of these chips very directly.
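A back-of-the-envelope view of that opportunity: if distributed hardware control can gate a block during idle intervals with only a small entry/exit overhead, the recoverable power scales with the idle fraction. All numbers below are assumed for illustration:

```python
# Back-of-the-envelope idle-time harvesting estimate; all numbers are assumed.
P_ACTIVE = 0.50        # block power when active, watts
P_IDLE_CLOCKED = 0.20  # power when idle but still clocked, watts
P_GATED = 0.02         # power when clock/power gated, watts
IDLE_FRACTION = 0.6    # fraction of time the block is idle
OVERHEAD = 0.05        # fraction of idle time lost to gating entry/exit

baseline = P_ACTIVE * (1 - IDLE_FRACTION) + P_IDLE_CLOCKED * IDLE_FRACTION
harvested = (P_ACTIVE * (1 - IDLE_FRACTION)
             + P_IDLE_CLOCKED * IDLE_FRACTION * OVERHEAD
             + P_GATED * IDLE_FRACTION * (1 - OVERHEAD))
print(f"Baseline: {baseline:.3f} W, with hardware gating: {harvested:.3f} W "
      f"({100 * (1 - harvested / baseline):.0f}% lower)")
```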

Bjerregaard: What is most exciting to me is that we are seeing this convergence of continuity across abstraction levels. That is the most important thing. In the end you have a physical product and you need that power to match what you expected at the front end. This continuity across abstraction levels is important and is exciting for the future.

Related Stories
Power Modeling And Analysis
Experts at the Table, part 2: What does a power model look like and how do you ensure software utilizes power control properly?
Power Modeling And Analysis
Experts at the Table, part 1: Are power models created early enough to be useful, and is that the best approach?
The Time Dimension Of Power
Power is a complex multi-dimensional, multi-disciplinary problem. Does your flow address all of the issues?
Transient Power Problems Rising
At 10/7nm, power management becomes much more difficult; old tricks don’t work.
Toward Real-World Power Analysis
Emulation adds new capabilities that were not possible with simulation.
Tech Talk: 7nm Power
Dealing with thermal effects, electromigration and other issues at the most advanced nodes.


