First of three parts: Higher levels of abstraction and tradeoffs in design.
By Ed Sperling
Low-Power Design sat down with Walter Ng, senior director of platform alliances at Chartered Semiconductor; Brani Buric, executive vice president of sales and marketing at Virage Logic; John Sanguinetti, CTO at Forte Design Systems; and Andrea Kroll, vice president of marketing and business development at JEDA Technologies. What follows are excerpts of that discussion.
LPD: What will the differentiator be in the future?
Sanguinetti: We have to take as input designs that are relatively high level. Our customers tell us the higher the better. We spend most of our time raising the level of abstraction. But we also have to be cognizant of what happens downstream. We don’t want to put out RTL that is junk. And it can’t just be acceptable to the tools. We want to put out RTL, for example, that can make dramatic improvements to power optimization. That’s true with any of the logic synthesis tools. But we don’t have very good visibility downstream, so we’re very reactive.
Ng: On our end, everything is about cost. Cost touches time to market and all the specs. We’re seeing much more widespread adoption of ESL tools. If you’re looking at power and selection of architectures, that can be most influenced at the system level. You can apply power optimization techniques at the gate level, but the wrong selection of an architecture can more than skew what kind of power you’re going to get. With more integration happening because of silicon’s capabilities, functions become more complex and ESL becomes more important. Getting from the algorithm level through system description, RTL and the physical implementation is a huge cost. The verification of each one of those representations is a huge cost. The more structured that approach can be, the better. Most of the leading edge of design is consumer applications, and these are extremely cost- and power-sensitive applications.
LPD: Why hasn’t ESL taken off until now?
Buric: It did. But there’s a difference between what has been used and what is commercially available. ESL has been adopted in one form or another over the last 15 years, at least. Most of those are internally developed tools targeted at a particular application. If you want to move from there to the commercial market, you have to make those tools very generic. That makes the problem more difficult to solve. But the complexity of designs is going through the roof, so you have to start using these kinds of tools everywhere. If you’re not first-time right, you can lose your business or even your company, but you certainly will lose your market opportunity. It is now a must-have to make sure everything downstream matches your intent.
Kroll: One of the problems in ESL is the models are not of sufficient quality to estimate what’s going on in the technology later on. What is the power consumption, what is the area, and what is the cost factor for the models? That’s the problem in the commercial market. You can’t make the tools broad enough and still use them for a specific task in an individual company for a specific application.
LPD: Does the number of models increase for each node?
Sanguinetti: That’s the way abstraction goes. We’ve recognized the need for a higher level of abstraction in ESL for the past 15 years, but it’s taken a long time to get to any kind of agreement about what it looks like. There are a number of approaches to it, which are largely overlapping. That’s what happened with SystemC. It took a long time before it became a standard. TLM is the same. There were a lot of ways of abstracting out interfaces. That’s why there are a lot of different models.
Kroll: They just specified TLM 2.0 last year.
Ng: As more folks come to adopt ESL and more is fleshed out, this is part of a natural maturation of the process.
Buric: As this happens, you’re also going to see the rise of a second generation of ASIC vendors: eSilicon, Open-Silicon, Global Unichip. And due to design complexity, if you don’t have a clear signoff today, there will have to be an ESL signoff between design intent and design implementation.
LPD: Can ESL handle all the power islands that are cropping up in designs?
Kroll: I don’t think so. There are people already doing power analysis with SystemC and TLM 2.0. They are switching off components and making sure the dynamic power is captured properly. There needs to be more standardization on how to do it.
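As a rough illustration of the kind of TLM-level power bookkeeping Kroll describes, the sketch below tracks per-component energy as power islands are switched on and off over simulation time. The component names, power numbers, and the PowerTracker class itself are hypothetical; in a real flow this accounting would hang off SystemC/TLM 2.0 models rather than stand-alone C++.

// Hypothetical sketch: per-component power bookkeeping of the kind that can be
// layered on a SystemC/TLM 2.0 simulation. Component names and power numbers
// are invented for illustration only.
#include <cstdio>
#include <map>
#include <string>

class PowerTracker {
public:
    // Register a component with its active and switched-off (leakage) power in mW.
    // Components must be registered before their power state is changed.
    void add_component(const std::string& name, double active_mw, double off_mw) {
        comps_[name] = {active_mw, off_mw, true, 0.0, 0.0};
    }
    // Record a power-island switch at simulation time t_ns (nanoseconds).
    void set_on(const std::string& name, bool on, double t_ns) {
        Comp& c = comps_[name];
        accumulate(c, t_ns);
        c.on = on;
    }
    // Total energy up to time t_ns, in mW*ns (i.e., picojoules).
    double energy_pj(double t_ns) {
        double total = 0.0;
        for (auto& kv : comps_) {
            accumulate(kv.second, t_ns);
            total += kv.second.energy;
        }
        return total;
    }
private:
    struct Comp { double active_mw, off_mw; bool on; double last_ns; double energy; };
    // Charge the energy consumed since the last event at the current power level.
    void accumulate(Comp& c, double t_ns) {
        double p = c.on ? c.active_mw : c.off_mw;
        c.energy += p * (t_ns - c.last_ns);
        c.last_ns = t_ns;
    }
    std::map<std::string, Comp> comps_;
};

int main() {
    PowerTracker pt;
    pt.add_component("video_decoder", 120.0, 0.5);  // hypothetical numbers
    pt.set_on("video_decoder", false, 1000.0);      // island switched off at 1 us
    std::printf("energy so far: %.1f pJ\n", pt.energy_pj(2000.0));
    return 0;
}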
Sanguinetti: ESL can do anything. You can write lower-level code in your higher-level environment. It’s a bogus complaint about ESL that you can’t do this or that. Maybe it’s as much effort as it would be if you weren’t using ESL, but that doesn’t mean it can’t be done. In some cases, ESL hasn’t made your life easier—yet. But that’s where continuing work will be done.
LPD: Such as?
Sanguinetti: Power islands. Right now you put a module in one domain, another module in another domain, synthesize them and hook them up with an interface. Writing that abstract interface is work.
Buric: It becomes structural ESL. You have to partition the problem.
Sanguinetti: You want to be able to write code at a higher level, and what you take advantage of is the language’s expressiveness and the tools’ capability.
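A minimal structural sketch of the partitioning Sanguinetti and Buric describe: two blocks sit in separate power domains and communicate only through an abstract interface, so each side can be implemented and powered down independently. All module and interface names here are invented for illustration; a real design would express this in SystemC with proper clocking and retention behavior.

// Hypothetical sketch of partitioning across power islands: each block lives in
// its own domain and talks only through an abstract interface, so the domains
// can be synthesized and power-managed independently. Names are invented.
#include <cstdio>
#include <queue>

// Abstract interface between the two domains -- the piece that takes real work
// to write at a higher level of abstraction.
class DataChannel {
public:
    virtual ~DataChannel() = default;
    virtual void send(int word) = 0;
    virtual bool receive(int& word) = 0;
};

// Simple queue-based channel standing in for the cross-domain interconnect.
class FifoChannel : public DataChannel {
public:
    void send(int word) override { q_.push(word); }
    bool receive(int& word) override {
        if (q_.empty()) return false;
        word = q_.front();
        q_.pop();
        return true;
    }
private:
    std::queue<int> q_;
};

// Producer block, notionally in power domain A.
class DomainA_Producer {
public:
    explicit DomainA_Producer(DataChannel& ch) : ch_(ch) {}
    void step(int value) { ch_.send(value * 2); }  // placeholder computation
private:
    DataChannel& ch_;
};

// Consumer block, notionally in power domain B; can be switched off.
class DomainB_Consumer {
public:
    explicit DomainB_Consumer(DataChannel& ch) : ch_(ch) {}
    void set_powered(bool on) { powered_ = on; }
    void step() {
        int word;
        if (powered_ && ch_.receive(word))  // only drains the channel when powered
            std::printf("domain B consumed %d\n", word);
    }
private:
    DataChannel& ch_;
    bool powered_ = true;
};

int main() {
    FifoChannel ch;
    DomainA_Producer a(ch);
    DomainB_Consumer b(ch);
    a.step(21);
    b.set_powered(false);  // island B switched off: traffic waits in the channel
    b.step();              // nothing consumed
    b.set_powered(true);
    b.step();              // prints "domain B consumed 42"
    return 0;
}

The abstract DataChannel is the interface Sanguinetti says is work to write; it is also what keeps the two domains decoupled enough that one side can be switched off without redesigning the other.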
LPD: As we get to the next several nodes we’re facing restrictive design rules. Does that limit the demand for ESL?
Ng: Now that the flow is a connected flow, through RTL to physical, more and more people will be adopting it. They don’t have a choice. The challenge is creating a really good representation of the physical design at such an abstracted level. Even developing a synthesizer to put out good RTL, dealing with the different interpreters and synthesis tools, is a challenge when it comes to power. Connecting that physical implementation to the highly abstracted algorithmic world is difficult.
LPD: How do changes at the smaller geometries affect RTL?
Ng: From an RTL standpoint, not much. But it does affect the implementation of gates. We’re getting to the point where we will have to question what physical structures are allowed. We’re not saying they’re bad or good. But it will be a fairly limited set. From that we go to gate-level implementations that leverage those changes. I don’t know how you can comprehend that in a model. It’s more constraints on the physical implementation.