Experts At The Table: ESL And Low Power

Last of three parts: Software, IP re-use and the role of startups.

Low-Power Design sat down with Walter Ng, senior director of platform alliances at Chartered Semiconductor; Brani Buric, executive vice president of sales and marketing at Virage Logic; John Sanguinetti, CTO at Forte Design Systems; and Andrea Kroll, vice president of marketing and business development at JEDA Technologies. What follows are excerpts of that discussion.

By Ed Sperling

LPD: How much does software affect the overall system architecture?
Buric: It can affect it quite a bit. If you look at Linux, it's a growing standard worldwide, but there are more than 10 suppliers, so each one's target market is fairly limited and there is no common spec. So they overdesign to meet specs for different target markets. Those are areas where, if you do everything from a system-level design, you can save a lot because you can optimize where to share implementations.
Ng: One of the problems we see is the legacy mindset. Because the time-to-market pressures are so great, most companies don’t have the time to go back and re-do their design infrastructure. If your design teams are designing the next technology, they’re more than likely leveraging their previous designs. But because that wasn’t captured at the system level, it’s hard to go back and get an optimal implementation. You’re fighting to get this chip out the door and generate revenue. It’s not a matter anymore of trying to find the optimal implementation. And because of resource constraints, nobody has the luxury of having one team doing the implementation of the next design and having another team off to the side doing the more strategic infrastructure and maybe generating models for some of the existing IP that’s already been proven.
Sanguinetti: No one wants to take something that works and write a higher-level model of it.
Kroll: We do see that happening more often, though. People do see the value of the models and of being able to integrate them into the whole system. More people are investing in high-quality models, verifying them upfront and making sure they represent the hardware. You can annotate technical information from downstream back to the model so that you really can complete the verification or validation of the chip at the beginning. Then you have the process and also the possibility to say, 'I have an RTL model of this. Does it still match the specification?' Or if the software running on an FPGA doesn't work, you can go back to the model, match the software to the model and find the bug.
Ng: If the industry shifts to where designs get outsourced to the fabless ASIC guys—and from a business standpoint that need has to be there—then it might accelerate all of this. But it's very tough for a newer company that doesn't have a lot of legacy stuff.
Sanguinetti: That’s absolutely right. It’s the legacy stuff that gets left behind. For new code and new implementations, there’s a tremendous amount of value in ESL with faster time to market and better-quality RTL. What we hear from our Japanese customers, after two generations of using this, is that the top thing they get out of it is IP re-usability. This was a revelation to them and to us. The real value of ESL is IP re-usability.
Buric: This trend will happen. About 20% to 25% of fabless startups today have, as part of their business plan, the intention to create a working component and then sell it to someone who has a more complex chip. That is one trend. Another trend is that the big companies are selling components like Bluetooth into even more complex designs for big bucks. You need a consistent way to transfer that knowledge. This trend of building bottom-up models will be there.

LPD: Does this trend toward a higher abstraction layer open the door for more startups, or does it close the door to them?
Kroll: I think it will open the door for more startups. If you can develop a component, it can be integrated into a chip. People can develop a technology or an FPGA prototype and sell it to a bigger company.
Ng: We have a couple of customers. One we’ve supported on a 65nm process. They went out of the chip business and they’ve sold their designs. Now they hand off their design, which gets integrated into their customers’ larger design. In effect, we’ve lost a customer. Instead of selling the chip, they’re selling the design. I’ve seen a few instances of that.

LPD: There are a lot of companies that have their own homegrown tools. Does complexity make those tools less valuable over time?
Buric: If you think about the way the industry has grown up, there are two directions. Established companies figure out ways to solve a number of problems specific to them. Equally important, a large number of startups have experience with individual problems and develop tools to solve those. There is a lot of internal knowledge captured in the early phases of design, and I believe there will be a lot of startups based upon that knowledge.
Sanguinetti: I agree with that. The fundamental dynamic of EDA involves people who had a problem; they leave the company and form a startup because they think they have an idea of how to solve it. They develop a solution to the problem and then get acquired by the consolidators. I think that will continue. The only fly in the ointment is that it’s becoming harder to solve a problem in an EDA context, and that’s becoming increasingly necessary. As the consolidators have gotten bigger and gained more influence, they put roadblocks in the way of little companies. SystemVerilog is a killer for someone who wants to support RTL as input. They have to import SystemVerilog testbench code as input. They have to spend man-years on it, whereas 15 years ago doing it in Verilog wasn’t too hard. Things have gotten harder for the little guys.

LPD: They’ve gotten harder for the big guys, too.
Kroll: One of the things that needs solving is where the models come from, who can use them, and how to improve success with them as a vehicle to solve problems. Currently, the big guys have the manpower to build those models. Maybe everyone needs to pitch in and make the models available. TLM 2.0 is one of the bigger opportunities out there. Right now the IP vendors are developing models, the semiconductor companies are developing models, and the ESL companies are developing models.
Buric: Once we have enough models out there it will be chaos, and then there will be standards.
Ng: There are a lot of proprietary systems out there and it’s easy to develop models for those within that context. If at some point those systems run out of gas, you can have a wider solution and more standardization and collaboration.
Buric: Another problem is that certain groups create obstacles. They don’t want to adopt a common model. This will continue for a while. If I create something in silicon and it fails, that’s my problem. For that reason, there will be constructive paranoia. Failure costs a lot. On advanced-process chips, you can’t fail.
