System-Level Technology Conversations Shift To Deployment

After years of discussion, system-level design finally is ready for prime time; 2011 should prove to be an interesting year.


While much has been achieved to define a system-level design flow, more is still needed. Technology goals vary depending on the perspective of tool providers in terms of what needs to be done to realize the promise of a streamlined tool flow from TLM 2.0 down to GDS II.
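To make the starting point of that flow concrete, here is a toy sketch of what "transaction level" means: an initiator exchanges whole read/write transactions with a memory target through a function call, rather than toggling pin-level signals. The class and method names are illustrative only, not the actual SystemC TLM-2.0 API.

```python
class MemoryTarget:
    """A simple memory model addressed at transaction granularity."""
    def __init__(self, size):
        self.mem = [0] * size

    def transport(self, cmd, addr, data=None):
        # One call models a complete bus transaction.
        if cmd == "write":
            self.mem[addr] = data
            return None
        elif cmd == "read":
            return self.mem[addr]
        raise ValueError("unknown command: " + cmd)

class Initiator:
    """Issues transactions to a target through a single function call."""
    def __init__(self, target):
        self.target = target

    def write(self, addr, data):
        self.target.transport("write", addr, data)

    def read(self, addr):
        return self.target.transport("read", addr)

if __name__ == "__main__":
    cpu = Initiator(MemoryTarget(256))
    cpu.write(0x10, 0xAB)
    print(hex(cpu.read(0x10)))
```

Because each transaction is a single call instead of many clocked signal events, models at this level simulate orders of magnitude faster than RTL, which is what makes them usable for early software development.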

To many, 2011 will be an interesting year in the system-level design space as conversations with customers have shifted. “For the last two or three years, the conversation has been, ‘What does it mean to go into high-level? What languages will be used? What technologies?’” said Michal Siwinski, group director for product management in the system realization business unit at Cadence. “It was really a conversation about what system-level design would look like. Now, the conversation is shifting pretty aggressively into what it is, what it means, how to deploy it and how does it work as an integrated environment in a continuum of solutions from specification to manufacturing.”

For Cadence, that has been captured under the umbrella of the TLM to GDS flow. “We’ve made sure that everything we do, from whatever we do as an extension into high levels of abstraction both on the design and verification sides of hardware/software coordination, has to be linked into how the end implementation is done. We have to solve for the inherent discontinuity that exists in the fact that people today do something at the system level, throw it away, and then they start on the implementation side,” Siwinski said.

Similarly, companies such as Mentor Graphics have invested heavily for years in systems design both in R&D and acquisitions, and 2011 will be no different in that area. The company intends to maintain its high level of investment as customers require more design technology to beat their competition, said John Isaac, director of market development for the company’s systems design division.

Mentor also will invest in increasing the productivity of its core PCB systems design and analysis capabilities. “We understand that product design requires collaboration and effectiveness in many disciplines: IC/package/FPGA design, mechanical design and analysis, the smooth transition from design into manufacturing and the optimization of the manufacturing floor. We will invest in continuing to improve the entire product development process,” he said.

For others like network-on-chip provider Arteris, 2011 signals a focus on relationships. “We’re working on really tight integration with FPGA vendors and emulation vendors. We’ve done a lot of work with EVE and our common customers to make it really easy to take a huge ASIC SoC design and easily ‘port’ it over to a multi-FPGA emulation server (Zebu server). We’ve also worked with ENSTA (a French university) to push the envelope on network-on-chip size/complexity on FPGAs,” explained Kurt Shuler, director of marketing at Arteris.

Arteris is also working to make it easier to “plug and play” IP. The interfaces are there, but IP-XACT descriptions haven’t been detailed enough. Moreover, the customer must be involved since they all do the descriptions differently. In this regard, the company is working with Duolog and joint customers to be able to easily stitch or weave together IP for an SoC.
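The value of machine-readable IP descriptions is that a stitching tool can discover and match bus interfaces automatically. The sketch below shows the idea on a simplified, non-normative XML fragment only loosely modeled on IP-XACT (IEEE 1685); real descriptions are namespaced and far more detailed, and the component and interface names here are hypothetical.

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for an IP-XACT-style component description
# (illustrative only; not schema-accurate IEEE 1685 XML).
DESC = """
<component>
  <name>dma_engine</name>
  <busInterfaces>
    <busInterface><name>m_axi</name><mode>master</mode></busInterface>
    <busInterface><name>s_apb</name><mode>slave</mode></busInterface>
  </busInterfaces>
</component>
"""

def bus_interfaces(xml_text):
    """Return (component_name, [(interface_name, mode), ...])."""
    root = ET.fromstring(xml_text)
    comp = root.findtext("name")
    ifaces = [(bi.findtext("name"), bi.findtext("mode"))
              for bi in root.iter("busInterface")]
    return comp, ifaces

if __name__ == "__main__":
    print(bus_interfaces(DESC))
```

When every vendor fills in such descriptions differently, or incompletely, this kind of automated matching breaks down, which is why the customer has to be involved in standardizing how the metadata is written.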

Further, Arteris is working on tighter integration with Synopsys’ broad system-level product line, which includes key technology from the company’s Virtio and CoWare acquisitions. Shuler noted that Synopsys is an investor in Arteris and the two have many joint customers.

For Synopsys, customer requests fall under two main categories: individual tool improvements in the five areas of system-level design (algorithm design, high-level synthesis, architecture design, virtual platforms and processor design); and flows, said Frank Schirrmeister, director of product marketing for system-level solutions.

In algorithm design, Synopsys’ products have been in the market for more than 20 years, noted Johannes Stahl, director of marketing for system-level solutions. “So you could say it’s a boring market, but actually it’s not boring, because what customers are doing with the tools these days is super complex, with hundreds of thousands of lines of C code and models that they simulate, mostly for complex wireless designs. Our task list is very simple: keep the capacity up so the simulation engine improves in speed, and improve turnaround times for simulation.”

Stahl pointed out that the most exciting piece of technology in the high-level synthesis space came with the acquisition of Synfora. “What we see from customers is that they’ve all done high-level synthesis in one way or another, but many of them started at too low a level because of technology restrictions. The earlier we synthesize from a very high level, the more we can save in terms of verification time—which is what high-level synthesis is fundamentally about. If you start very high, you save the most. You also save the most in terms of not introducing bugs, because you don’t code at a detailed level.”

Then, on the architecture side, users are asking for as many architecture analysis capabilities and out-of-the-box models as possible. Much of that work today is interconnect-driven. As Arteris’ Shuler noted, the two companies are talking about how to connect their technologies. Synopsys also works with industry leader ARM on its NIC-301 fabric to connect things, Synopsys’ Schirrmeister said, “and we have tools to support that from the architecture analysis side. That’s a whole question of how the software splits, how the bus bandwidth is allocated, and how the latencies are handled. We need to make sure our tools interact with other vendors.”
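A back-of-envelope version of the kind of bandwidth budgeting such architecture-analysis tools automate: do the masters sharing one interconnect fit within its theoretical throughput? All figures below are hypothetical, not taken from any vendor's tool or fabric.

```python
BUS_WIDTH_BITS = 64
BUS_CLOCK_HZ = 200e6                          # hypothetical 200 MHz fabric clock
PEAK_BW = BUS_WIDTH_BITS / 8 * BUS_CLOCK_HZ   # peak throughput, bytes/second

masters = {            # required sustained bandwidth per master, bytes/s
    "cpu": 300e6,      # (illustrative numbers)
    "gpu": 800e6,
    "dma": 150e6,
}

def utilization(required, peak=PEAK_BW):
    """Fraction of peak interconnect throughput the masters demand."""
    return sum(required.values()) / peak

if __name__ == "__main__":
    u = utilization(masters)
    print(f"peak {PEAK_BW/1e9:.1f} GB/s, "
          f"demand {sum(masters.values())/1e9:.2f} GB/s, "
          f"utilization {u:.0%}")
    # Rule of thumb (assumed here): keep sustained demand well below peak
    # to leave headroom for arbitration losses and traffic bursts.
    assert u < 0.85, "interconnect oversubscribed"
```

Real tools layer cycle-accurate interconnect models, arbitration policies and latency distributions on top of this kind of first-order check, but the question being answered is the same.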

On the software development/virtual prototyping side, customers are asking for an easy way to create virtual platforms and to make sure the use of the virtual platform is as versatile as possible. Synopsys acknowledges that it all comes back to the models underneath, so it will work to make sure the relevant models are available out of the box in its libraries, in application-specific areas such as automotive, consumer, and wireless, he continued.

Then, in the area of flows, Synopsys is looking at how individual product components fit together and how flows interoperate with other vendors’ tools and methodologies, both before customers think about system-level design (pure software) and afterward, in hardware implementation, Schirrmeister added.

From Atrenta’s perspective, the company will be focusing on IP quality and integration readiness in 2011. “More and more IP will be delivered in synthesizable form. How do customers know what they’re getting, and how can they determine, up front, how all this IP will work together?” asked Mike Gianfagna, vice president of marketing. “Second, we’ll be working on new tools and methods to help assemble IP to create the correct architecture. Knowing you’ve got it right as early as possible will make a huge difference when it comes to implementation and embedded software development.”

Finally, with the big push toward through-silicon vias (TSVs) and 3D packaging, a lot of activity is expected in this area, added Prasad Subramaniam, vice president of design technology at eSilicon. Other areas on eSilicon’s radar for 2011 are multicore designs and power management. “For multicore designs, how can one come up with the best architecture that will optimize power and performance? Power management continues to be a significant area, all the way from architecture to implementation. How can a designer make tradeoffs in determining the appropriate power management technique?”
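One classic tradeoff behind those power-management questions: dynamic switching power scales roughly as C·V²·f, so lowering voltage and frequency together (dynamic voltage and frequency scaling, DVFS) saves power superlinearly while costing performance linearly. The capacitance and operating points below are hypothetical, for illustration only.

```python
def dynamic_power(cap_farads, volts, freq_hz, activity=1.0):
    """Approximate CMOS switching power: alpha * C * V^2 * f."""
    return activity * cap_farads * volts ** 2 * freq_hz

C = 1e-9  # effective switched capacitance (hypothetical)

# Two hypothetical operating points for the same block.
fast = dynamic_power(C, volts=1.0, freq_hz=1.0e9)  # high-performance point
slow = dynamic_power(C, volts=0.8, freq_hz=0.5e9)  # scaled-down point

if __name__ == "__main__":
    print(f"fast: {fast:.2f} W, slow: {slow:.2f} W, "
          f"saving {(1 - slow / fast):.0%}")
```

Halving the clock alone would halve the power; halving it while also dropping the supply from 1.0 V to 0.8 V cuts power by 68%, which is why architecture-level tools explore voltage and frequency jointly rather than one at a time.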
