
Increasing Certainty For 20nm Design

System-level design has become vital for dealing with complexity at the leading edge.


By Frank Schirrmeister
At the recent Design Automation Conference two topics were getting very special attention: Design at 20nm and System-Level Design. This is very indicative of the very opposite trends we have been facing in semiconductor designs for the last couple of decades. On the one hand, the actual design units get smaller and smaller, and we are today happily designing for technology nodes we thought would be impossible 10 to 15 years ago. On the other hand, the overall designs get bigger and bigger, are becoming more complex, and functionality is determined these days as a combination of hardware and software running on complex processors.

In the run-up to DAC 2012 I helped organize a couple of events, including an “Experts at the Table” press roundtable for Ed Sperling and Ann Mutschler. The topic of this roundtable was “Does 20nm Break System-Level Design?” The big question in discussing this with Ed and Ann was whether the classic ecosystem for chip design would have to change, so choosing which industry colleagues to invite became critical. The list of invitees had to be chosen very deliberately to reflect what I think will have to be a set of additions to the classic semiconductor ecosystem. Of course, the classic ecosystem partners at 20nm and below will include a set of partners enabling designers to optimize their designs with qualified IP, design tools and methodologies, including flows for innovative patterning technology. Lithography techniques must change at the 20nm node to surmount inherent resolution challenges. But this addresses only the first issue mentioned above, i.e., the design units getting smaller. The complexity issue and its associated trend toward software work exactly in the other direction, raising such questions as how to assemble blocks, predict how they will operate together, and represent them in a fashion that enables repeatable flows.

In my discussions with customers, a couple of items stood out. Assembly and interconnect of the different blocks, whether re-used or newly developed, becomes a critical item. To address that we invited Drew Wingard, chief technology officer at Sonics. Of course, the semiconductor technology itself is critical, so we invited Kelvin Low, deputy director of product marketing at GlobalFoundries. The next item, often a topic of discussion with my customers, is what they refer to as a golden RTL prototype, i.e., a consistent and complete description in RTL to work from. For that topic, Mike Gianfagna, vice president of marketing at Atrenta, seemed like the ideal candidate. Finally, I sent myself to the roundtable as well, given that I am responsible for product marketing for system development in the system and software realization group at Cadence and am working to address system-level tools and the effects of software.

The resulting write-up (part 1, part 2 and part 3) provides a great summary of what we discussed. When asked specifically whether 20nm will break system-level design, I did point to software as a fundamental effect to be considered. But, having been in system-level design for the better part of two decades, I was somewhat cautious with an “it will break” prediction and instead argued that the risk of not doing system-level design will become unreasonably high. So much for being direct. :) And, of course, with the already existing TSMC ESL reference flows, it is clear that the system-level tools themselves are pretty much ready to use annotations from silicon implementation all the way back up into transaction-level models.

Uncertainty during a Chip Design Project


However, it was also nice to see Qualcomm use a very similar argument during DAC 2012 at a panel I helped organize. During the panel on system-level models, for which Brian Bailey and I had posed the question “Does One Model Fit All Use Models?” (Richard Goering did a write-up here), Richard Higgins, senior director at Qualcomm, used the graphic in this post. Richard leads Qualcomm’s Virtual Platform/Architecture team, responsible for the creation, deployment and support of virtual platforms and architecture analysis models. He was a member of the original software development team that delivered the first commercial CDMA cellular phones to the industry, so he brings keen insights and experience.

The graphic he used outlined a way of thinking about the project flow based on uncertainty. How certain can you be that you have the right chip and system architecture? How certain do users have to be to trigger a chip tapeout and commit to the mask set? This is where system-level design comes in, and Richard described how Qualcomm today uses system models pre-silicon as predictors of hardware design to rapidly converge on product certainty, and how accurate prediction requires model validation to confirm analysis results and build user confidence. In the future, Qualcomm would like to see even earlier system trade studies and broader use-case evaluations for even more rapid convergence. Richard went on to describe the challenge for system-level design: complete use-case evaluation requires the efficient re-use of validated models of differing specification and technology. Multiple model types are used for different phases within the same use case, and Qualcomm would need virtual platform capabilities to enable architecture analysis, as well as software execution, to evaluate all facets of the use cases.

So how does all this change at 20nm? Does system-level design break? Well, Richard in a sense confirmed what I had mentioned at the roundtable. System-level design may not actually break, but at 20nm the certainty curves again look different, and without system-level design they will converge even more slowly. And of course, the ecosystem enabling 20nm design will not only be focused on the implementation effects, but will have to include players dealing with the design effects caused by complexity at the system level.

We design in fascinating times!

—Frank Schirrmeister is group director for product marketing of the System Development Suite at Cadence.


