Version Control

More IP, multiple power formats and cascading standards are slowing down progress.

By Ed Sperling & Ann Steffora Mutschler
One of the biggest impediments to progress in semiconductor design is progress itself—version after version of specifications, formats and, increasingly, IP. In fact, there are so many different versions, some of which conflict directly with each other, that it may take months or even years before some customers adopt new products.

Much has changed since the days when Intel owned much of the semiconductor market and locked down versions of software used to create each rev of a chip. Tools are no longer developed specifically for a single vendor’s chips, and most large chipmakers make far more than one chip every couple of years. Still, they do continue to lock down the software used to develop each chip—particularly after it reaches a certain stage of development.

“From a tool point of view, most customers still lock a configuration down for a chip,” said Mike Gianfagna, vice president of corporate marketing at Atrenta. “From a support point of view, it’s much easier because we can support both version 4 and version 5 of a product. But it’s frustrating for us because we’ll roll out the latest version that is faster and with fewer bugs and the customer may not adopt it for six months because they’re locked on a previous version.”
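To make that lockdown concrete, here is a minimal sketch of what a version check against a frozen tool manifest could look like. Everything in it—the tool names, version strings and manifest structure—is invented for illustration and does not reflect any vendor’s actual flow.

```python
# Hypothetical sketch: enforce a frozen EDA tool configuration for a chip project.
# Tool names, versions and the manifest structure are illustrative only.

LOCKED_TOOLS = {          # versions frozen when the chip entered its current stage
    "synthesis": "4.2",
    "lint": "5.0",
    "place_and_route": "2012.06",
}

def check_lockdown(installed: dict) -> list:
    """Return human-readable mismatches between installed tools and the locked manifest."""
    mismatches = []
    for tool, locked_version in LOCKED_TOOLS.items():
        found = installed.get(tool, "missing")
        if found != locked_version:
            mismatches.append(f"{tool}: locked to {locked_version}, found {found}")
    return mismatches

# A newer lint release is available, but the project stays on the locked version.
problems = check_lockdown({"synthesis": "4.2", "lint": "5.1", "place_and_route": "2012.06"})
for p in problems:
    print(p)   # -> lint: locked to 5.0, found 5.1
```

In practice this kind of check is what keeps a project on version 4 of a tool even after version 5 ships, which is exactly the frustration Gianfagna describes.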

IP is equally challenging. As the semiconductor industry increasingly embraces third-party IP and the concept of re-usability, it is running into many of the same version control issues that EDA tools have been wrestling with. That typically means IP vendors have to support multiple versions of IP—sometimes hundreds of different builds and configurations.

One solution is what Martin Lund, senior vice president of R&D for Cadence’s SoC Realization Group, describes as mass customizable standard IP. The idea is that licensees have control over part of the IP’s configuration, which will help eliminate some of the version control issues.

“The goal is to build for change,” said Lund. “You want to build a block and re-use it over and over again, but that’s almost never the case today. So how do you accomplish that? You have to build a superset capability in the core. Then you generate a soft set of just what you need and verify it. IP today uses a tools approach. You have to build IP that can be generated.”
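A rough way to picture that “superset plus generation” idea is a configurator that carries the full feature superset and emits only the subset a licensee actually needs. The sketch below is purely illustrative; the feature names, the stub strings and the IPConfig structure are assumptions, not any vendor’s implementation.

```python
# Hypothetical sketch of "mass customizable" IP: a superset of features is defined
# once, and a licensee's configuration selects the subset that is actually generated.
from dataclasses import dataclass

SUPERSET_FEATURES = {
    "ecc": "module ecc_unit(...);",
    "low_power_retention": "module retention_ctrl(...);",
    "wide_bus": "module bus_512(...);",
    "narrow_bus": "module bus_128(...);",
}

@dataclass
class IPConfig:
    features: tuple  # subset of SUPERSET_FEATURES keys chosen by the licensee

def generate_soft_ip(config: IPConfig) -> str:
    """Emit only the RTL stubs for the selected features."""
    unknown = [f for f in config.features if f not in SUPERSET_FEATURES]
    if unknown:
        raise ValueError(f"not in the superset: {unknown}")
    return "\n".join(SUPERSET_FEATURES[f] for f in config.features)

# A licensee-specific build: verify this generated subset, not the whole superset.
print(generate_soft_ip(IPConfig(features=("ecc", "narrow_bus"))))
```

The point is that the verified deliverable is the generated subset, so a licensee’s customization does not multiply into yet another frozen version of the full core.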

A second approach is to make these kinds of changes inside of subsystems, so the interconnects and characterization look much the same from the outside while the changes occur on the inside. Because most commercial IP is a black box, licensees rely heavily on the IP vendors for characterization. But when too many IP blocks are used, that can create problems. Subsystems take much of the guesswork out of that by internalizing the changes and keeping the external characterization relatively constant. As a result, subsystems are pre-verified and pre-configured, requiring only system-level verification.
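The toy sketch below illustrates the principle rather than any real subsystem: the externally visible interface stays fixed while the internal block versions—the class names here are invented—can change without the integrator seeing a different contract.

```python
# Hypothetical sketch of the subsystem idea: the externally visible interface and
# characterization stay fixed while internal block versions change underneath.
class DSPBlock_v1:
    def process(self, sample: float) -> float:
        return sample * 0.5

class DSPBlock_v2:  # internal upgrade; same behavioral contract as v1
    def process(self, sample: float) -> float:
        return sample * 0.5  # the real block would be faster, not different

class AudioSubsystem:
    """External contract: process(sample) -> float. Internals may be re-versioned."""
    def __init__(self, dsp=None):
        self._dsp = dsp or DSPBlock_v2()  # internal choice, invisible to the integrator

    def process(self, sample: float) -> float:
        return self._dsp.process(sample)

# The integrator only re-runs system-level verification against this fixed interface.
assert AudioSubsystem().process(1.0) == AudioSubsystem(DSPBlock_v1()).process(1.0)
```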

“With subsystems, the software is programmable and the hardware is configurable,” said Kurt Shuler, vice president of marketing at Arteris. “How well you integrate that software and the configurability is important, though.”

An architecture that provides flexibility can help, as well. One of the reasons network-on-chip IP has become so popular is that it minimizes the impact of these kinds of version issues by isolating the blocks.

The good and bad of standards
Standards can help in this arena, but they also can hurt. The most egregious and well-publicized example of what can go wrong here is in the world of power formats, where EDA vendors, their customers and standards bodies have been working for the past several years to sort out a five-year-old dispute between Cadence, on one side, and Mentor Graphics and Synopsys on the other. Privately, they’re still cursing this division, even though the latest version, UPF 2.1, also known as IEEE 1801, has bridged most of the differences. For many companies, it will be months or years before they can adopt the changes.

But while the frustration is palpable, it’s hard to figure out exactly where to place the blame because standards are the work of multiple entities. Engineers have learned to use the tools they have, and even improvements can cause disruption.

“As part of the UPF 2.1 specification, there are some deprecated commands,” said Mary Ann White, director for Galaxy implementation platform marketing at Synopsys. “We had more or less voted against doing that because we know customers use these commands. We understand that, from a new design perspective, maybe you don’t want to use a deprecated command. But what are you going to do with the people who have been designing and using those commands? Are you just going to say they can’t bring their old IP for reuse because it has those commands in it? We are committed to continuing to support those commands. For us it’s hard to phase out any feature, but to phase out features of specific commands that are part of the standard is not something that we could do without a lot of uproar.”
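One generic way a tool team can honor that kind of commitment is to alias deprecated commands to their current handlers and issue a warning instead of an error. The sketch below is only an illustration of that pattern; the command names are invented, they are not actual UPF 2.1 commands, and this is not any vendor’s implementation.

```python
# Hypothetical sketch: keep accepting deprecated commands by aliasing legacy names
# to current handlers and warning instead of rejecting old scripts.
# The command names here are invented, not real UPF commands.
import warnings

def set_power_mode(domain: str, mode: str) -> None:
    print(f"power mode of {domain} set to {mode}")

CURRENT_COMMANDS = {"set_power_mode": set_power_mode}

DEPRECATED_ALIASES = {"legacy_set_mode": "set_power_mode"}  # old name -> new name

def dispatch(command: str, *args) -> None:
    if command in DEPRECATED_ALIASES:
        new_name = DEPRECATED_ALIASES[command]
        warnings.warn(f"'{command}' is deprecated; use '{new_name}'", DeprecationWarning)
        command = new_name
    CURRENT_COMMANDS[command](*args)

# Old IP scripts that still use the legacy spelling keep working.
dispatch("legacy_set_mode", "cpu_domain", "retention")
```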

This follows the same trend as software. Maintaining backward compatibility has always been a big issue in software, and there is a long legacy of complaints about dropped features and degraded performance. After decades of improvement in processor technology, new versions of some popular software applications actually run more slowly than older generations did when there was less legacy software to support.

“With EDA tools and IP, you have to be a little bit more thoughtful, taking into account both new invention as well as industry consolidation,” said Chris Rowen, chief technology officer at Tensilica. “This is an issue for IP, which is why we put an enormous amount of effort into previous versions. We support back as far as 10 years, and you have to accept that in this business. A customer has good reasons why they don’t want to change IP, but at the same time everyone is constantly innovating everything from bus interfaces to EDA formats to power formats. It’s a tax on all engineers, and it’s the glory and bane of being an established provider. That’s part of the reason startups appear to move so fast.”

Yatin Trivedi, director of standards and interoperability programs at Synopsys, has a similar view: “That’s where the maturity of the full solution plays in, because those who are looking at just a small part of the overall design flow and say, ‘Oh, I can solve this problem and here are the newest whiz-bang tools to build that,’ are not looking at the overall problem. They are only looking at a very niche, small area…but that’s not what most of the project managers and design managers are responsible for. They are responsible for delivering a complete solution.”

Conclusions
There are no simple answers for how to manage different versions of IP or EDA tools. When it comes to standards, the lesson learned from power formats is that rushing them out the door without industry buy-in can create long-lasting problems. This is one of the reasons that standards group Si2 has been so reluctant to rush into stacked die standards, despite pressure from some of the leading-edge companies.

With the growing emphasis on a virtual IDM ecosystem, restrictions can come from multiple places, as well. Subramani Kengeri, vice president at GlobalFoundries, said the foundries are being forced to put restrictions on third-party IP because reliability “is a scary thing.”

Most vendors can recite horror stories of IP that didn’t work as planned, and most IP vendors can recount stories where they were blamed when IP was used in ways it wasn’t expected to be used. This is one of the reasons version control is so critical, but it’s also one of the checklist items that is most likely to be abused, ignored or just left unchecked.


