Experts At The Table: Evolving Standards

Last of three parts: Who’s driving the standards, who benefits and what’s the effect of competing standards?


System-Level Design sat down with Keith Barkley, senior engineer in IBM’s systems and technology group; Steven Schulz, president and CEO of Silicon Integration Initiative (Si2); Yatin Trivedi, director of standards and interoperability programs at Synopsys; Ian Mackintosh, chairman of the OCP International Partnership (OCP-IP); and Michael Meredith, vice president of technical marketing at Forte Design Systems. What follows are excerpts of that conversation.

By Ed Sperling

SLD: How much of the standards effort is being driven by the foundries?

Schulz: Let me try to answer that in context. The timing of a standard in the market is a very delicate thing. It depends on when you can get the resources together to really do the job right in defining a standard, and when the market is ready to adopt it. You have to adapt dynamically, and our DFM coalition is a case in point. Back in 2002 we had heard about rising mask costs, and predictions were that by 2006 we would be unable to afford masks at all. We formed a coalition, thought we would get all the mask makers involved, and everyone would win because it was about efficiency. What we learned was that the mask makers’ business model was broken. They felt they had very little to gain from efficiency. They had no incentive. The others said the real problem wasn’t masks. We tried to get the foundries engaged at that point, but the market wasn’t ready. There was a lot of debate about what DFM was. It took two years to get the right folks together and build the right dictionary, and now we are putting together reference flows and standardizing hot spots, meta language and DFM libraries.

Trivedi: That’s part of the need to experiment and learn from mistakes and then do the right thing.

Mackintosh: I think the fundamental problem is that people don’t understand that standards are products. We’ve actually invested a lot at OCP-IP in thinking about this. We’ve created an infrastructure that adds value to the standard. But all the items in that infrastructure were developed from a single concept: we don’t develop anything unless there is a need-and-greed dynamic. There has to be a group of people working on it because they need it, and a group who see a financial benefit in getting it. That’s why the infrastructure we have is so extensive.

Schulz: We engaged in phases and steps. It was a long series of phases in DFM before the foundries would commit. The first group that said it needed it, from a greed standpoint, was the Common Platform. But even though that was six or seven very large companies, it wasn’t enough. We recently added more very large companies, and then we got other contributions. We got more participation. And then we got questions like, ‘If we get involved, will you support encryption so we can put our secret processes in it?’ We of course said yes. And then the interface became standard and stable because they were all contributing with a high level of trust. But it takes time to build that trust. The foundries started out saying, ‘That’s a design issue. It’s not our problem.’ Then they went to, ‘It’s just the interface.’ But then something else came up that intersected with that, and they said, ‘Well, maybe we should be engaged.’ The foundries are just driving the back-end part, but they’re having more and more influence on the reference flows, and they’ll continue to drive that.

SLD: Historically, standards were the lowest common denominator, and for real value you had to go beyond them. Has that changed?

Trivedi: I’m not sure it’s the lowest common denominator. Sometimes standards are enablers. At the upper end of the design spectrum, once we had SystemVerilog as a standard language, that’s when things like verification methodology became an issue. People recognize now that there’s a standard, it’s stable, and you can build your infrastructure around it, so there’s a notion of a methodology based on the language. The move from language to methodology isn’t lowest common denominator. It’s a broader methodology. What’s happening is we are minimizing duplication of effort.

Barkley: A lot of the industry is very reactive. People tend to react to yesterday’s problem with a standard so you can go forward. I think it has to go both ways. You’re not going to anticipate everything that’s going to happen. On the other hand, you can’t be totally reactive. When we do advanced processor design we don’t let standards dictate what we do, but obviously we want to take advantage of them. From a high level we look at where we want to go and what’s already in place so we’re not duplicating effort. We want to focus on the differentiating part. But by participating in these standards we actually learn from other companies. There’s a lot of cross-pollination going on, and frankly it’s not entirely clear who our competitors are anymore.

Meredith: Rather than holding users of standards down to a least common denominator, I think standards provide a stable structure on which everyone can innovate. It’s something they can depend on. You use the pieces of standardization that lift you up, and you innovate above that.

Barkley: We bring stuff, other companies build upon that and bring more stuff to the table, which actually helps us.

Schulz: Part of it is an understanding of where to compete and where to share. Mature industries have a good handle on that, so they can promote growth. We are beginning to get near maturity, but I don’t think we’re there in either EDA or semiconductors. EDA isn’t as far along as the semiconductor industry, and semiconductors aren’t as far along as the general electronics industry. But I think we are improving, and standards do not need to be the least common denominator. There does need to be alignment before you can create a standard, but that doesn’t mean the standard is holding anything back.

Meredith: This all reflects an organizational equivalent of the tall, skinny engineer. If, to succeed, the engineer needs to know how to build a microprocessor, write the compiler, write the application software that runs on it, write the RTL that constructs it, lay it out, build the mask and do the lithography, he can’t succeed. Organizationally, companies have the same problem. If they attempt to be the best at all of those things then they lose focus. They find things that are key to the special things they do to deliver value. That requires collaboration, and the collaboration forces standards.

Mackintosh: The industry today is still internally focused. Folks will always do everything themselves unless they believe there is something they can move out and standardize on, and get to that standard in a reasonable amount of time. But the primary motivation is still internally focused.

SLD: There are also some religious issues involved in standards, right? No matter how standards interface, there are still battles over them.

Mackintosh: Yes, but I don’t think you need to resolve those. There are some places where standards do compete, and for business and/or technical reasons people need to have more than one. They find their own niches and applications, and the world works with that.

Schulz: But you always have to keep a finger on the pulse of people in all parts of the supply chain. Sometimes there’s no natural boundary to a standard. If you carve it too small, it doesn’t have any value. If you carve it too big, sometimes you’re stepping on competitive toes. And if you time it wrong, it’s not going to be effective even if it’s sized right. When you get enough people saying it’s a thorn in their side and not adding any competitive advantage, that’s the right timing.

Trivedi: Standards represent information. The question is what you are going to do with that information. The more successful standards are the ones used in a broader context, where there are multiple ways to use that information. For example, an HDL by itself is useless. But as soon as you put your design information into it, that standard usage allows you to go do synthesis, simulation, timing analysis and power analysis. The standard itself isn’t interesting until you find a deployment for it, because that’s what allows interoperability among tools. The broader it is, the better. A standard supported by only one vendor is less useful than one with broad support from a number of companies.
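
To illustrate Trivedi’s point, here is a minimal sketch in plain Verilog. The module and signal names are invented for illustration and are not from the discussion; the point is that one standard-language description like this can be handed, unchanged, to simulation, synthesis, timing analysis and power analysis tools from different vendors.

    // Hypothetical example; the design itself is deliberately trivial.
    // Because the language is a standard, this single description can
    // drive a simulator, a synthesis tool, and downstream timing and
    // power analysis without any vendor-specific rewriting.
    module counter #(parameter WIDTH = 8) (
        input  wire             clk,
        input  wire             rst_n,    // active-low asynchronous reset
        input  wire             enable,   // count only when asserted
        output reg  [WIDTH-1:0] count
    );
        always @(posedge clk or negedge rst_n) begin
            if (!rst_n)
                count <= {WIDTH{1'b0}};   // clear on reset
            else if (enable)
                count <= count + 1'b1;    // increment when enabled
        end
    endmodule

The same file is what a simulator elaborates, what a synthesis tool maps to gates, and what timing and power tools then analyze; that multi-tool deployment is what makes the standard language valuable rather than the language in isolation.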

Schulz: We’re wrestling with power modeling right now. How do you represent a model of power at an architectural level? It’s an open question. The industry needs an answer, but we haven’t figured it out yet. There are implications all the way down to your test strategy and other parts of your physical view in your DFM into your timing. Power is a flow, which makes it a very broad effort.

Barkley: I think many competing standards are going to work themselves out. It’s like Blu-ray and HD DVD. The reason this happens, though, is that there are so many smart and innovative people in this field. This is an evolutionary journey. We’re never going to be done. But I also think this levels the playing field a bit. Small companies are benefiting from IBM and TI and Synopsys and Cadence and Mentor Graphics; they wouldn’t necessarily have access to that information and knowledge if there were no standards.

Mackintosh: Darwinism is particularly relevant to the standards world.

Meredith: But there’s also another way in which the competition within standards ends up having less impact. Quite often, as we move forward, another abstraction gets put into place, and then the differences between the lower-level standards don’t matter as much. People can use one or the other, or both.


