Too Many Standards, But Still Not Enough


By Ed Sperling
The semiconductor industry has been one of the most prolific sectors in history when it comes to generating standards. Talk to any design engineer facing time-to-market pressures, new packaging approaches, and a mindboggling number of merchant IP, subsystems and interface requirements, and you’ll hear a compelling pitch for new standards. Talk to his or her boss and you’ll probably get an earful about how there are too many standards that need to be supported.

The truth is they’re both right. There are too many old standards and not enough new ones. Where technology converges and uncertainty adds risk, standards are considered essential. They improve time to market, help get multiple companies speaking the same language and moving in the same direction, and they can bring enormous cost savings.

But when standards are no longer needed, they tend to stick around forever—like space junk. Standards need to be maintained and updated. But the work of updating standards, which includes reassessing them regularly and combining them with other standards when it makes sense, is even less glamorous than developing the standards in the first place, and certainly more tedious. Moreover, there is even less direct economic benefit to developing those updates.

The result is that the industry is littered with old standards while the din rises for even more. Scott McGregor, president and CEO of Broadcom, said his company is involved in about 100 standards efforts at any one time.

“Standards need to evolve, and new standards drive innovation in larger markets,” McGregor said. “But standards also need to go away when it makes sense.”

He’s not alone in that viewpoint. John Goodenough, vice president of design technology and automation at ARM, said the industry needs to “keep collapsing standards down.”

Standards to ease pain
Wherever there is a pain point—particularly one where multiple vendors are involved—there is discussion about standards. Sometimes it’s a way of slowing down a market leader. Sometimes it’s a way of slowing down everyone else who follows the market leader. But in all cases, it requires an almost superhuman commitment to negotiate an outcome because each company has its own agenda, and standards in many ways are a compromise.

“That’s why standards happen at the edges of the network,” said Charlie Janac, president and CEO of Arteris. “We’ve got standards like AXI (Advanced eXtensible Interface) and OCP (Open Core Protocol). And there will be new standards as we move from 2D to 3D, but those are just being established. The goal is that customers shouldn’t have to care about what they use. It should all just work.”

But getting things to work also requires a lot of translation, which is the really hard stuff in developing standards. Drew Wingard, CTO of Sonics, said the most effective standards are ones that allow engineers to work with their own terminology and still provide useful information to other groups using different terminology and data.

“The folks worrying about video use a different number than the people who are worrying about graphics processor performance,” said Wingard. “The best thing we can do is keep it at that level. But asking one group, like the architects of a subsystem, to adopt my vocabulary, is counterproductive. A better way is to come up with a simple language.”

That’s easier said than done, of course. Ask anyone about power formats these days and you’re likely to evoke a sour look. UPF 1.0, IEEE 1801 and CPF are all standards, but they don’t work together. There has been a big improvement in cross-standard functionality, thanks largely to the efforts of Cadence, Mentor Graphics and Synopsys, and there are now cheat sheets about how to read one format versus the other. But the hard work now under way is to bridge UPF and CPF with a Rosetta Stone type of translation.
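To see why that translation is hard even for simple cases, compare how the same power intent, a power domain covering one block, is written in each format. This is only an illustrative sketch: the domain and instance names are made up, and real files add supply nets, isolation and retention rules, where the two formats diverge much further.

```tcl
# UPF (IEEE 1801): create a power domain containing instance u_cpu
create_power_domain PD_CPU -elements {u_cpu}

# CPF (Si2): the same intent, expressed with CPF's named options
create_power_domain -name PD_CPU -instances {u_cpu}
```

Even here the commands only look similar; the semantics of how domains inherit elements and defaults differ between the two formats, which is exactly what the translation work has to reconcile.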

The existence of multiple power format standards still rankles customers, many of whom are quite vocal about it because they use multiple vendors’ tools and IP, which favor one format over the other. But at least the problem is being addressed, and it has served as a warning against developing standards prematurely—or without all the essential players involved in the planning process.

Works in progress
This hesitancy to put a stake in the ground for standards is particularly evident in the 3D stacking arena. Si2 and Accellera have spent the past couple of years just watching the process, trying to figure out where standards will be best served.

So far, these efforts are more general than specific, as companies attempt to narrow down what will be effective. Dennis Brophy, vice chairman of Accellera, said the real drivers of these efforts are time-to-market pressures and more complicated, larger systems.

“You clearly can’t start from scratch, so you need to re-use IP,” Brophy said. “That should lead to a more reliable design and quicker verification. But you also have to catalog and store these IP blocks.”

Accellera has put a stake in the ground for system-level IP integration—work is underway to significantly improve IP-XACT. Sonics’ Wingard said what’s really needed is a way of describing the IP that companies are being asked to integrate.

“The days when you spent more money integrating IP than in buying it are over,” he said. “We expect it to be a black box.”
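IP-XACT (IEEE 1685) describes an IP block in vendor-neutral XML so that integration tools can treat it as exactly that kind of black box. As a rough sketch only—the vendor, library, name and version identifiers below are invented—the skeleton of a component description looks like this:

```xml
<spirit:component
    xmlns:spirit="http://www.spiritconsortium.org/XMLSchema/SPIRIT/1685-2009">
  <!-- VLNV: the unique vendor/library/name/version identifier for the block -->
  <spirit:vendor>example.com</spirit:vendor>
  <spirit:library>ip_library</spirit:library>
  <spirit:name>uart_block</spirit:name>
  <spirit:version>1.0</spirit:version>
  <!-- a real description goes on to list bus interfaces, registers,
       ports and file sets so integration tools can consume the block -->
</spirit:component>
```

The point of the format is that the integrator reads this metadata instead of the RTL itself, which is what makes black-box reuse practical.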

Accellera also is pushing for UVM to be part of the system-level verification flow. This is easier said than done, because companies are still investing heavily in VMM and OVM, the verification methodologies that UVM is supposed to supersede. Accellera also is examining what standards will be necessary in software so there is some sort of bridge between SystemC, analog/mixed signal, and SystemVerilog.

Analog/mixed signal and 3D
Analog is a particularly thorny subject when it comes to standards. The sheer complexity of the problems being solved has surpassed the ability of analog designers to do everything manually, requiring far more automation than in the past. In addition, with stacked die looming in the future, a consistent way of describing analog IP is now required, because the analog will probably reside in a separate subsystem or on a separate die that must be integrated with other die.

“This has to be a black box so it can be sold and integrated,” said Simon Butler, CEO of Methodics. “But how do you prove that it works when you get that block? You need a standard way to test it.”

He said that IP-XACT will address some of those concerns with digital IP for a consistent way of creating testbenches and defining what’s in an IP block. Analog is another story entirely.

“In 3D, there will be dependencies created,” he noted. “We need to add context into all of this.”

The road ahead
Si2 has plotted a number of standards it plans to work on in 2012. Topping the list are the following:

  1. OAC: New release of OpenAccess to include scratch designs and other functionality and performance enhancements.
  2. DFMC: OpenDFM 2.x will include DRC+ and other enhancements, while OPEX 2.x will include open parasitic extraction parameters and OpenLVS.
  3. LPC: Updated power modeling standards to support handling power intent and verification for large IP blocks.
  4. OpenPDK: New OPS 1.0, the Open Process Specification, will include a symbol standard, a design parameter standard, and a callback standard, and all other design parameters. In addition, all work started in 2011 will be completed.
  5. Open3D: Standards are expected to be released to address definition of the power distribution network across the die of a 3D stack; thermal design and analysis of an entire 3D stack, including thermal constraints between neighboring dies; and expression of design constraints into and out of the path-finding and floor-planning stages of the overall design process. All work started in 2011 will be completed.

The road behind
Getting rid of the old standards, or at least collapsing them and making them more useful, is a subject no one wants to talk about. But venture capitalist Jim Hogan did have an interesting observation about just how long standards stick around.

At a recent Synopsys interoperability forum, Hogan noted that Roman roads were constructed exactly 47 inches wide to accommodate the two horses used to pull a chariot. He said the distance between railroad rails is that same distance, and the seat in his car is exactly 23.5 inches wide—half of it.

So far, no one has seen a need to adjust that number.