Best-In-Class Tools Lead To Best-In-Class Design?

As SoC complexity has grown, so has the task of verifying that what is implemented on the chip is what the designer intended.


Today’s systems on chip (SoCs) are deeply complex in new ways. A dozen or so years ago, a state-of-the-art processor such as the Intel Pentium 4 used 42 million transistors, was built on a 180nm process and relied upon discrete chips to handle its system interfaces. Jump forward, and the Xeon Phi processor that Intel introduced in 2012 uses 5 billion transistors and is built on a 22nm process. The chip includes sixty 64-bit x86 cores, L2 cache, 8GB of GDDR5 memory, and more. This trend toward massive integration is even stronger in the mobile space, where SoCs bring together multi-core computing, communications and entertainment functions on one die.

Intel Xeon Phi: a 60-core chip.

It’s no longer possible to design all the subsystems of an SoC from scratch and expect to get the chip out in a reasonable timeframe, so today’s SoCs are complex integrations of new logic, IP blocks brought forward from previous designs, and functional and interface IP licensed in from third parties. Some companies are even using third-party IP to build their system interconnect, on the basis that its communications management support and interfaces to other IP blocks will help get a design out more quickly. In effect, an SoC is a sea of interfaces.

The use of IP enables feature-rich products such as the Galaxy tablet, which is good for the consumer, but it brings integration challenges for SoC designers and the verification teams that ensure their designs will work as planned. Individual IP blocks may be as complex as entire SoCs of five years ago, and may have internal clocking and power-management strategies that SoC designers need to be aware of. The integration of these blocks means that signals may have to negotiate up to 100 asynchronous clock domains as they cross block interfaces. Similarly, system-wide power management may involve coordinating strategies both within individual blocks and among many blocks.

Managing the verification of such complex systems is challenging. The designs are large, so designers need best-in-class tools with very high capacities. They need to be able to control the rising tide of uncertainty caused by signals that cross clock domains, and by power-management strategies that create unknown (X) logic states when blocks are turned on and off. Most of all, designers need these tools to tackle such problems at the highest level of abstraction possible, to speed up the verification process and stop issues from multiplying and becoming more obscure as the RTL design is decomposed to gates. Clock domain crossing (CDC) tools, engineered to recognize and analyze crossings for problems, are essential to help control the verification complexity involved in tackling a full SoC.
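To make the structural part of a CDC check concrete, here is a minimal Python sketch of the kind of analysis such a tool performs, assuming a toy netlist model in which each flop records its clock domain and fan-in. The class and function names are illustrative only and do not correspond to any vendor's API.

```python
# Toy illustration of a structural clock-domain-crossing (CDC) check.
# The netlist model and names here are hypothetical, not any tool's API.

from dataclasses import dataclass, field

@dataclass
class Flop:
    name: str
    clock: str                                   # clock domain driving this flop
    fanin: list = field(default_factory=list)    # names of flops driving its D input

def find_unsynchronized_crossings(flops, synchronizers):
    """Flag signals that cross between asynchronous clock domains
    without passing through a declared synchronizer register."""
    by_name = {f.name: f for f in flops}
    issues = []
    for dst in flops:
        for src_name in dst.fanin:
            src = by_name[src_name]
            if src.clock != dst.clock and dst.name not in synchronizers:
                issues.append((src.name, src.clock, dst.name, dst.clock))
    return issues

# Example: 'req' launched in clk_a feeds two destinations in clk_b;
# only 'req_sync1' is the first stage of a declared two-flop synchronizer.
design = [
    Flop("req", "clk_a"),
    Flop("req_sync1", "clk_b", fanin=["req"]),       # synchronized: OK
    Flop("data_capture", "clk_b", fanin=["req"]),    # raw crossing: flagged
]
for src, sclk, dst, dclk in find_unsynchronized_crossings(design, {"req_sync1"}):
    print(f"CDC violation: {src} ({sclk}) -> {dst} ({dclk}) has no synchronizer")
```

A production CDC tool goes much further, of course: it recognizes synchronizer structures automatically, checks handshake and FIFO protocols, and analyzes reconvergence of synchronized signals.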

With the right tools, overcoming these issues can mean more than just a problem solved. By applying the right analytics, the design actually can be functionally improved. For example, linting tools that give early indications of potential design-for-test problems help avoid synthesizing untestable logic. Tools that understand the X states that are created at domain boundaries in complex designs, and how they propagate, can help avoid too much optimism about their impact at the RTL level (where X states can hide real issues), and too much pessimism about their impact at the gate level. The result: Better analysis leads to better designs.
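To illustrate that optimism/pessimism gap, the sketch below uses a toy three-valued logic ('0', '1', 'x') in Python. It does not reproduce the semantics of any particular simulator, but it shows how an RTL-style if statement quietly hides an unknown select, while the equivalent gate-level mux reports 'x' even though both data inputs agree and the output is actually known.

```python
# Three-valued logic sketch ('0', '1', 'x') illustrating X-optimism vs X-pessimism.
# Purely illustrative; not the semantics of any particular simulator.

def mux_rtl(sel, a, b):
    """RTL-style 'if (sel) y = a; else y = b;'
    An unknown select falls through to the else branch, so the X is
    silently hidden -- the optimism that can mask real issues."""
    return a if sel == '1' else b

def and3(p, q):
    """Gate-level AND in 3-valued logic."""
    if p == '0' or q == '0':
        return '0'
    if p == '1' and q == '1':
        return '1'
    return 'x'

def or3(p, q):
    """Gate-level OR in 3-valued logic."""
    if p == '1' or q == '1':
        return '1'
    if p == '0' and q == '0':
        return '0'
    return 'x'

def not3(p):
    return {'0': '1', '1': '0', 'x': 'x'}[p]

def mux_gates(sel, a, b):
    """Gate-level mux: y = (a & sel) | (b & ~sel).
    With sel = 'x' the result is 'x' even when a == b -- pessimistic,
    because the output is actually known in that case."""
    return or3(and3(a, sel), and3(b, not3(sel)))

# Both data inputs are '1', so the true answer is '1' regardless of sel.
print(mux_rtl('x', '1', '1'))    # '1' -> optimistic: the X never shows up
print(mux_gates('x', '1', '1'))  # 'x' -> pessimistic: a known value reported as X
```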

The same is true with ‘resettability analysis,’ which considers whether it is necessary to take a reset signal to every node that the design says needs one, or whether it is possible to restrict the distribution of reset signals to a subset of critical nodes. If an early analysis shows that some nodes can do without a reset signal without affecting the logical function, that means less routing and lower power consumption on the final chip. Again, better analysis leads to a better design.
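As a rough illustration of the idea (not any tool's actual algorithm), the sketch below simulates a small pipeline twice with different power-up values for one un-reset register. If the observable outputs agree once the pipeline has flushed, that register's initial value is functionally irrelevant and its reset wire is a candidate for removal. The pipeline, signal names, and settle window are all hypothetical.

```python
# Illustrative resettability check: does the power-up value of a flop matter?
# The design is a 3-stage pipeline; stage0 is reloaded with fresh input on the
# first clock edge, so its power-up value only influences the first couple of
# output cycles. This is a toy demonstration, not a production reset analysis.

def simulate(initial_stage0, inputs):
    """3-stage pipeline: stage0 <- input, stage1 <- stage0, stage2 <- stage1.
    Only stage2 is observable at the output."""
    s0, s1, s2 = initial_stage0, 0, 0     # stage1/stage2 assumed reset to 0
    outputs = []
    for din in inputs:
        s2, s1, s0 = s1, s0, din          # one clock edge
        outputs.append(s2)
    return outputs

def reset_needed_on_stage0(inputs, settle_cycles=2):
    """Compare output streams for two power-up values of stage0, ignoring the
    first few settle cycles. If the streams match, stage0 does not need a
    dedicated reset to preserve the observable function."""
    run_a = simulate(0, inputs)[settle_cycles:]
    run_b = simulate(1, inputs)[settle_cycles:]
    return run_a != run_b

stimulus = [3, 1, 4, 1, 5, 9, 2, 6]
print("stage0 needs reset:", reset_needed_on_stage0(stimulus))   # False
```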

As SoCs have become more complex, so has the task of verifying that what is implemented on chip is what the designer intended. No single verification approach or analysis can deliver the certainty that design teams need to tape out, but a suite of best-in-class tools that each address a particular issue can build that confidence. Applying the tools may also do more than avoid errors. Best-in-class analysis early in the design process can avoid issues propagating and, by highlighting which issues matter, give designers the freedom to create best-in-class designs.

