Methodology Shifts Ahead

SoC creation now focuses on integration and analysis rather than block implementation first, analysis second.


By Pallab Chatterjee

The high cost of SoC development at advanced process nodes is forcing a significant shift in many of the methodologies used in design.

Hierarchical design methods are giving way to IP integration and hierarchical analysis at the architectural and functional design levels. Previously, large blocks were implemented at the top level of the chip, and analysis was pushed off until those blocks were done and the chip could be checked as a whole. The rising complexity of today’s designs, and the difficulty of interpreting results from current EDA tools at that scale, make this approach unsustainable.

This shift in design tasks has been a major point of discussion at a number of recent industry events. Integration and analysis as the new focal point was presented formally for discussion by Jim Hogan and Paul McLellan at the recent ICCAD conference, and the theme was amplified through the rest of ICCAD as well as at ARM’s Techcon3, the MEMS Executive Congress and the Low Power Workshop. While the main context of Hogan and McLellan’s discussion was EDA business models and where the rapidly shrinking profit margins and value reside in the design flow, the technical conferences presented panels and papers exemplifying the new focal point and methodology.

The role of the integration phase has remained largely unchanged since the start of IC design. It is separated into three levels: component/device design, IP/block design, and architectural/system design integration. While the breadth of the work at these levels has grown and now includes level-specific analysis, the overall scope of the levels has stayed fairly constant over the past 30 years. Component/device design has shifted from a common task performed by all semiconductor companies to a specialized task performed by just a small portion of the supply chain and specialty semiconductor firms.

The MEMS and Low Power events focused on base process technology and new application device areas. Both areas, which are currently seeing double-digit revenue growth, center on traditional component-level design and on optimizing the process for each device’s functional performance. The MEMS and low-power markets have joined standard-product memories in largely shifting out of the modern design ecosystem, forcing them back to the old “IDM-style” design flow. Lacking transportable, standard design tools, these markets create custom devices, then IP blocks, and then complete designs on in-house flows for in-house standard-product chips.

There is no functional multi-company IP market in these channels. The primary analysis tools operate at the mathematical, mechanical and physics levels rather than being transportable through high-level languages. The analysis also is very company- and function-specific rather than standards-based.

Following this trend, the fabs and IDMs that remain in the basic device-creation market have been focusing on building customized software tools to support integration and analysis. ICCAD featured several papers on NBTI (negative bias temperature instability), SEE (single event effects), thermal issues, and yield and reliability tools and models created by these device manufacturers to perform fab-line-specific analysis. These are not general-purpose tools with large target audiences. Instead, they are being created by the fabs and IDMs, in conjunction with universities, for internal use.

At the ARM event, ARM was positioned as the primary provider of transistor-level design knowledge for sub-90nm processes, surrounded by an ecosystem that includes a large number of design partners who use these transistor-level elements as functional blocks. Among them are IP and software companies targeting the next major level of the design hierarchy, which includes analyzing and optimizing these IP blocks. Correspondingly, the technical sessions were no longer focused on the creation and use of IP in a given technology node, but on the integration and interoperability of IP blocks to implement functions.

Follow the money
The biggest shift in the design trend is that most of the integration and analysis remains solely a hardware task, even though the largest portion of SoC development is the application software. That effort is currently both the largest segment of the development cost and the largest manpower allocation on a project. The software is implemented at multiple levels: microcode that controls in-hardware state machines and embedded processors/controllers, standard interface-control firmware (such as for DDR3 memory control), and higher-level code such as ECC, operating systems, GUIs and in-system applications.

The software requires co-verification and iteration of the logic hardware, possibly the selected IP, and the software/firmware itself. This work is now performed in high-level languages, typically many levels of abstraction above the mathematics- and physics-level issues of the manufacturing process. As a result, there are many traps on the path to high-level SoC designs that are physically realizable and able to yield.

The biggest challenges facing cost-effective yield in new SoC designs aren’t necessarily the lithographic process or the actual wafer fab. More important is that designs are being created at high levels of abstraction without regard for the realities of sub-wavelength active transistors that must be manufactured in high volume. System designers have been hiding behind the “comfort” of ESL and high-level EDA tools, and have lost touch with the devices that make up the functions. As a result, there is little regard or respect for the concept of a single chip with billions of devices on it, all of which have to work as planned.

Engineers at the conferences say many of the issues stem from the commercial EDA vendors that build SoC tools having spent literally decades away from the semiconductor manufacturing and device R&D floor, creating tools that produce algorithmically and mathematically valid solutions that are nonetheless physically unrealistic and cannot be implemented. Because the vendors do not have a good feel for what makes a solution valid, the solutions from their tools often are not valid from an engineering perspective. An example of this gap in understanding is the creation of hardware logic solutions with firmware control that exhibit a fatal error rate in the 1-part-per-million region. Even taking the optimistic view that these errors occur at 1 part per 100 million, a device with more than 1 billion transistors would still contain more than 10 fatal errors.
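To make the arithmetic behind that example explicit, here is a minimal back-of-the-envelope sketch. It assumes the quoted rates are per-transistor fatal error rates, which the article does not state precisely; the numbers are illustrative, not measured data.

```python
# Back-of-the-envelope estimate of expected fatal errors in a large SoC.
# Assumption: the quoted rates are fatal errors per transistor (illustrative only).

transistor_count = 1_000_000_000           # a 1 billion-transistor device

rate_cited = 1 / 1_000_000                 # ~1 part per million, as cited
rate_optimistic = 1 / 100_000_000          # optimistic case: 1 part per 100 million

print(transistor_count * rate_optimistic)  # 10.0   -> more than 10 fatal errors
print(transistor_count * rate_cited)       # 1000.0 -> two orders of magnitude worse
```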

The knowledge base of creation, integration and verification will need to be re-unified to address this issue. Only then can the industry reverse the point-tool segregation of the problem that the EDA industry has promoted for the past 20 years, so that modern SoC design does not once again become the domain of just a few semiconductor companies.


