The Road Ahead For 2014: Tools

New approaches, higher levels of abstraction, increased integration and more market consolidation will define the semiconductor industry this year.

In the third and final part of this predictions series, we see the natural conclusion of the market shifts that are driving changes in semiconductors, which in turn drive the tools and IP needed to create those systems.

As might be expected, the changes fall into a few areas:

  • New tools, techniques and changes required for smaller geometries;
  • A migration to higher levels of abstraction and the changing verification needs that come with it;
  • Increasing integration of technologies such as analog, digital and sensors; and
  • Acquisitions and consolidation.

Each time the fabrication geometry shrinks, additional manufacturing rules apply to keep yield levels acceptable. “These smaller nodes are forcing foundries to find ways to better characterize and model process variations, and also pushing designers to use better Design For Yield (DFY) tools and methodologies to deal with increasing device and design complexities. EDA must respond,” says Zhihong Liu, executive chairman of ProPlus Design Solutions.
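
To see why better variation models matter, consider the statistical picture behind DFY: each die effectively samples its device parameters from a distribution, and parametric yield is the fraction of samples that still meet spec. The following toy Monte Carlo sketch illustrates the calculation (the distribution, sensitivity and delay numbers are invented for illustration, not a real foundry model):

```python
import random

# Toy Monte Carlo yield estimate: sample a threshold-voltage shift per die
# from a normal distribution and count dies whose resulting path delay
# still meets timing. All numbers below are made up for illustration.
random.seed(0)
NOMINAL_DELAY_PS = 100.0
SPEC_PS = 110.0
SENSITIVITY = 0.8   # ps of added delay per mV of Vt shift (invented)
SIGMA_MV = 10.0     # one-sigma Vt variation (invented)

trials = 100_000
good = sum(
    NOMINAL_DELAY_PS + SENSITIVITY * abs(random.gauss(0, SIGMA_MV)) <= SPEC_PS
    for _ in range(trials)
)
print(f"estimated parametric yield: {good / trials:.1%}")
```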

But the economics of this market also are changing. “Implementation tools will continue to be challenged by conflicting requirements of technology advances against the shrinking customer base that can afford the costs at these nodes,” adds Bernard Murphy, chief technology officer at Atrenta.

At 20nm, double patterning (DP) became essential, and the technical hurdles stretch further with the incorporation of a new transistor type, the finFET, as we migrate to the 16/14nm node. “The implementation tool needs to take into account interactions between DP constraints and the layout restrictions,” says Joseph Sawicki, vice president and general manager of the Design-to-Silicon Division at Mentor Graphics. “This needs to happen at every stage of the flow, including placement, routing and optimization. The extraction engine must be able to accurately model the 3D parasitic requirements of finFETs.”
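
Double patterning itself is commonly framed as a graph-coloring problem: any two features closer than the single-exposure spacing limit get an edge in a conflict graph, and a legal decomposition is a two-coloring of that graph, one color per mask. A minimal sketch of the idea, with invented feature coordinates and spacing threshold:

```python
from collections import deque

# Hypothetical layout features (x, y centers) and a minimum same-mask spacing.
features = [(0, 0), (1, 0), (2, 0), (0, 2)]
MIN_SPACING = 1.5  # features closer than this must go on different masks

def conflict(a, b):
    """Two features conflict if they are too close to share one exposure."""
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5 < MIN_SPACING

# Build the conflict graph.
n = len(features)
adj = {i: [] for i in range(n)}
for i in range(n):
    for j in range(i + 1, n):
        if conflict(features[i], features[j]):
            adj[i].append(j)
            adj[j].append(i)

# Two-color it with BFS: mask 0 or mask 1 per feature.
mask = {}
for start in range(n):
    if start in mask:
        continue
    mask[start] = 0
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in mask:
                mask[v] = 1 - mask[u]
                queue.append(v)
            elif mask[v] == mask[u]:
                raise ValueError(f"odd conflict cycle at features {u},{v}")

print(mask)  # mask assignment per feature, e.g. {0: 0, 1: 1, 2: 0, 3: 0}
```

An odd cycle in the conflict graph means no two-mask assignment exists, which is exactly the kind of interaction between DP constraints and layout restrictions that Sawicki describes.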

Dave Noble, vice president of North America operations at Pulsic, sees problems ahead. “Legacy (digital) tools are being ‘updated’ to address finFETs, but they were designed for 65nm/90nm so they are running out of steam. We anticipate that new approaches will be introduced. The evolution of the finFET era necessitates new neural tools that can ‘think’ for themselves and anticipate the required behavior (DRC-correct APR) given a set of inputs (DRC and process rules). Tools will be required that can generate layout, undertake all placement permutations, complete routing for each permutation and ensure that it is DRC correct — in a single iteration.”
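
Stripped of the scale, the search Noble describes has a simple shape: enumerate candidate placements, legalize each one, and keep only results that pass the design rules. The toy sketch below shows that structure (the cell widths and “rules” are invented); real tools obviously prune the search rather than enumerate exhaustively:

```python
from itertools import permutations

# Toy stand-ins: cell widths and two invented "process rules".
cells = {"A": 2, "B": 4, "C": 1}
MIN_GAP_AFTER_WIDE = 1   # rule 1: extra spacing after any cell wider than 3
MAX_ROW_WIDTH = 9        # rule 2: total row extent limit

def place_and_check(order):
    """Place cells left to right; return (placement, width) or None on a rule fail."""
    x, placement = 0, {}
    for name in order:
        placement[name] = x
        x += cells[name] + (MIN_GAP_AFTER_WIDE if cells[name] > 3 else 0)
    return (placement, x) if x <= MAX_ROW_WIDTH else None

# Enumerate every permutation, keep only rule-clean results, pick the tightest.
clean = [r for r in (place_and_check(p) for p in permutations(cells)) if r]
best = min(clean, key=lambda r: r[1])
print(best)
```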

As fabrication geometries get smaller, the size and complexity of systems are increasing. This is causing a reworking at the front end of the design process, where RTL is no longer adequate. Some are migrating to high-level synthesis to gain additional productivity. “The design flow needs to be geared toward creating bug-free RTL designs,” says Shawn McCloud, vice president of marketing at Calypto Design Systems. “This can be realized today by automating the generation of RTL from exhaustively verified C-based models.”
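
The reason exhaustive verification is tractable at the C level is that a bit-accurate model with a modest input space can be checked against a golden reference over every input before any RTL is generated. A minimal sketch, using an invented 8-bit saturating adder as a stand-in for such a model:

```python
# "Exhaustive" model verification: for a bit-accurate block with a small
# input space, every input combination can be checked against a golden
# reference. The 8-bit saturating adder is an invented stand-in for a
# C/SystemC model feeding an HLS flow.

def sat_add8(a, b):
    """Model under test: 8-bit unsigned saturating add."""
    s = a + b
    return 255 if s > 255 else s

def golden(a, b):
    """Independent reference model."""
    return min(a + b, 255)

for a in range(256):        # all 65,536 input pairs
    for b in range(256):
        assert sat_add8(a, b) == golden(a, b), (a, b)
print("all 65,536 cases match")
```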

The bigger view
At least part of what’s driving change is the unbridled growth in complexity in advanced SoCs.

“The United States is the fastest growing region for SystemC-based HLS usage,” says Brett Cline, vice president of marketing and sales at Forte Design Systems. “This is due in large measure to low-power requirements for complex chip designs for mobile consumer devices.”

But it is the verification of those input models and the rest of the system that is attracting the most attention. “C and SystemC verification today is rudimentary, relying primarily on directed tests,” explains McCloud. “These approaches lack the sophistication that hardware engineers employ at the RTL, including assertions, code coverage, functional coverage, and property-based verification.”
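
Those mechanisms can be approximated at the model level today. The sketch below shows an inline assertion plus functional-coverage bins wrapped around randomized model-level tests (the FIFO-occupancy function and the bins are invented for illustration); an empty bin at the end flags a coverage hole:

```python
import random

# Model-level functional coverage: invented bins tracking which occupancy
# regions a FIFO model has actually been exercised in, plus an inline
# assertion on the model's invariant.
coverage = {"empty": 0, "mid": 0, "full": 0}
DEPTH = 16

def fifo_occupancy(pushes, pops):
    occ = max(0, min(DEPTH, pushes - pops))
    assert 0 <= occ <= DEPTH          # assertion: occupancy stays in range
    if occ == 0:                      # record functional coverage
        coverage["empty"] += 1
    elif occ == DEPTH:
        coverage["full"] += 1
    else:
        coverage["mid"] += 1
    return occ

random.seed(1)
for _ in range(1000):                 # randomized model-level tests
    fifo_occupancy(random.randrange(32), random.randrange(32))

# A zero bin means the tests never hit that condition: a coverage hole.
print(coverage)
```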

Part of the solution is the creation of a complete system-level model. “2014 will be the year of the virtual prototype,” predicts Bill Neifert, chief technology officer at Carbon Design Systems. “The modern virtual prototype is now used throughout the design cycle to drive early architectural decisions, configure IP and to develop firmware and debug software. This creates a new set of design challenges as prototypes need to find a way to be both fast and accurate depending upon the needs of the end user.”

“The trends in complexity have begun to show signs that they will soon outpace what’s currently available in verification technology and methodology,” says Michael Sanie, senior director of verification marketing at Synopsys. “In 2014, the industry will begin its journey into new levels of verification integration and productivity.”

Some see an extension to existing methodologies solving these issues, but Adnan Hamid, CEO of Breker Verification Systems, disagrees. “We expect an abandonment of the Universal Verification Methodology (UVM) for system-on-chip (SoC) simulation. While the UVM has proven effective for IP blocks and subsystems, its limited vertical reuse and lack of links to embedded processors make it impractical for significant full-chip simulation.”

Others in the industry are at least in partial agreement. “There will be a trend toward software-based analysis and verification of SoCs,” says Atrenta’s Murphy, “largely skipping over traditional testbench-based verification. This will likely spur innovation connecting software use-cases to implementation characteristics such as power and enhanced debug tools to bridge the gap between software and detailed implementation problems.”

Carbon’s Neifert agrees: “Solutions are coming out to automate the creation of software tests to exercise the complete system and to create power vectors to run through an analysis tool. This data is then used to instrument the virtual prototype for future runs.”

Getting formal
Another area of verification that is making inroads into traditional simulation-based verification involves formal methods. “Growth in formal verification, be it for direct usage or cloaked as static verification solutions, is hard to ignore,” says Raik Brinkmann, president and CEO of OneSpin Solutions. “When we look back, 2014 will be seen as the year of formal verification, as 2009 was the year of emulation.”

The usage of formal methods also is growing geographically. “We see more expert formal adoption in Asia in 2014,” states Vigyan Singhal, president and CEO of Oski Technology. “Asian semiconductor companies have deep concerns about their current verification methodologies and sign-off flows, which include subsystem and SoC simulation and emulation. Both have caused numerous project delays and missed bugs after tapeout.”

But formal methods are not trying to be everything to everyone. “To handle giga-scale verification of SoC chips, I see more specialized tools that focus on just one problem area and thereby enable more effective verification,” explains Graham Bell, vice president of marketing at Real Intent. “Because of the narrow design intent, new static solutions will provide the necessary speed and capacity.”

One of the market trends discussed was the growing importance of the Internet of Things (IoT). This is requiring greater levels of integration. “There is an accelerating convergence of disparate functional blocks within electronic systems, resulting in a growth of mixed-signal design starts,” notes Steve Smith, senior director of marketing for mixed-signal verification at Synopsys. “Designs increasingly use integrated audio amplifiers, wireless modems, data communication interfaces and so on.”

The verification of mixed-signal designs can be a challenge, especially given the relatively long time it can take to run full mixed-signal simulation, Smith adds. “During 2014 and beyond, we will see increased use of regression testing for analog/mixed-signal designs, where metric-driven verification techniques will be broadly deployed.”
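
In practice, metric-driven AMS regression looks like a loop over process/temperature corners, with measured metrics checked against specs. A minimal sketch, where run_sim() and the metric names are hypothetical stand-ins for a real simulator flow:

```python
# Metric-driven regression wrapper for an AMS block: simulate each corner,
# compare measured metrics against specs, and pass only if every metric in
# every corner is in range. run_sim() and all values are invented stand-ins.

SPECS = {"gain_db": (58.0, None), "offset_mv": (None, 2.0)}  # (min, max)
CORNERS = ["tt_25c", "ss_125c", "ff_m40c"]

def run_sim(corner):
    """Stand-in for launching a mixed-signal simulation; returns metrics."""
    canned = {
        "tt_25c":  {"gain_db": 60.1, "offset_mv": 0.8},
        "ss_125c": {"gain_db": 58.4, "offset_mv": 1.6},
        "ff_m40c": {"gain_db": 61.0, "offset_mv": 1.1},
    }
    return canned[corner]

def check(metrics):
    """Return the list of (metric, value) pairs that violate their spec."""
    fails = []
    for name, (lo, hi) in SPECS.items():
        v = metrics[name]
        if (lo is not None and v < lo) or (hi is not None and v > hi):
            fails.append((name, v))
    return fails

for corner in CORNERS:
    fails = check(run_sim(corner))
    print(corner, "PASS" if not fails else f"FAIL {fails}")
```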

But the extra integration is not just between digital and analog. “Greater integration of sensors and analog electronics into SoCs is driving the need for system-level modeling of mixed-signal designs with workflows that connect to lower-level hardware simulators,” says Ken Karnofsky, senior strategist for signal processing applications at MathWorks. “We also need new workflows for the integrated hardware/software environment.”

Stepping back
At the beginning of last year, many in the industry expected the wave of consolidation to continue, as Synopsys had been buying companies at a rapid pace. Predictions for continued consolidation this year are more tepid. “Since much of the revenue growth in the EDA industry is on the IP side of the street, I see further acquisitions by Cadence in that area,” says Real Intent’s Bell.

Oz Levia, vice president of marketing and business development at Jasper Design Automation, expects that “EDA will continue to see consolidation, with verification, emulation and SIP growing faster than other segments. Investment levels will continue to be low, although large EDA vendors will make increasing investments in SIP and verification technologies.”

In December, Semiconductor Engineering will look back on all of the predictions and see who turned out to be right and which events were not foreseen by our industry experts.


