Industry Scorecard For 2014

Second of two parts: It is easy to make predictions, but the best in the industry have a track record of getting things spot on. How well did those 2014 predictions turn out and who was closest to the mark?


At the end of last year, Semiconductor Engineering asked the industry about the developments it expected to see in 2014. If you care to refresh your memory, they were categorized under markets, semiconductors and development tools. Now it is time to look back and see how accurate those predictions were and where they fell short. Part one addressed the market and semiconductor areas, and in this part we examine tools and flows. Some of these changes are required by the new geometries and others by the complexity of the devices those geometries enable.

Starting with the geometries, Zhihong Liu, executive chairman for ProPlus Design Solutions, had said that “smaller nodes are forcing foundries to find ways to better characterize and model process variations, and also pushing designers to use better Design For Yield (DFY) tools and methodologies to deal with increasing device and design complexities.” Looking back over the year’s progress, Liu states that “the semiconductor industry witnessed increasing interactions between the foundry, fabless and EDA vendors to deal with yield problems related to process variations.” In terms of tools, Liu notes that new tools and features included modeling of process variation effects, DFY tools with fast Monte Carlo, and fast process, voltage and temperature (PVT) simulation technologies to trade off yield against power, performance and area (PPA).
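
To make the DFY idea concrete, here is a minimal sketch of Monte Carlo yield estimation under process variation, the kind of analysis that fast Monte Carlo engines accelerate. The parameter distributions and the delay/leakage model are hypothetical stand-ins for foundry-characterized data, not anything from ProPlus or the article.

```python
# Minimal Monte Carlo yield-estimation sketch (illustrative only).
# The delay/leakage model and distributions below are hypothetical stand-ins
# for foundry-characterized process variation models.
import random

def sample_process_corner():
    """Draw one random process sample (toy parameters, not real foundry data)."""
    vth = random.gauss(0.35, 0.02)     # threshold voltage (V), assumed spread
    leff = random.gauss(20e-9, 1e-9)   # effective channel length (m), assumed spread
    return vth, leff

def circuit_metrics(vth, leff, vdd=0.8):
    """Toy model: delay rises with vth and leff, leakage falls as vth rises."""
    delay = 1e-10 * (leff / 20e-9) / max(vdd - vth, 0.05)
    leakage = 1e-6 * pow(10, (0.35 - vth) / 0.1)
    return delay, leakage

def estimate_yield(n_samples=100_000, max_delay=2.6e-10, max_leakage=5e-6):
    """Fraction of samples meeting both the timing and the leakage spec."""
    passing = 0
    for _ in range(n_samples):
        delay, leakage = circuit_metrics(*sample_process_corner())
        if delay <= max_delay and leakage <= max_leakage:
            passing += 1
    return passing / n_samples

if __name__ == "__main__":
    print(f"Estimated parametric yield: {estimate_yield():.3%}")
```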

There was increasing concern about who would pay for all of the new research and development required for the smaller nodes. So far, the economics appear to be working, but those who invested the dollars have yet to see any significant payback.

Joe Sawicki of Mentor Graphics notes that they “have qualified tools and design rule decks in place for 16/14nm at all the major foundries and customers as well as incorporation of coloring into the entire design flow.” But the investment does not stop there. “While double patterning is well in hand, we are hard at work on the complexities of triple (TP) and quadruple patterning (QP), which may be needed at 10nm and beyond if EUV does not pan out. TP and QP pose some interesting challenges because the design rules associated with these approaches require exponentially increasing computational power as the number of polygons in the layout grows.”
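
The computational blow-up Sawicki describes comes from the fact that assigning layout features to masks is essentially a graph-coloring problem: double patterning is 2-coloring, which can be checked efficiently, while triple and quadruple patterning correspond to 3- and 4-coloring, which are NP-hard in general. The sketch below is a toy illustration of that framing, with made-up coordinates and spacing rules rather than real design-rule decks.

```python
# Illustrative sketch of layout decomposition as graph coloring (not a production
# decomposer). Polygons whose spacing is below the same-mask minimum become edges
# in a conflict graph; double patterning is 2-coloring, triple patterning 3-coloring.
from itertools import combinations

def build_conflict_graph(centers, min_spacing):
    """Toy conflict graph: nodes are polygon centers, edges join pairs closer than min_spacing."""
    graph = {i: set() for i in range(len(centers))}
    for (i, (xi, yi)), (j, (xj, yj)) in combinations(enumerate(centers), 2):
        if (xi - xj) ** 2 + (yi - yj) ** 2 < min_spacing ** 2:
            graph[i].add(j)
            graph[j].add(i)
    return graph

def color_layout(graph, num_masks):
    """Backtracking k-coloring; worst-case cost grows exponentially with node count,
    which is why triple/quadruple patterning is so much harder than double."""
    colors = {}

    def assign(node):
        if node == len(graph):
            return True
        for mask in range(num_masks):
            if all(colors.get(nbr) != mask for nbr in graph[node]):
                colors[node] = mask
                if assign(node + 1):
                    return True
                del colors[node]
        return False

    return colors if assign(0) else None   # None => not decomposable with num_masks

# Hypothetical example: three features too close together for two masks, fine with three.
centers = [(0, 0), (1, 0), (0.5, 0.8), (3, 3)]
print(color_layout(build_conflict_graph(centers, min_spacing=1.2), num_masks=2))  # None
print(color_layout(build_conflict_graph(centers, min_spacing=1.2), num_masks=3))  # a valid 3-coloring
```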

Many tool predictions focused on verification, and 2014 certainly was a big year. “2014 will be the year of the virtual prototype,” predicted Bill Neifert, chief technology officer at Carbon Design Systems. “We’ve seen tremendous growth both in the number of virtual prototype users and in the use cases for which they are being applied.” Use cases have grown from architectural optimization and firmware development to now include system-level performance optimization, according to Neifert.
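
For readers less familiar with the term, the sketch below shows the flavor of a loosely timed virtual prototype: a functional register model of a hypothetical peripheral with coarse timing annotations, enough for firmware development and early performance estimates. The device, registers and latencies are illustrative assumptions, not a Carbon model.

```python
# A minimal sketch of a loosely timed virtual prototype model: functional register
# behavior plus coarse timing annotations. Register map and latencies are invented.
class TimerPrototype:
    """Functional + loosely timed model of a hypothetical timer peripheral."""
    REG_CTRL, REG_COUNT = 0x00, 0x04

    def __init__(self):
        self.regs = {self.REG_CTRL: 0, self.REG_COUNT: 0}
        self.elapsed_ns = 0                  # coarse time annotation, not cycle accurate

    def write(self, addr, value):
        self.regs[addr] = value
        self.elapsed_ns += 10                # assumed bus-write latency

    def read(self, addr):
        self.elapsed_ns += 10                # assumed bus-read latency
        if addr == self.REG_COUNT and self.regs[self.REG_CTRL] & 1:
            self.regs[self.REG_COUNT] += 1   # timer ticks only while enabled
        return self.regs[addr]

# "Firmware" runs unchanged against the prototype long before RTL exists.
timer = TimerPrototype()
timer.write(TimerPrototype.REG_CTRL, 1)          # enable the timer
while timer.read(TimerPrototype.REG_COUNT) < 5:  # poll until the count reaches 5
    pass
print(f"approx. time spent polling: {timer.elapsed_ns} ns")
```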

Michael Sanie, senior director of verification marketing at Synopsys, saw that “trends in complexity have begun to show signs that they will soon outpace what’s currently available in verification technology and methodology,” and this has been exacerbated by the increased use cases for the virtual prototype. “SoC verification teams continue to be driven by not only reducing functional bugs, but also by how early they can achieve software bring-up for the SoCs.” Because of this, Sanie sees the need for “integrated solutions that combine verification planning, simulation, static and formal verification, verification closure, verification IP, and FPGA-based prototyping.”

At the beginning of the year, Shawn McCloud, vice president of marketing at Calypto, identified RTL verification as a bottleneck in the process and said “the design flow needs to be geared toward creating bug-free RTL designs.” By this he meant automating the generation of RTL from exhaustively verified C-based models, which execute 1,000x to 10,000x faster than RTL code and provide better coverage. Looking back, McCloud says he was too early. “C and SystemC verification today is rudimentary, relying primarily on directed tests. These approaches lack the sophistication that hardware engineers employ at the RTL, including assertions, code coverage, functional coverage, and property-based verification. For a dependable HLS flow, you need to have a very robust verification methodology, and you need metrics and visibility.”
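
As an illustration of the gap McCloud describes, the sketch below (in Python, standing in for a C/SystemC testbench) applies the metric-driven techniques common at RTL to a high-level model: random stimulus, an assertion acting as the checker, and a simple functional-coverage collector. The saturating-adder design under test is a hypothetical example, not something from Calypto.

```python
# Metric-driven verification of a high-level model: random stimulus, an assertion
# as the checker, and a functional-coverage collector. DUT and bins are invented.
import random

def saturating_add(a, b, max_val=255):
    """Design under test: 8-bit saturating adder model."""
    return min(a + b, max_val)

coverage_bins = {"no_saturation": 0, "exact_max": 0, "saturated": 0}

def sample_coverage(a, b, max_val=255):
    """Record which interesting corner each transaction exercised."""
    if a + b < max_val:
        coverage_bins["no_saturation"] += 1
    elif a + b == max_val:
        coverage_bins["exact_max"] += 1
    else:
        coverage_bins["saturated"] += 1

for _ in range(10_000):
    a, b = random.randint(0, 255), random.randint(0, 255)
    result = saturating_add(a, b)
    # Assertion-style check against the reference behavior.
    assert result == min(a + b, 255), f"mismatch for {a}+{b}"
    sample_coverage(a, b)

hit = sum(1 for count in coverage_bins.values() if count)
print(f"functional coverage: {hit}/{len(coverage_bins)} bins hit -> {coverage_bins}")
```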

One area that saw major action was formal verification. David Kelf, vice president of marketing for OneSpin Solutions, notes that “with Cadence’s $170M acquisition of Jasper, it was clear that at least one major EDA company realized the importance of the technology enough to make a significant investment.” At the beginning of the year, Oski Technology had predicted a broadening of formal adoption into Asia. Jin Zhang, senior director of marketing for Oski, responds that “over a third of Oski’s 2014 revenue came from Asian accounts, rising from less than a tenth in 2012 and 2013.”

Another area that was expected to see a lot of activity was SoC-level verification, and Adnan Hamid, chief executive officer of Breker, made a bold claim that “we expect an abandonment of the Universal Verification Methodology (UVM) for system-on-chip (SoC) simulation.” Hamid noted that while the UVM has proven effective for IP blocks and subsystems, “its limited vertical reuse and lack of links to embedded processors make it impractical for significant full-chip simulation.”

While UVM is still alive and well, his prediction was spot on in terms of the industry accepting that UVM is not the obvious choice for system-level verification. Hamid adds that “the world is clearly moving away from the UVM for full-chip simulation. In fact, in mid-year the Accellera standards body formed a proposed working group to look beyond the UVM for solutions to ‘portable stimulus’ reusable both vertically from IP to system and horizontally from simulation to silicon.”
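
The portable stimulus idea is easiest to see with a small example. The sketch below walks a toy scenario graph to produce an abstract test and then renders the same test two ways, once as a UVM-style sequence for simulation and once as embedded C calls for silicon bring-up, which is the vertical and horizontal reuse Hamid refers to. The scenario and both renderers are invented for illustration and do not reflect Breker's products or the eventual Accellera standard.

```python
# A minimal sketch of graph-based "portable stimulus": one abstract scenario graph,
# walked randomly, then rendered to different execution targets.
import random

# Scenario graph: each action lists the legal actions that may follow it.
SCENARIO = {
    "start":        ["config_dma", "config_uart"],
    "config_dma":   ["dma_transfer"],
    "config_uart":  ["uart_send"],
    "dma_transfer": ["check_result"],
    "uart_send":    ["check_result"],
    "check_result": [],
}

def walk(graph, node="start"):
    """Randomly walk the scenario graph to produce one abstract test."""
    path = []
    while graph[node]:
        node = random.choice(graph[node])
        path.append(node)
    return path

def render_uvm_like(path):
    """Horizontal target 1: a UVM-style sequence for block/SoC simulation."""
    return [f"seq.do_action('{step}')" for step in path]

def render_embedded_c(path):
    """Horizontal target 2: C calls that run on the SoC's own processor in silicon."""
    return [f"{step}();" for step in path]

test = walk(SCENARIO)
print("\n".join(render_uvm_like(test)))
print("\n".join(render_embedded_c(test)))
```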

The one area of verification that has not fully panned out was the prediction by Ken Karnofsky, senior strategist for signal processing applications at MathWorks. He saw a demand for “greater integration of sensors and analog electronics into SoCs” and said that it was “driving the need for system-level modeling of mixed-signal designs with workflows that connect to lower level hardware simulators.” He now sees that this is “further out than one year. The constant expansion and evolution of devices will continue to put an emphasis on wireless and networking, making them a long-term trend.” Karnofsky did note that more people are looking at combined hardware/software workflows and that they “are seeing added attention to integration and verification.”

One area that was mentioned, but whose importance this year the predictions did not fully anticipate, is power optimization and verification. “The requirement to reduce power, and specifically dynamic power, has continued to grow during 2014,” says Calypto’s McCloud. “As more users adopted RTL power analysis, they discovered that while essential, it did not solve their problem of how to create power-efficient RTL. Power analysis tells you where you are, but determining all the possible and effective enable conditions for clock gating is often quite difficult to do manually.”
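
To give a feel for why finding those enable conditions by hand is tedious, here is a small sketch of the kind of analysis a sequential power-optimization tool automates: mining a simulation trace for control signals under which a register never changes, making them candidate clock-gating enables. The trace and signal names are invented, and a real tool would still need formal or structural analysis to prove a candidate safe.

```python
# Illustrative sketch: mine a simulation trace for candidate clock-gating enables.
# A signal is a candidate if, in the trace, the register never changes value on a
# cycle where that signal was low. A trace only suggests candidates; proving one
# safe requires formal or structural analysis. All data below is hypothetical.
def find_gating_enables(reg_values, candidate_signals):
    """reg_values: per-cycle register values; candidate_signals: {name: per-cycle 0/1}."""
    enables = []
    for name, values in candidate_signals.items():
        safe_in_trace = all(
            reg_values[cycle] == reg_values[cycle - 1] or values[cycle - 1] == 1
            for cycle in range(1, len(reg_values))
        )
        if safe_in_trace:
            enables.append(name)
    return enables

# Toy trace: the register only changes on cycles where 'fifo_push' was asserted.
reg_trace = [0, 0, 5, 5, 5, 7, 7]
signals = {
    "fifo_push": [0, 1, 0, 0, 1, 0, 0],
    "bus_req":   [1, 0, 0, 1, 0, 0, 1],
}
print(find_gating_enables(reg_trace, signals))   # ['fifo_push']
```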

After the torrid pace of acquisitions in 2013, the expectation was for a quieter 2014 with acquisitions concentrated in verification and IP. The Knowledge Center recorded sixteen acquisitions for 2014, and the big three all had some significant buys. Mentor acquired Nimbic and Berkeley DA and added XS Embedded and the Mecel Picea AUTOSAR Development Suite to its automotive portfolio; Synopsys went outside of EDA and bought Coverity; and Cadence snapped up Forte, Jasper and some IP assets from Transwitch. On the other side of the equation, there was a single new startup created, Pollen Technology.


